WO2020084827A1 - Superimposed image display device and computer program - Google Patents

Superimposed image display device and computer program

Info

Publication number
WO2020084827A1
WO2020084827A1 (PCT/JP2019/023606)
Authority
WO
WIPO (PCT)
Prior art keywords
image
eye
user
superimposed
guide
Prior art date
Application number
PCT/JP2019/023606
Other languages
English (en)
Japanese (ja)
Inventor
賢二 渡邊 (Kenji Watanabe)
広之 三宅 (Hiroyuki Miyake)
Original Assignee
アイシン・エィ・ダブリュ株式会社 (Aisin AW Co., Ltd.)
Application filed by Aisin AW Co., Ltd. (アイシン・エィ・ダブリュ株式会社)
Publication of WO2020084827A1

Classifications

    • G09G 3/001: Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices, e.g. projection systems; display of non-alphanumerical information
    • B60K 35/00: Arrangement of or adaptations of instruments (instrumentation or dashboards for vehicles)
    • G01C 21/36: Input/output arrangements for on-board computers (route searching; route guidance)
    • G02B 27/01: Head-up displays
    • G09G 5/10: Intensity circuits
    • G09G 5/377: Mixing or overlaying two or more graphic patterns
    • G09G 5/38: Graphic-pattern display with means for controlling the display position
    • G09G 5/391: Resolution modifying circuits, e.g. variable screen formats
    • H04N 13/302: Image reproducers for viewing without the aid of special glasses (autostereoscopic displays)
    • H04N 13/31: Autostereoscopic displays using parallax barriers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/346: Image reproducers using prisms or semi-transparent mirrors
    • H04N 13/356: Image reproducers having separate monoscopic and stereoscopic modes
    • H04N 13/363: Image reproducers using image projection screens
    • H04N 13/366: Image reproducers using viewer tracking

Definitions

  • The present invention relates to a superimposed image display device and a computer program that support driving of a vehicle.
  • Conventionally, various means have been used to provide vehicle occupants with information such as route guidance and obstacle warnings in order to assist driving: for example, display on a liquid crystal display installed in the vehicle, or voice output from a speaker.
  • In recent years, one such means displays an image superimposed on the surrounding environment (the landscape, or real scene) of the vehicle, such as a head-up display (hereinafter, HUD) or a windshield display (hereinafter, WSD).
  • With a HUD or WSD as described above, various information can be provided to the occupant without requiring the occupant's line of sight to deviate from the road ahead. When the occupant views an image superimposed on the surrounding environment, the image may be viewed with both eyes or with only one eye.
  • For example, it is disclosed that the virtual image is generated by viewing the image with both eyes, and that when the stereoscopic viewing distance is 50 m or more the virtual image is generated by viewing the image with only one eye. This makes it possible to reduce the burden on the user and to generate a virtual image with good visibility.
  • However, since the device of Patent Document 1 has no means for detecting the position of the occupant's eyes, it cannot determine the exact positions of the occupant's left and right eyes. Depending on where the occupant's eyes actually are, the virtual image may not be viewed at the correct position, or may not be visible at all. The problem is worse when the image is viewed with only one eye (when only the right-eye image or only the left-eye image is displayed), because the range of eye positions from which the virtual image can be seen becomes narrower.
  • The present invention has been made to solve the above problems in the related art. Its object is to provide a superimposed image display device and a computer program that prevent the phenomenon in which the guide image appears doubled due to the parallax between the two eyes, and that allow the guide image to be viewed superimposed at an appropriate position regardless of the current position of the user's eyes.
  • To achieve this object, the superimposed image display device according to the present invention is mounted on a vehicle and includes: an image display surface capable of displaying a right-eye image viewed only by the user's right eye and a left-eye image viewed only by the user's left eye, such that the image displayed on the image display surface is viewed by both eyes of the user superimposed on the surrounding environment of the vehicle; position detection means for detecting the positions of both eyes of the user; and display position determination means that determines the display position of the image on the image display surface based on the positions of both eyes detected by the position detection means. When a guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image (that is, to one eye only).
  • Here, "superimposing an image on the surrounding environment" means that the image is viewed overlapping some part of the surrounding environment (which includes everything visible to the user, such as the sky and buildings, in addition to the road surface and obstacles); it does not matter whether the image remains superimposed on one specific object in the environment (for example, a road surface or an obstacle).
  • The computer program according to the present invention is a program for supporting driving of a vehicle. Specifically, it causes a superimposed image display device, which is mounted on a vehicle and has an image display surface capable of displaying a right-eye image viewed only by the user's right eye and a left-eye image viewed only by the user's left eye so that the displayed image is viewed by both eyes superimposed on the surrounding environment of the vehicle, to function as: position detection means for detecting the positions of both eyes of the user; and display position determination means that determines the display position of the image based on the detected eye positions. When the guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
  • With the superimposed image display device and computer program having the above configurations, when the guide image is viewed superimposed on the surrounding environment of the vehicle, it is viewed by only one of the user's eyes, which prevents the phenomenon in which the guide image appears doubled due to the parallax between the two eyes. Furthermore, because the positions of both eyes of the user can be detected, the guide image can be viewed superimposed at an appropriate position regardless of the current position of the user's eyes and of which eye views the image.
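  • The doubling that single-eye display prevents can be quantified with simple geometry: a virtual image formed at imaging distance L appears at different lateral positions for the two eyes when compared against a real object farther away. The following sketch is illustrative only (the similar-triangles model and parameter names are assumptions, not from the patent):

```python
def double_image_offset(ipd_m, image_distance_m, target_distance_m):
    """Approximate lateral separation, measured at the target's distance,
    between where the left and right eyes each perceive a virtual image
    formed at image_distance_m (simple similar-triangles model)."""
    return ipd_m * (target_distance_m / image_distance_m - 1.0)

# With a 65 mm interpupillary distance, a virtual image at 1.5 m, and a
# target road surface 50 m ahead, the doubled guide image would appear
# separated by roughly 2.1 m at the road surface.
offset = double_image_offset(0.065, 1.5, 50.0)
```

The separation vanishes only when the virtual image sits at the same distance as the target, which is why displaying the guide image to one eye sidesteps the problem entirely.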
  • FIG. 1 is a schematic configuration diagram of a superimposed image display device 1 according to the present embodiment.
  • The superimposed image display device 1 basically comprises a navigation device 3 mounted on a vehicle 2, and a front display 4 that is also mounted on the vehicle 2 and connected to the navigation device 3.
  • The front display 4 functions as a head-up display together with the windshield 5 of the vehicle 2, as described later, and serves as information providing means for providing various information to the occupant 6 of the vehicle 2.
  • The navigation device 3 searches for a recommended route to a destination, displays a map image around the current position of the vehicle 2 based on map data acquired from a server or stored in memory, and, together with the front display 4, provides driving guidance along the set guide route, warnings of obstacles, and the like.
  • The navigation device 3 need not have all of these functions; the present invention can be implemented as long as it has at least the function of providing traveling guidance along the guide route and warnings of obstacles. Details of the structure of the navigation device 3 will be described later.
  • The front display 4 is a liquid crystal display installed inside the dashboard 7 of the vehicle 2, with an image display surface provided on its front face. For the backlight, for example, a CCFL (cold cathode fluorescent lamp) or a white LED is used.
  • Instead of a liquid crystal display, an organic EL display, or a combination of a liquid crystal projector and a screen, may be used as the front display 4.
  • The front display 4 functions as a head-up display together with the windshield 5 of the vehicle 2: the image output from the front display 4 is reflected by the windshield 5 in front of the driver's seat so that the occupant 6 of the vehicle 2 can view it.
  • The images displayed on the front display 4 include information about the vehicle 2 and various information used to support the driving of the occupant 6: for example, warnings about objects to be warned of, the guide route set by the navigation device 3 and guidance information based on it (arrows indicating right/left turn directions, etc.), warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.), the lane markings of the lane in which the vehicle is traveling, the current vehicle speed, shift position, remaining energy level, advertisement images, facility information, map images, traffic information, news, weather forecasts, the time, the screen of a connected smartphone, TV programs, and the like.
  • The front display 4 of the present embodiment is a liquid crystal display provided with a parallax barrier or a lenticular lens, and can display an image viewed only by the right eye of the occupant 6 (hereinafter, the right-eye image) and an image viewed only by the left eye of the occupant 6 (hereinafter, the left-eye image). Therefore, if the above-mentioned warnings, guidance information, and the like are displayed only as the right-eye image, the information is visible only to the right eye; if displayed only as the left-eye image, it is visible only to the left eye; and if displayed as both images, it is visible to both eyes.
  • Specifically, a parallax barrier 9 having lattice-shaped gaps is arranged in front of the image display area 8 of the front display 4. An image displayed using the areas (pixels) 8A visible from the right eye of the occupant 6 through the gaps of the parallax barrier 9 is visible only to the right eye (that is, it is the right-eye image), and an image displayed using the areas (pixels) 8B visible from the left eye of the occupant 6 through the gaps is visible only to the left eye (that is, it is the left-eye image).
  • The area 8A for displaying the right-eye image and the area 8B for displaying the left-eye image can also be corrected according to the position of the eyes of the occupant 6. For example, if it is detected that the left eye of the occupant 6 has moved from L1 to R1, or that the right eye has moved from R1 to L1, the area 8A displaying the right-eye image and the area 8B displaying the left-eye image are switched. Alternatively, as shown in FIG. 3, since the areas 8A and 8B are each composed of a plurality of pixel columns in the horizontal direction, instead of switching the two areas all at once, they can be shifted pixel by pixel according to the change in the position of the user's eyes. As a result, the right-eye image can be accurately viewed by the user's right eye, and the left-eye image by the user's left eye, regardless of the position of the user's eyes.
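  • The pixel-by-pixel shifting described above can be sketched as follows. This is an illustrative model, not the patent's implementation: the alternating column pattern and the mapping from eye movement to a column offset are assumptions.

```python
def assign_eye_columns(num_columns, eye_shift_cols):
    """Assign each horizontal pixel column of the display area to the
    right-eye image ('R', area 8A) or the left-eye image ('L', area 8B).

    Columns alternate R, L, R, L ... behind the parallax barrier.  When the
    detected eye position shifts, the whole alternating pattern is shifted
    column by column (eye_shift_cols) instead of swapping 8A and 8B at once.
    """
    return ["R" if (col + eye_shift_cols) % 2 == 0 else "L"
            for col in range(num_columns)]

# Eyes at the reference position: even columns feed the right eye.
base = assign_eye_columns(6, 0)      # ['R', 'L', 'R', 'L', 'R', 'L']
# Eyes moved by one column width: the assignment shifts, effectively
# swapping which physical columns are visible to each eye.
shifted = assign_eye_columns(6, 1)   # ['L', 'R', 'L', 'R', 'L', 'R']
```

Shifting the pattern rather than swapping the two areas wholesale keeps each eye on the correct image even for small head movements.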
  • As the means for displaying an image, an organic EL display or a combination of a liquid crystal projector and a screen may be used instead of a liquid crystal display; any means capable of displaying the right-eye image and the left-eye image separately is desirable.
  • Polarizing glasses or liquid crystal shutter glasses may also be used as the means for displaying the right-eye image and the left-eye image: the occupant 6 wears the glasses and views the front display 4 through them, and the front display 4 displays the two images in the manner corresponding to the glasses worn (for example, with liquid crystal shutter glasses, the right-eye image and the left-eye image are displayed alternately).
  • Any display means capable of displaying a right-eye image and a left-eye image, including a head-mounted display (HMD) type, can be applied to the present application.
  • Since the occupant 6 sees the image displayed on the front display 4 reflected by the windshield 5, the occupant 6 perceives the displayed image not at the position of the windshield 5 but as a virtual image 10 at a position farther away.
  • The virtual image 10 is displayed superimposed on the surrounding environment (landscape, real scene) in front of the vehicle, and can also be superimposed on an arbitrary object located ahead (a road surface, a building, an object to be warned of, etc.).
  • The position at which the virtual image 10 is generated is determined by the position of the front display 4: the imaging distance L is determined by the distance (optical path length) along the optical path from the position where the image is displayed on the front display 4 to the windshield 5. In the present embodiment, the optical path length is set so that the imaging distance L is 1.5 m.
  • In the present embodiment, the front display 4 is used as the means for displaying the image to be superimposed on the surrounding environment in front of the vehicle, but other means may be used. For example, a windshield display (WSD) that displays an image on the windshield 5 may be used: an image may be projected from a projector using the windshield 5 as a screen, or the windshield 5 may itself be a transmissive liquid crystal display. Like the image of the front display 4, the image displayed on the windshield 5 by the WSD is superimposed on the surrounding environment in front of the vehicle.
  • A front camera 11 is installed above the front bumper of the vehicle, behind the rear-view mirror, or the like. The front camera 11 is an imaging device using a solid-state image sensor such as a CCD, installed with its optical axis facing forward in the traveling direction of the vehicle. By performing image processing on the image captured by the front camera 11, the device detects the state of the environment ahead (that is, the environment on which the virtual image 10 is superimposed) as seen by the occupant 6 through the windshield. A sensor such as a millimeter-wave radar may be used instead of the front camera 11.
  • An in-vehicle camera 12 is installed on the upper surface of the instrument panel of the vehicle. The in-vehicle camera 12 is an imaging device using a solid-state image sensor such as a CCD, installed with its optical axis facing the driver's seat. Its imaging range is set to the range inside the vehicle in which the occupant's face is normally expected to be located, and it captures the face of the occupant 6 sitting in the driver's seat. Image processing is then performed on the captured image to detect the positions of the left and right eyes of the occupant 6 (the line-of-sight start points) and the line-of-sight direction.
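  • Once the eyes have been located in the captured image, their pixel coordinates must be related to positions in space. A hedged sketch using a standard pinhole-camera back-projection follows; the intrinsic parameters and the assumed camera-to-eye distance are illustrative, and the patent does not specify how the 3-D positions are computed:

```python
def eye_position_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected eye pixel (u, v) into camera coordinates,
    given a pinhole model with focal lengths (fx, fy) in pixels, principal
    point (cx, cy), and an estimated camera-to-eye distance depth_m."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# An eye detected at the image centre lies on the optical axis.
on_axis = eye_position_3d(320.0, 240.0, 0.7, 600.0, 600.0, 320.0, 240.0)
# An eye detected 60 px right of centre, 0.7 m away, is 7 cm off-axis.
off_axis = eye_position_3d(380.0, 240.0, 0.7, 600.0, 600.0, 320.0, 240.0)
```

The resulting coordinates are what the display position determination means would consume when shifting the areas 8A and 8B.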
  • FIG. 4 is a block diagram showing the navigation device 3 according to the present embodiment.
  • The navigation device 3 includes: a current position detection unit 13 that detects the current position of the vehicle 2 in which the navigation device 3 is mounted; a data recording unit 14 that records various data; a navigation ECU 15 that performs various arithmetic processing based on input information; an operation unit 16 that receives operations from the user; a liquid crystal display 17 that displays a map around the vehicle and facility information to the user; a speaker 18 that outputs voice guidance for route guidance; a DVD drive 19 that reads DVDs as storage media; and a communication module 20 for communicating with an information center such as a VICS (registered trademark: Vehicle Information and Communication System) center. The navigation device 3 is connected to the front display 4, the front camera 11, the in-vehicle camera 12, and the like described above via an in-vehicle network such as CAN.
  • The current position detection unit 13 includes a GPS 21, a vehicle speed sensor 22, a steering sensor 23, a gyro sensor 24, and the like, and can detect the current vehicle position, heading, vehicle traveling speed, current time, and so on.
  • The vehicle speed sensor 22 is a sensor for detecting the moving distance and speed of the vehicle: it generates pulses in accordance with the rotation of the drive wheels and outputs a pulse signal to the navigation ECU 15, which counts the generated pulses to calculate the rotational speed of the drive wheels and the moving distance. The navigation device 3 need not include all four types of sensors; it may include only one or several of them.
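  • The pulse-counting computation the navigation ECU 15 performs can be sketched as follows; the pulses-per-revolution and wheel circumference values are illustrative assumptions, not figures from the patent:

```python
def speed_from_pulses(pulse_count, pulses_per_rev, wheel_circumference_m, interval_s):
    """Estimate distance travelled and speed from drive-wheel pulses counted
    over one sampling interval, as the ECU does with the vehicle speed
    sensor's pulse signal."""
    revolutions = pulse_count / pulses_per_rev
    distance_m = revolutions * wheel_circumference_m
    speed_mps = distance_m / interval_s
    return distance_m, speed_mps

# 100 pulses in 0.5 s, 50 pulses per wheel revolution, 2 m circumference:
# 4 m travelled at 8 m/s.
dist, speed = speed_from_pulses(100, 50, 2.0, 0.5)
```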
  • The data recording unit 14 includes a hard disk (not shown) as an external storage device and recording medium, and a recording head (not shown): a driver for reading the map information DB 31 and predetermined programs recorded on the hard disk and for writing predetermined data to the hard disk.
  • The data recording unit 14 may have a flash memory, a memory card, or an optical disk such as a CD or DVD instead of the hard disk.
  • The map information DB 31 may instead be stored on an external server and acquired by the navigation device 3 through communication.
  • The map information DB 31 is storage means that stores, for example, link data 32 on roads (links), node data 33 on node points, point data 34 on points such as facilities, intersection data on each intersection, map display data for displaying a map, search data for searching for a route, search data for searching for a point, and the like.
  • As the link data 32, for example, a link ID identifying the link, end node information identifying the nodes located at the ends of the link, the road type of the road formed by the link, and the like are stored.
  • As the node data 33, a node ID identifying the node, the position coordinates of the node, connection destination node information identifying the nodes connected to the node via links, and the like are stored.
  • As the point data 34, various information on facilities that can be set as destinations is stored: for example, an ID identifying the facility, the facility name, position coordinates, genre, address, and so on.
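  • The record layouts listed above for link, node, and point data can be sketched as simple data structures. Field names are illustrative; the patent enumerates the contents but not a schema:

```python
from dataclasses import dataclass, field

@dataclass
class Link:
    """Link data 32: one road segment."""
    link_id: int
    end_node_ids: tuple      # nodes at both ends of the link
    road_type: str

@dataclass
class Node:
    """Node data 33: one node point."""
    node_id: int
    position: tuple          # (latitude, longitude)
    connected_node_ids: list = field(default_factory=list)

@dataclass
class Point:
    """Point data 34: one facility that can be set as a destination."""
    point_id: int
    name: str
    position: tuple
    genre: str
    address: str

# A two-node road segment and a facility that could be a destination.
n1 = Node(1, (35.1709, 136.9065), [2])
n2 = Node(2, (35.1712, 136.9070), [1])
road = Link(10, (1, 2), "ordinary_road")
poi = Point(100, "Example Station", (35.1715, 136.9080), "station", "Nagoya")
```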
  • The navigation ECU (electronic control unit) 15 is an electronic control unit that controls the entire navigation device 3. It includes a CPU 41 serving as an arithmetic and control unit, and internal storage devices: a RAM 42 used as working memory when the CPU 41 performs various arithmetic processing and for storing route data when a route is found, a ROM 43 in which the driving support processing program (FIG. 5) and other control programs are recorded, and a flash memory 44 for storing programs read from the ROM 43.
  • The navigation ECU 15 implements various means as processing algorithms: position detection means that detects the positions of both eyes of the user, and display position determination means that determines the display position of the image on the front display 4 based on the eye positions detected by the position detection means.
  • The operation unit 16 is operated when entering a departure point as the travel start point and a destination as the travel end point, and has a plurality of operation switches (not shown) such as various keys and buttons. The navigation ECU 15 performs control to execute the various corresponding operations based on the switch signals output when the switches are pressed.
  • The operation unit 16 may also be a touch panel provided on the front surface of the liquid crystal display 17, or may include a microphone and a voice recognition device.
  • On the liquid crystal display 17, a map image including roads, traffic information, operation guidance, operation menus, key guidance, the guide route from the departure point to the destination, guidance information along the guide route, news, weather forecasts, the time, mail, TV programs, and the like are displayed.
  • The liquid crystal display 17 may be omitted if the front display 4 is used to display the map image and the like.
  • The speaker 18 outputs voice guidance for traveling along the guide route and guidance on traffic information, based on instructions from the navigation ECU 15.
  • The DVD drive 19 is a drive that can read data recorded on a recording medium such as a DVD or CD; based on the read data, it plays music or video, updates the map information DB 31, and so on.
  • A card slot for reading and writing a memory card may be provided instead of the DVD drive 19.
  • The communication module 20 is a communication device for receiving traffic information (congestion information, regulation information, traffic accident information, etc.) transmitted from a traffic information center such as a VICS center or a probe center; for example, a mobile phone or a DCM corresponds to it.
  • FIG. 5 is a flowchart of the driving support processing program according to the present embodiment.
  • The driving support processing program is executed after the ACC (accessory) power supply of the vehicle is turned on, and provides various information to the occupant by superimposing the image displayed on the front display 4 on the surrounding environment of the vehicle.
  • The programs shown in the flowcharts of FIGS. 5 and 8 below are stored in the RAM 42 or the ROM 43 of the navigation device 3 and are executed by the CPU 41.
  • The information provided by superimposed display includes, for example, warnings about objects to be warned of (other vehicles, pedestrians, guide signs), warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.), the lane markings of the lane in which the vehicle travels, advertisement images, facility information, map images, traffic information, news, weather forecasts, the time, the screen of a connected smartphone, TV programs, and the like.
  • First, in step (hereinafter abbreviated as S) 1, the CPU 41 acquires the current position information of the host vehicle, the map information around the host vehicle, and the guide route set by the navigation device 3.
  • The current position of the host vehicle is detected using, for example, the GPS 21 and the vehicle speed sensor 22.
  • The map information around the host vehicle is acquired from the map information DB 31 or from an external server.
  • The map information includes information identifying the shape of the road and its lane divisions.
  • The CPU 41 performs image processing such as binarization on the image captured by the front camera 11 to detect the road surface of the road on which the vehicle is traveling, which is the target on which the virtual image of the guide arrow is superimposed.
  • The CPU 41 then specifies, as three-dimensional position coordinates, the position of the road surface on which the virtual image of the guide arrow is to be superimposed (hereinafter referred to as the superimposition target road surface).
  • If there is a guidance intersection ahead, the road surface of the road including the guidance intersection is set as the superimposition target road surface; otherwise, the road surface up to 100 m ahead of the vehicle's current position along the guidance route is set as the superimposition target road surface.
  • The position of the superimposition target road surface may be specified based on the image captured by the front camera 11, or using a sensor such as a millimeter-wave radar.
  • If the navigation device 3 can acquire map information including three-dimensional information, it is also possible to specify the position of the superimposition target road surface based on the map information and the current position of the vehicle.
  • The position of the superimposition target road surface is specified continuously until the driving support processing program ends; for example, if it is specified for each frame of the image captured by the front camera 11, the current position of the superimposition target road surface can be tracked continuously.
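The road-surface selection rule described above (the road including the guidance intersection when one lies ahead, otherwise up to 100 m along the guide route) can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name and the flat list-of-points route representation are assumptions.

```python
def superimposition_range(route_points, current_idx, guidance_idx=None,
                          max_length_m=100.0, spacing_m=5.0):
    """Pick the slice of route points whose road surface the guide-arrow
    virtual image should be superimposed on.

    route_points: list of (x, y) positions along the guide route,
                  spaced spacing_m metres apart (an assumed layout).
    guidance_idx: index of a guidance intersection ahead, or None.
    """
    if guidance_idx is not None and guidance_idx > current_idx:
        # Include the road surface up to and including the guidance intersection.
        end = guidance_idx + 1
    else:
        # No guidance intersection ahead: take up to 100 m along the route.
        end = current_idx + int(max_length_m / spacing_m) + 1
    return route_points[current_idx:end]
```

In a per-frame loop (one call per front-camera frame), this keeps the superimposition target road surface continuously up to date as the vehicle moves.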
  • The CPU 41 determines whether or not to provide information about the host vehicle via the front display 4.
  • The information regarding the host vehicle includes, for example, the current vehicle speed, the shift position, and the remaining energy level. The provision of this information can be switched ON and OFF by an operation of a vehicle occupant, or switched automatically based on the surrounding situation or the state of the vehicle.
  • In S5, the CPU 41 performs the display range determination processing (FIG. 8) described later.
  • In the display range determination processing, the range in which each guide image is projected on the windshield 5 (that is, the range in which it is displayed on the front display 4) is determined.
  • In the present embodiment, two types of images are displayed: an image of a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3 (hereinafter referred to as the first guide image) and an image that conveys information about the vehicle (hereinafter referred to as the second guide image). Therefore, in S5, the display range of each of the first guide image and the second guide image is determined.
  • The first guide image is an image that needs to be kept superimposed on a specific object (specifically, the superimposition target road surface) included in the surrounding environment visible from the vehicle.
  • To prevent the first guide image from being seen double due to binocular parallax, it is displayed as only one of the right-eye image and the left-eye image on the front display 4 (that is, the first guide image is seen by only one of the right eye and the left eye). Therefore, in S5, the display range of the first guide image within the selected one of the right-eye image and the left-eye image is determined as described later.
  • The second guide image is an image that does not need to be kept superimposed on a specific object included in the surrounding environment visible from the vehicle. Since the accuracy of its superimposed position matters less than for the first guide image, it is displayed as both the right-eye image and the left-eye image on the front display 4 so that it can be viewed more comfortably (that is, the second guide image is seen by both eyes). Therefore, in S5, the display range of the second guide image within both the right-eye image and the left-eye image is determined as described later.
  • In S6, the CPU 41 transmits a control signal to the front display 4 and displays the first guide image in the display range of the right-eye image or the left-eye image determined in S5.
  • The second guide image is displayed in the respective display ranges of the right-eye image and the left-eye image determined in S5.
  • As a result, the virtual images of the first guide image and the second guide image are seen by the vehicle occupant superimposed on the surrounding environment in front of the vehicle.
  • In particular, the virtual image of the first guide image is seen overlapping the superimposition target road surface when viewed by the vehicle occupant.
  • FIG. 6 shows the contents of the left-eye image and the right-eye image displayed on the front display 4 in S6, and an example of the virtual images of the first guide image and the second guide image seen by the vehicle occupant as a result of viewing them.
  • The first guide image 55 is displayed only in the left-eye image 56, whereas the second guide image 57 is displayed in both the left-eye image 56 and the right-eye image 58.
  • The display areas in which neither the first guide image 55 nor the second guide image 57 is displayed are black images, that is, images with zero brightness. Therefore, the region of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is a black (zero-brightness) image, and even when the front display 4 is viewed with both eyes, only one eye sees the first guide image.
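The masking scheme described above, drawing the first guide image into only one eye's frame while the corresponding region of the other eye's frame stays black (zero brightness), can be sketched as follows. A minimal illustration with brightness values in plain 2-D lists; the function and parameter names are hypothetical, not from the patent.

```python
def compose_eye_images(width, height, first_img, second_img, selected_eye="left"):
    """Build the left-eye and right-eye frames.

    first_img / second_img: dicts with 'x', 'y' (top-left position) and
    'pixels' (2-D list of brightness values).  The first guide image is
    drawn only into the selected eye's frame; everywhere else, including
    the matching region of the other eye's frame, stays at brightness 0
    (black), so only one eye sees it.  The second guide image is drawn
    into both frames.
    """
    def blank():
        return [[0] * width for _ in range(height)]   # black / zero brightness

    left, right = blank(), blank()

    def draw(frame, img):
        for dy, row in enumerate(img["pixels"]):
            for dx, v in enumerate(row):
                frame[img["y"] + dy][img["x"] + dx] = v

    draw(left if selected_eye == "left" else right, first_img)   # one eye only
    draw(left, second_img)                                       # both eyes
    draw(right, second_img)
    return left, right
```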
  • As a result, the virtual image of the first guide image 55, superimposed on the surrounding environment beyond the windshield 5 of the vehicle, is seen with the left eye, while the virtual image of the second guide image 57 is seen with both eyes.
  • The virtual image of the first guide image 55 is seen superimposed on the surrounding environment, specifically on the superimposition target road surface 59 in front of the vehicle.
  • FIG. 6 shows an example of the first guide image 55 displayed when the guide route turns right at the guidance intersection ahead; here the first guide image 55 is an arrow indicating a right turn at the guidance intersection.
  • As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at that intersection without moving the line of sight from the road ahead.
  • Since the virtual image of the first guide image 55 is seen only by the occupant's left eye, the phenomenon of the image being seen double due to binocular parallax is prevented.
  • The first guide image 55 is basically displayed while the guide route is set (a straight arrow is displayed when traveling straight), but it may instead be displayed only when approaching a guidance intersection.
  • The virtual image of the second guide image 57 is seen near the center of the lower edge of the windshield 5, superimposed on the surrounding environment. Unlike the first guide image, the second guide image 57 is not superimposed on a specific object but is seen fixed at the same position on the windshield 5.
  • In the present embodiment, the second guide image 57 is an image showing the current vehicle speed. As a result, the occupant can grasp the current vehicle speed without moving the line of sight from the road ahead. Further, since the virtual image of the second guide image 57 is seen by both of the occupant's eyes, there is no discomfort compared with viewing with one eye.
  • The second guide image 57 is basically displayed at all times while the provision of host-vehicle information via the front display 4 is turned ON.
  • Thereafter, the display range of the first guide image 55 is newly determined in S5 and updated. That is, when the surrounding environment visible from the vehicle changes, the display position and display size of the first guide image 55 change accordingly. In other words, the first guide image 55 is an image tied to the position on which it is superimposed in the surrounding environment.
  • On the other hand, since the second guide image 57 does not need to be superimposed on a specific object, it is basically displayed at a fixed position on the windshield 5 (for example, near the center of the lower edge). That is, the display position of the second guide image 57 is not linked to changes in the surrounding environment visible from the host vehicle, and the display range initially determined in S5 basically remains unchanged. In other words, the second guide image 57 is an image not tied to a position in the surrounding environment.
  • In S7, the CPU 41 determines whether to switch the occupant's eye that views the first guide image to the other eye. For example, while the occupant's left eye is viewing the image, if the occupant changes posture and the left eye leaves the detection area in which the in-vehicle camera 12 can detect the occupant's eye position, or the left eye approaches within a predetermined distance (for example, 5 cm) of the edge of that detection area, or the fatigue level of the left eye exceeds a threshold value, it is determined that the viewing eye is to be switched to the right eye.
  • Conversely, while the right eye is viewing the image, the viewing eye is switched to the left eye when the right eye leaves the detection area, comes within the predetermined distance (for example, 5 cm) of its edge, or its fatigue level exceeds the threshold value.
  • The degree of eye fatigue is estimated from the viewing time of the first guide image; for example, the fatigue level of the left eye is determined to exceed the threshold value when the left eye has been viewing the first guide image for a predetermined time (for example, 10 minutes) or longer.
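The switching conditions of S7 (the eye has left the detection area, has come within 5 cm of its edge, or has been viewing the image for 10 minutes or more) can be sketched as a single decision function. This is an assumed formalization, not the patent's implementation; a rectangular detection area is assumed for simplicity.

```python
def should_switch_eye(eye_pos, detection_box, viewing_seconds,
                      edge_margin_m=0.05, fatigue_limit_s=600.0):
    """Decide whether to switch the occupant's viewing eye to the other eye.

    eye_pos: (x, y) position of the currently viewing eye, metres.
    detection_box: (xmin, ymin, xmax, ymax) detection area of the
                   in-vehicle camera, metres (assumed rectangular).
    Returns True when the eye is outside the area, within edge_margin_m
    (5 cm) of its edge, or its estimated fatigue (viewing time) has
    reached fatigue_limit_s (10 min).
    """
    x, y = eye_pos
    xmin, ymin, xmax, ymax = detection_box
    outside = not (xmin <= x <= xmax and ymin <= y <= ymax)
    near_edge = min(x - xmin, xmax - x, y - ymin, ymax - y) < edge_margin_m
    fatigued = viewing_seconds >= fatigue_limit_s
    return outside or near_edge or fatigued
```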
  • The timing for switching the occupant's viewing eye to the other eye is preferably a timing at which the first guide image is hidden.
  • The first guide image is basically displayed while the guide route is set, but it may be displayed in a display mode that alternates between display and non-display at regular intervals (for example, every 1 sec).
  • Alternatively, the switching timing may be a timing at which the display mode of the first guide image changes.
  • Since the first guide image is a guide arrow indicating the traveling direction of the vehicle along the guide route, the timing of changing its display mode may be, besides the timing of changing the interval of the guide arrows, the timing of changing the shape of the guide arrows, the timing of changing their display color, or the like.
  • When it is determined that the occupant's viewing eye is to be switched to the other eye (S7: YES), the switch is performed in S8.
  • The display range of the first guide image 55 is then determined based on the position of the newly selected eye, and the display range is updated.
  • That is, if the first guide image 55 was displayed in the left-eye image 56, it is thereafter displayed in the right-eye image 58; conversely, if it was displayed in the right-eye image 58, it is thereafter displayed in the left-eye image 56.
  • The first guide image 55 is seen by the occupant at the same position before and after the switch of the viewing eye.
  • In S9, the display range determination processing (FIG. 8) described below is performed as in S5. However, in S9 only the first guide image is processed, and only its display range is determined.
  • In S10, the CPU 41 transmits a control signal to the front display 4 and displays the first guide image in the display range of the right-eye image or the left-eye image determined in S9.
  • As a result, the virtual image of the first guide image is seen by the vehicle occupant superimposed on the surrounding environment in front of the vehicle, overlapping the superimposition target road surface.
  • FIG. 7 shows the contents of the left-eye image and the right-eye image displayed on the front display 4 in S10, and an example of the virtual image of the first guide image seen by the vehicle occupant as a result of viewing them.
  • The first guide image 55 is displayed only in the left-eye image 56. The display area of the left-eye image 56 in which the first guide image 55 is not displayed, and the entire right-eye image 58, are black (zero-brightness) images. Therefore, the region of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is black, and even when the front display 4 is viewed with both eyes, only one eye sees the first guide image.
  • As a result, the virtual image of the first guide image 55, superimposed on the surrounding environment beyond the windshield 5 of the vehicle, is seen with the left eye.
  • The virtual image of the first guide image 55 is seen superimposed on the surrounding environment, specifically on the superimposition target road surface 59 in front of the vehicle.
  • FIG. 7 shows an example of the first guide image 55 displayed when the guide route turns right at the guidance intersection ahead; here the first guide image 55 is an arrow indicating a right turn at the guidance intersection.
  • As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at that intersection without moving the line of sight from the road ahead.
  • Since the virtual image of the first guide image 55 is seen only by the occupant's left eye, the phenomenon of the image being seen double due to binocular parallax is prevented.
  • The first guide image 55 is basically displayed while the guide route is set (a straight arrow is displayed when traveling straight), but it may instead be displayed only when approaching a guidance intersection. The processing then returns to S7.
  • FIG. 8 is a flowchart of a sub-process program of the display range determination process.
  • In S11, the CPU 41 detects the positions of the left and right eyes of the vehicle occupant (the line-of-sight start points) based on the image captured by the in-vehicle camera 12.
  • The detected eye positions are specified as three-dimensional position coordinates.
  • In S12, the CPU 41 selects either the right eye or the left eye as the occupant's eye that will view the first guide image, for example according to the following criteria (1) to (4).
  • Condition (1) is given the highest priority; any one of conditions (2) to (4) may then be adopted. (1) If only one eye is inside the detection area in which the in-vehicle camera 12 can detect the occupant's eye positions, that eye is selected. (2) If both eyes are inside the detection area and the occupant's dominant eye can be determined, the dominant eye is selected.
  • (3) If both eyes are inside the detection area, the fatigue level of the left and right eyes is detected from the image captured by the in-vehicle camera 12, and the eye with the lower fatigue level is selected.
  • (4) If both eyes are inside the detection area, the eye closer to the center of the detection area is selected.
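Criteria (1) to (4) above can be sketched as a selection function in which (1) always takes priority and one of (2) to (4) is chosen as the fallback policy. The data layout and the function and parameter names are illustrative assumptions, not from the patent.

```python
def select_viewing_eye(left, right, detection_box, policy="dominant"):
    """Select which eye views the first guide image.

    left / right: dicts with 'pos' (x, y), 'dominant' (bool), 'fatigue' (float).
    detection_box: (xmin, ymin, xmax, ymax) detection area of the camera.
    policy: which of criteria (2)-(4) to apply when both eyes are detectable.
    """
    xmin, ymin, xmax, ymax = detection_box

    def inside(p):
        return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax

    in_l, in_r = inside(left["pos"]), inside(right["pos"])
    if in_l != in_r:                          # (1) only one eye in the area
        return "left" if in_l else "right"
    if policy == "dominant":                  # (2) occupant's dominant eye
        if left["dominant"] != right["dominant"]:
            return "left" if left["dominant"] else "right"
    if policy == "fatigue":                   # (3) eye with the lower fatigue
        return "left" if left["fatigue"] <= right["fatigue"] else "right"
    # (4) eye closer to the centre of the detection area (also the fallback)
    cx, cy = (xmin + xmax) / 2, (ymin + ymax) / 2

    def dist2(p):
        return (p[0] - cx) ** 2 + (p[1] - cy) ** 2

    return "left" if dist2(left["pos"]) <= dist2(right["pos"]) else "right"
```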
  • The occupant's eye selected in S12 to view the first guide image is not necessarily fixed thereafter; it may be switched to the other eye in the processing of S8. However, the viewing eye may instead be fixed.
  • The CPU 41 determines whether or not the display of the front display 4 is ON.
  • The display of the front display 4 can be switched ON and OFF by an operation of a vehicle occupant, or automatically based on the surrounding situation or the state of the vehicle.
  • In S14, the CPU 41 acquires the position coordinates of the windshield 5 onto which the image is projected by the front display 4.
  • The position of the windshield 5 is specified as three-dimensional position coordinates.
  • In S15, the CPU 41 calculates, from the position coordinates of the superimposition target road surface detected in S3, the position coordinates of the occupant's eyes (line-of-sight start points) detected in S11, and the position coordinates of the windshield 5 acquired in S14, the position coordinates at which the first guide image and the second guide image are to be displayed on the windshield 5.
  • The display position of the first guide image is set to a position at which its virtual image overlaps the superimposition target road surface as seen by the vehicle occupant, as shown in FIGS. 8 and 9, while the display position of the second guide image is a fixed position on the windshield 5 (for example, near the center of the lower edge) as shown in FIG. 6.
  • In S16, the CPU 41 determines the projection ranges of the first guide image and the second guide image on the windshield 5 based on the position coordinates calculated in S15, and from these projection ranges also determines their display ranges on the front display 4. The front display 4 of the present embodiment can display a right-eye image seen only by the right eye of the occupant 6 and a left-eye image seen only by the left eye. For the first guide image, the display range is specified within the image corresponding to the eye selected in S12 (the right-eye image if the right eye was selected); for the second guide image, the display range is specified within both the left-eye image and the right-eye image.
  • FIG. 9 illustrates the details of the process of determining the display range of the first guide image on the front display 4.
  • The projection range of the first guide image on the windshield 5 is determined by connecting the line-of-sight start point S and the superimposition target road surface 59 with a straight line.
  • The display range of the first guide image on the front display 4 is then determined based on the visual angle from the line-of-sight start point S to the projection range and the inclination angle of the windshield 5.
  • Since the first guide image is seen with only one eye, the line-of-sight start point S is a single point, so the processing for determining the display range of the first guide image on the front display 4 can be simplified compared with the case where the image is viewed with both eyes.
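The geometry described above, connecting the line-of-sight start point S and a point on the superimposition target road surface 59 with a straight line and finding where it crosses the windshield 5, can be sketched as a line-plane intersection. Modelling the windshield as a flat plane is a simplifying assumption made here for illustration (the patent instead works from the visual angle and the windshield's inclination angle).

```python
def windshield_projection(sight_start, road_point, plane_point, plane_normal):
    """Intersect the straight line from the line-of-sight start point S
    to a point on the superimposition target road surface with the
    windshield plane, giving the point where the first guide image must
    be projected so that its virtual image overlaps the road surface.

    All arguments are 3-D (x, y, z) tuples; the windshield is modelled
    as a flat plane given by a point on it and its normal vector.
    """
    sx, sy, sz = sight_start
    dx = road_point[0] - sx            # direction of the sight line
    dy = road_point[1] - sy
    dz = road_point[2] - sz
    nx, ny, nz = plane_normal
    denom = nx * dx + ny * dy + nz * dz
    if abs(denom) < 1e-9:
        return None                    # sight line parallel to the windshield
    t = (nx * (plane_point[0] - sx) + ny * (plane_point[1] - sy)
         + nz * (plane_point[2] - sz)) / denom
    return (sx + t * dx, sy + t * dy, sz + t * dz)
```

Because only one eye views the first guide image, a single start point S suffices, which is exactly the simplification noted above.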
  • As described above in detail, in the superimposed image display device and the computer program executed by it according to the present embodiment, the front display 4 can display a right-eye image seen only by the user's right eye and a left-eye image seen only by the user's left eye; the positions of both eyes of the vehicle occupant are detected (S11), the display position of each guide image on the front display 4 is determined based on the detected eye positions (S16), and the first guide image in particular is displayed as either the right-eye image or the left-eye image. The first guide image displayed on the front display 4 is thereby seen superimposed on the surrounding environment around the vehicle.
  • By making only one of the occupant's eyes view the first guide image, the phenomenon of the first guide image being seen double due to binocular parallax is prevented. Further, by making the positions of both eyes detectable, the first guide image can be seen superimposed at an appropriate position regardless of the current position of the occupant's eyes and regardless of which eye views it.
  • The present invention is not limited to the above embodiment, and it goes without saying that various improvements and modifications can be made without departing from the gist of the present invention.
  • In the above embodiment, the virtual image is generated in front of the windshield 5 of the vehicle 2 by the front display 4, but the virtual image may instead be generated in front of a window other than the windshield 5.
  • The object onto which the image is reflected by the front display 4 may be a visor (combiner) installed around the windshield 5 instead of the windshield 5 itself.
  • In the above embodiment, the front display 4 is used as the means for displaying an image superimposed on the surrounding environment, but a windshield display (WSD) that displays an image on the windshield 5 may be used instead.
  • In the above embodiment, the first guide image is displayed as either the right-eye image or the left-eye image, and the second guide image is displayed as both; however, the second guide image may also be displayed as either the right-eye image or the left-eye image.
  • In that case, the first guide image and the second guide image may be viewed with the same eye or with different eyes (for example, the first guide image with the left eye and the second guide image with the right eye).
  • In the above embodiment, the first guide image is an image of a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3, but other images may be used.
  • For example, a warning image for an object (for example, another vehicle, a pedestrian, or a guide sign), an image of the division lines of the lane in which the vehicle travels, or the like may be used.
  • Further, although the first guide image is an image that needs to be kept superimposed on a specific object (for example, a road surface) included in the surrounding environment visible from the vehicle, it may instead be an image that does not need to be kept superimposed on a specific object (for example, a smartphone screen, a map image, etc.).
  • In the above embodiment, the processing of the driving support processing program (FIG. 5) is executed by the navigation ECU 15 of the navigation device 3, but the executing entity can be changed as appropriate.
  • For example, it may be executed by the control unit of the front display 4, a vehicle control ECU, or another vehicle-mounted device.
  • When the control unit of the front display 4 executes it, the superimposed image display device according to the present invention may be constituted by the front display 4 alone.
  • the superimposed image display device can also have the following configuration, and in that case, the following effects are exhibited.
  • the first configuration is as follows.
  • An image display surface (4) mounted on the vehicle (2) can display a right-eye image (58) visible only to the right eye of the user (6) and a left-eye image (56) visible only to the user's left eye.
  • The superimposed image display device (1) makes the image displayed on the image display surface visible superimposed on the surrounding environment, and detects the positions of both eyes of the user.
  • When the guide image (55) to be shown to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
  • As a result, the guide image is seen by only one of the user's eyes, which prevents the guide image from being seen double due to binocular parallax. Further, by making the positions of both eyes of the user detectable, the guide image can be seen superimposed at an appropriate position regardless of the current position of the user's eyes and regardless of which eye views it.
  • the second configuration is as follows.
  • The guide image includes a first guide image (55) displayed as either the right-eye image (58) or the left-eye image (56), and a second guide image (57) displayed as both the right-eye image and the left-eye image.
  • According to the superimposed image display device having the above configuration, viewing the first guide image with only one eye prevents the phenomenon of the guide image being seen double due to binocular parallax, while the second guide image can be viewed with both eyes without discomfort.
  • the third configuration is as follows.
  • The first guide image (55) is an image that needs to be kept superimposed on a specific object (59) included in the surrounding environment visible from the vehicle (2), and the second guide image (57) is an image that does not need to be kept superimposed on a specific object.
  • According to the superimposed image display device having the above configuration, by having only one eye view the first guide image, for which the accuracy of the superimposed position matters, the phenomenon of the guide image being seen double due to binocular parallax is prevented, while the second guide image, for which the accuracy of the superimposed position does not matter, can be viewed with both eyes.
  • the fourth configuration is as follows.
  • When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56), the area of the other image corresponding to the guide image displayed in the one image is a black image.
  • the fifth configuration is as follows.
  • When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56), the area of the other image corresponding to the guide image displayed in the one image is an image with zero brightness.
  • According to the superimposed image display device having either of the above configurations, even when the image display surface is viewed with both eyes of the user, only one eye can be made to see the guide image.
  • the sixth configuration is as follows.
  • The device has an eye selecting means (41) for selecting the eye of the user (6) that views the guide image (55) displayed on the image display surface (4).
  • The guide image is displayed in whichever of the right-eye image (58) and the left-eye image (56) corresponds to the eye selected by the eye selecting means.
  • According to the superimposed image display device having the above configuration, when the guide image is viewed superimposed on the surrounding environment of the vehicle, it can be made visible to only the selected one of the user's eyes.
  • the seventh configuration is as follows.
  • The eye selecting means (41) preferentially selects an eye included in the detection range of the position detecting means (41).
  • According to the superimposed image display device having the above configuration, the guide image is viewed by an eye whose current position the device can detect, and as a result the guide image can be seen superimposed at an appropriate position.
  • the eighth configuration is as follows.
  • The eye selecting means (41) switches the user's eye selected as the one that views the guide image based on the positional relationship of the user's eyes with respect to the detection range of the position detecting means (41).
  • According to the superimposed image display device having the above configuration, even if the eye viewing the guide image moves relative to the detection range, the guide image can continuously be made visible to only one eye by switching the viewing eye to the other eye.
  • the ninth configuration is as follows.
  • The device has a fatigue degree detecting means (41) for detecting the degree of fatigue of the user's eyes, and the eye selecting means switches the user's eye selected as the one that views the guide image (55) based on the detected degree of fatigue.
  • According to the superimposed image display device having the above configuration, even if fatigue accumulates in the eye viewing the guide image, the guide image can continuously be made visible to only one eye, without increasing the burden on the user, by switching the viewing eye to the other eye.
  • the tenth configuration is as follows.
  • The eye selecting means (41) switches the user's eye selected as the one that views the guide image at a timing when the guide image (55) is hidden. According to the superimposed image display device having the above configuration, even if the eye viewing the guide image is switched to the other eye, the guide image does not appear unnatural to the user.
  • the eleventh configuration is as follows.
  • The eye selecting means (41) switches the user's eye selected as the one that views the guide image at a timing when the display mode of the guide image (55) changes. According to the superimposed image display device having the above configuration, even if the eye viewing the guide image is switched to the other eye, the guide image does not appear unnatural to the user.
  • the twelfth configuration is as follows.
  • A display position determining means (41) determines the display position of the guide image on the image display surface (4) based on the position at which the guide image (55) is superimposed in the surrounding environment and the position of the eye selected by the eye selecting means (41).
  • According to the superimposed image display device having the above configuration, the processing for determining the display position of the guide image can be simplified compared with the case where the guide image is viewed with both eyes. Further, regardless of the position of the user's eyes, the guide image can reliably be made visible to only the selected one of the user's eyes.
  • The thirteenth configuration is as follows. The image displayed on the image display surface (4) is reflected on the windshield (5) of the vehicle (2) so as to be visible to the user (6), whereby a virtual image of the displayed image is made visible in the surrounding environment around the vehicle. According to the superimposed image display device having the above configuration, the guide image can be seen superimposed on the surrounding environment of the vehicle as a virtual image.
  • the fourteenth configuration is as follows. When the user (6) views the image display surface (4) through polarizing glasses or liquid crystal shutter glasses, a right-eye image (58) to be viewed only by the user's right eye and a left-eye image (56) to be viewed only by the user's left eye can each be displayed on the image display surface. According to the superimposed image display device having the above configuration, the user can be made to visually recognize the guide image with only one eye simply by wearing the polarizing glasses or liquid crystal shutter glasses.
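The switching timing described in the tenth and eleventh configurations (deferring the change of the selected eye until the guide image is hidden or its display mode changes anyway) can be illustrated with a short sketch. This code is not part of the publication; all names (`EyeSelector`, `request_switch`, the event methods) are hypothetical:

```python
from enum import Enum

class Eye(Enum):
    LEFT = "left"
    RIGHT = "right"

class EyeSelector:
    """Defer switching the selected eye until a moment when the change
    cannot be noticed by the user."""

    def __init__(self, initial=Eye.RIGHT):
        self.current = initial   # eye currently viewing the guide image
        self.pending = None      # requested eye, applied at a safe moment

    def request_switch(self, eye):
        # The switch is not applied immediately; it waits for a timing at
        # which the guide image is hidden or its display mode changes.
        self.pending = eye

    def on_guide_image_hidden(self):
        self._apply_pending()

    def on_display_mode_change(self):
        self._apply_pending()

    def _apply_pending(self):
        if self.pending is not None:
            self.current = self.pending
            self.pending = None
```

Requesting a switch while the guide image is still visible leaves `current` unchanged until one of the two events fires, which is why the user never sees the image jump between eyes.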
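The display-position determination of the twelfth configuration amounts to intersecting the sight line from the selected eye through the real-world superimposition target with the image display surface. A minimal sketch under a simplified geometry (the display surface is modeled as the vertical plane z = `plane_z` in a vehicle-fixed frame; the function name and coordinates are illustrative assumptions, not from the publication):

```python
def display_position(eye, target, plane_z):
    """Return the point on the display plane z = plane_z where the
    straight line from the selected eye position `eye` to the real-world
    target point `target` crosses the plane.

    Coordinates are (x lateral, y up, z forward) in metres.
    """
    ex, ey, ez = eye
    tx, ty, tz = target
    if tz == ez:
        raise ValueError("sight line is parallel to the display plane")
    t = (plane_z - ez) / (tz - ez)      # parameter along the sight line
    return (ex + t * (tx - ex), ey + t * (ty - ey), plane_z)

# Right eye 3 cm right of the head centre; guide image to be superimposed
# on a point 10 m ahead; display plane 0.8 m in front of the eyes.
pos = display_position((0.03, 1.2, 0.0), (0.0, 1.0, 10.0), 0.8)
```

Because only the one selected eye is used, a single projection suffices; this is the simplification over the two-eye case noted in the configuration above.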
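With the liquid crystal shutter glasses of the fourteenth configuration, the display alternates right-eye and left-eye frames in sync with the glasses, and the guide image is drawn only in the frames shown to the selected eye. An illustrative frame-scheduling sketch (names are hypothetical, not from the publication):

```python
def schedule_frames(selected_eye, guide_image, n_frames=4):
    """Alternate right/left shutter frames; draw the guide image only in
    frames whose open shutter matches the selected eye, so the other eye
    never sees it."""
    frames = []
    for i in range(n_frames):
        open_shutter = "right" if i % 2 == 0 else "left"
        content = guide_image if open_shutter == selected_eye else None
        frames.append((open_shutter, content))
    return frames
```

For example, with the right eye selected, every left-shutter frame carries no guide image at all, so only the right eye ever receives it.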

Abstract

The invention concerns a superimposed image display device and a computer program that prevent a guide image from appearing doubled owing to binocular parallax, and cause the guide image to be visibly superimposed at an appropriate position regardless of the current position of the user's eyes. The superimposed image display device is provided with a front display unit 4 capable of displaying, respectively, a right-eye image visible only to the user's right eye and a left-eye image visible only to the user's left eye, and is configured such that: the positions of each eye of a vehicle occupant are detected; a display position of the guide image on the front display unit 4 is determined on the basis of the detected eye positions; and a first guide image in particular is displayed as either a right-eye image or a left-eye image, causing the first guide image displayed on the front display unit 4 to be visibly superimposed on the surrounding environment around the vehicle.
PCT/JP2019/023606 2018-10-25 2019-06-14 Superimposed image display device and computer program WO2020084827A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-200551 2018-10-25
JP2018200551A JP2020067576A (ja) Superimposed image display device and computer program

Publications (1)

Publication Number Publication Date
WO2020084827A1 (fr)

Family

ID=70331506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/023606 WO2020084827A1 (fr) Superimposed image display device and computer program

Country Status (2)

Country Link
JP (1) JP2020067576A (fr)
WO (1) WO2020084827A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007199295A * 2006-01-25 2007-08-09 Epson Imaging Devices Corp Display device
JP2007310285A * 2006-05-22 2007-11-29 Denso Corp Display device
JP2010072596A * 2008-09-22 2010-04-02 Toshiba Corp Display device and moving body
JP2012169726A * 2011-02-10 2012-09-06 Seiko Epson Corp Head-mounted display device and method of controlling head-mounted display device
JP2012211959A * 2011-03-30 2012-11-01 Brother Ind Ltd Head-mounted display
JP2015194709A * 2014-03-28 2015-11-05 Panasonic Ip Management Co Ltd Image display device
JP2017185988A * 2016-04-01 2017-10-12 Denso Corp Vehicle device, vehicle program, and filter design program


Also Published As

Publication number Publication date
JP2020067576A (ja) 2020-04-30

Similar Documents

Publication Publication Date Title
WO2018066711A1 Travel assistance device and computer program
WO2019097763A1 Superimposed image display device and computer program
JP6516642B2 Electronic device, image display method, and image display program
WO2014208165A1 Head-up display device
WO2014208164A1 Head-up display device
JP2017094882A Virtual image generation system, virtual image generation method, and computer program
WO2019097762A1 Superimposed image display device and computer program
JP5327025B2 Vehicle travel guidance device, vehicle travel guidance method, and computer program
US20140043466A1 Environment image display apparatus for transport machine
JP2014120111A Driving support system, driving support method, and computer program
JP2019116229A Display system
JP4908779B2 Navigation device, image display method, and image display program
JP2019056884A Superimposed image display device
JP6825433B2 Virtual image display device and computer program
JP2014120114A Driving support system, driving support method, and computer program
WO2022209439A1 Virtual image display device
JP2018173399A Display device and computer program
WO2019208365A1 Information display device
JP6287351B2 Head-up display device
JP2023012793A Superimposed image display device
JP2014120113A Driving support system, driving support method, and computer program
JP6485310B2 Information providing system, information providing method, and computer program
JP2009098501A Visual information presentation device and visual information presentation method
JP6993068B2 Display system
JP6805974B2 Driving support device and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19877042

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19877042

Country of ref document: EP

Kind code of ref document: A1