WO2020084827A1 - Superimposed-image display device, and computer program - Google Patents

Superimposed-image display device, and computer program

Info

Publication number
WO2020084827A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
eye
user
superimposed
guide
Prior art date
Application number
PCT/JP2019/023606
Other languages
French (fr)
Japanese (ja)
Inventor
賢二 渡邊
広之 三宅
Original Assignee
アイシン・エィ・ダブリュ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アイシン・エィ・ダブリュ株式会社
Publication of WO2020084827A1

Classifications

    • G09G3/001: Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices not provided for in groups G09G3/02-G09G3/36, e.g. projection systems; display of non-alphanumerical information, solely or in combination with alphanumerical information
    • B60K35/00: Arrangement of adaptations of instruments
    • G01C21/36: Navigation; route searching and route guidance; input/output arrangements for on-board computers
    • G02B27/01: Head-up displays
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10: Intensity circuits
    • G09G5/36: Indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/377: Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • G09G5/38: Graphic-pattern display with means for controlling the display position
    • G09G5/391: Control of the bit-mapped memory; resolution modifying circuits, e.g. variable screen formats
    • H04N13/31: Autostereoscopic image reproducers (viewing without the aid of special glasses) using parallax barriers
    • H04N13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/346: Image reproducers using prisms or semi-transparent mirrors
    • H04N13/356: Image reproducers having separate monoscopic and stereoscopic modes
    • H04N13/363: Image reproducers using image projection screens
    • H04N13/366: Image reproducers using viewer tracking

Definitions

  • The present invention relates to a superimposed image display device and a computer program that support the driving of a vehicle.
  • Conventionally, various means have been used to provide vehicle occupants with information such as route guidance and warnings about obstacles in order to assist driving, for example display on a liquid crystal display installed in the vehicle or voice output from a speaker.
  • In recent years, one such information providing means displays an image superimposed on the surrounding environment (landscape, real scene) of the vehicle, such as a head-up display (hereinafter, HUD) or a windshield display (hereinafter, WSD).
  • With a HUD or WSD as described above, various information can be provided to the occupant without requiring the occupant to turn the line of sight away from the front. When the occupant views an image superimposed on the surrounding environment, the image may be viewed with both eyes or with only one eye. For example, Patent Document 1 discloses generating the virtual image by viewing the image with both eyes and, when the stereoscopic viewing distance is 50 m or more, generating the virtual image by viewing the image with only one eye. This makes it possible to reduce the burden on the user and generate a virtual image with good visibility.
  • However, since Patent Document 1 provides no device for detecting the positions of the occupant's eyes, the device cannot identify the exact positions of the occupant's left and right eyes, and depending on the positions of the occupant's eyes the virtual image may not be viewed at the correct position, or may not be viewed at all. In particular, when the image is viewed with only one eye (when only the right-eye image or only the left-eye image is displayed), the range of eye positions from which the virtual image can be viewed becomes narrow, making the problem even larger.
  • The present invention has been made to solve the above-mentioned problems in the related art. It is an object of the present invention to provide a superimposed image display device and a computer program that can prevent the phenomenon in which a guide image is seen double due to the parallax between the two eyes, and that allow the guide image to be viewed superimposed at an appropriate position regardless of the current positions of the user's eyes.
  • To achieve the above object, the superimposed image display device according to the present invention is mounted on a vehicle and has an image display surface capable of displaying a right-eye image viewed only by the user's right eye and a left-eye image viewed only by the user's left eye; by viewing the image display surface with both eyes, the user sees the image displayed on it superimposed on the surrounding environment around the vehicle.
  • The device includes a position detection unit that detects the positions of both of the user's eyes and a display position determination unit that determines the display position of the image on the image display surface based on the positions of both eyes detected by the position detection unit. When a guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
  • Here, "to view an image superimposed on the surrounding environment" means that the image is viewed overlapping at least some part of the surrounding environment (which, in addition to the road surface and obstacles, includes everything visible to the user, such as the sky and buildings); it does not matter whether or not the image is kept superimposed on a specific object (for example, the road surface or an obstacle) in the surrounding environment.
  • The computer program according to the present invention is a program for supporting the driving of a vehicle. Specifically, it causes a superimposed image display device, which is mounted on a vehicle, has an image display surface capable of displaying a right-eye image viewed only by the user's right eye and a left-eye image viewed only by the user's left eye, and lets the user see the displayed image superimposed on the surrounding environment around the vehicle by viewing the image display surface with both eyes, to function as position detection means for detecting the positions of both of the user's eyes and as display position determination means for determining the display position of the image on the image display surface based on the positions of both eyes detected by the position detection means. When a guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
  • According to the superimposed image display device and the computer program having the above configurations, when the guide image is viewed superimposed on the surrounding environment of the vehicle, it is viewed by only one of the user's eyes, so the phenomenon in which the guide image is seen double due to the parallax between the two eyes can be prevented. Further, because the positions of both of the user's eyes can be detected, the guide image can be viewed superimposed at an appropriate position regardless of the current positions of the user's eyes and of which eye views the image.
  • FIG. 1 is a schematic configuration diagram of a superimposed image display device 1 according to the present embodiment.
  • As shown in FIG. 1, the superimposed image display device 1 basically includes a navigation device 3 mounted on a vehicle 2 and a front display 4 that is also mounted on the vehicle 2 and connected to the navigation device 3.
  • The front display 4 functions, together with the windshield 5 of the vehicle 2, as a head-up display, as described later, and serves as an information providing means for providing various information to the occupant 6 of the vehicle 2.
  • The navigation device 3 has functions of searching for a recommended route to a destination, displaying a map image around the current position of the vehicle 2 based on map data acquired from a server or stored in memory, and providing, together with the front display 4, travel guidance along the set guide route, warnings about obstacles, and the like.
  • The navigation device 3 need not have all of the functions described above; the present invention can be configured as long as the navigation device 3 has at least the function of providing travel guidance along the guide route, warnings about obstacles, and the like. Details of the structure of the navigation device 3 will be described later.
  • The front display 4 is a liquid crystal display installed inside the dashboard 7 of the vehicle 2 and has a function of displaying an image on an image display surface provided on its front surface. For the backlight, for example, a CCFL (cold cathode fluorescent lamp) or white LEDs are used.
  • As the front display 4, an organic EL display or a combination of a liquid crystal projector and a screen may be used instead of the liquid crystal display.
  • In the present embodiment, the front display 4 functions as a head-up display together with the windshield 5 of the vehicle 2: the image output from the front display 4 is reflected by the windshield 5 in front of the driver's seat so that the occupant 6 of the vehicle 2 can view it.
  • The images displayed on the front display 4 include information about the vehicle 2 and various information used to support the driving of the occupant 6.
  • For example, warnings about objects to be warned of to the occupant 6, the guide route set by the navigation device 3 or guide information based on the guide route (arrows indicating right/left-turn directions, etc.), warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.), the lane markings of the lane in which the vehicle is traveling, the current vehicle speed, the shift position, the energy level, advertisement images, facility information, map images, traffic information, news, weather forecasts, the time, the screen of a connected smartphone, TV programs, and the like can be displayed.
  • The front display 4 of the present embodiment is a liquid crystal display provided with a parallax barrier or a lenticular lens, and can display an image viewed only by the right eye of the occupant 6 (hereinafter, right-eye image) and an image viewed only by the left eye of the occupant 6 (hereinafter, left-eye image). Therefore, if the above-mentioned warnings, guide information, and the like are displayed only as the right-eye image, the information can be viewed only by the right eye; if displayed only as the left-eye image, it can be viewed only by the left eye; and if displayed as both the right-eye image and the left-eye image, it can be viewed by both eyes.
  • Specifically, a parallax barrier 9 having lattice-shaped gaps is arranged in front of the image display area 8 of the front display 4.
  • An image displayed using the areas (pixels) 8A visible from the right eye of the occupant 6 through the gaps of the parallax barrier 9 is an image visible only to the right eye (that is, the right-eye image), and an image displayed using the areas (pixels) 8B visible from the left eye of the occupant 6 through the gaps of the parallax barrier 9 is an image visible only to the left eye (that is, the left-eye image).
  • The areas 8A for displaying the right-eye image and the areas 8B for displaying the left-eye image can also be corrected according to the positions of the eyes of the occupant 6. For example, if it is detected that the position of the left eye of the occupant 6 has moved from L1 to R1, or that the position of the right eye of the occupant 6 has moved from R1 to L1, the areas 8A for displaying the right-eye image and the areas 8B for displaying the left-eye image are switched.
  • Alternatively, as shown in the figure, when the areas 8A for displaying the right-eye image and the areas 8B for displaying the left-eye image each consist of a plurality of pixels in the horizontal direction, the display areas can, instead of being switched all at once, be shifted pixel by pixel in accordance with changes in the positions of the user's eyes.
  • As a result, the right-eye image can be accurately viewed by the user's right eye and the left-eye image by the user's left eye, regardless of the positions of the user's eyes.
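The pixel-by-pixel correction described above amounts to re-assigning display columns to the right-eye or left-eye image as the detected eye positions move. The following Python sketch illustrates the idea only; the zone width, column count, and function name are assumptions made for this example and are not specified in the patent.

```python
# Illustrative sketch: re-assigning display columns behind a parallax barrier
# to the right-eye or left-eye image as the detected eye position moves.
# The zone width and column count below are assumed values for illustration.

COLUMNS_PER_ZONE = 4          # assumed: pixels per eye zone in the horizontal direction
ZONE_WIDTH_MM = 32.5          # assumed: lateral head movement that corresponds to one full zone

def column_assignments(total_columns: int, eye_offset_mm: float) -> list:
    """Return 'R' or 'L' for each display column.

    eye_offset_mm is the lateral displacement of the detected right eye from the
    nominal design position; moving by one zone width swaps the R/L zones, while
    smaller movements shift the boundary pixel by pixel.
    """
    # Convert the lateral eye movement into a shift measured in pixel columns.
    shift = round(eye_offset_mm / ZONE_WIDTH_MM * COLUMNS_PER_ZONE)
    assignments = []
    for col in range(total_columns):
        zone_index = (col + shift) // COLUMNS_PER_ZONE
        assignments.append('R' if zone_index % 2 == 0 else 'L')
    return assignments

# Example: with no offset the pattern is RRRRLLLL...; a small head movement
# shifts the boundary by one column instead of swapping all zones at once.
print(column_assignments(16, 0.0))
print(column_assignments(16, 8.0))
```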
  • As the means for displaying an image, an organic EL display or a combination of a liquid crystal projector and a screen may be used instead of a liquid crystal display, but a means capable of displaying a right-eye image and a left-eye image separately is desirable.
  • Polarizing glasses or liquid crystal shutter glasses may also be used as the means for presenting the right-eye image and the left-eye image. In that case, the occupant 6 wears the polarizing glasses or liquid crystal shutter glasses and views the front display 4 through them, and the front display 4 displays images in a format corresponding to the glasses worn (for example, with liquid crystal shutter glasses, the right-eye image and the left-eye image are displayed alternately).
  • Furthermore, any display means capable of displaying a right-eye image and a left-eye image, such as a head-mounted display (HMD), can be applied to the present application.
  • When the occupant 6 sees the image displayed on the front display 4 reflected by the windshield 5, the occupant 6 perceives the image not at the position of the windshield 5 but as a virtual image 10 at a position farther away than the windshield 5.
  • The virtual image 10 is displayed superimposed on the surrounding environment (landscape, real scene) in front of the vehicle, and can, for example, be displayed superimposed on an arbitrary object located in front of the vehicle (road surface, building, object to be warned of, etc.).
  • The position at which the virtual image 10 is generated is determined by the position of the front display 4; specifically, the imaging distance L is determined by the distance (optical path length) along the optical path from the position where the image is displayed on the front display 4 to the windshield 5. In the present embodiment, the optical path length is set so that the imaging distance L is 1.5 m.
  • In the present embodiment, the front display 4 is used as the means for displaying the image superimposed on the surrounding environment in front of the vehicle, but other means may be used. For example, a windshield display (WSD) that displays an image on the windshield 5 may be used: an image may be projected from a projector onto the windshield 5 used as a screen, or the windshield 5 may be a transmissive liquid crystal display. As with the front display 4, the image displayed on the windshield 5 by the WSD is an image superimposed on the surrounding environment in front of the vehicle.
  • A front camera 11 is installed above the front bumper of the vehicle, behind the rear-view mirror, or the like. The front camera 11 is an imaging device having a camera that uses a solid-state image sensor such as a CCD, and is installed with its optical axis facing forward in the traveling direction of the vehicle. By performing image processing on the captured image taken by the front camera 11, the condition of the forward environment viewed by the occupant 6 through the windshield (that is, the environment on which the virtual image 10 is superimposed) is detected. A sensor such as a millimeter-wave radar may be used instead of the front camera 11.
  • An in-vehicle camera 12 is installed on the upper surface of the instrument panel of the vehicle. The in-vehicle camera 12 is an imaging device having a camera that uses a solid-state image sensor such as a CCD, and is installed with its optical axis facing the driver's seat. The range in which the occupant's face is generally expected to be located inside the vehicle is set as the detection range (the imaging range of the in-vehicle camera 12), and the face of the occupant 6 sitting in the driver's seat is imaged. Image processing is performed on the captured image taken by the in-vehicle camera 12 to detect the positions of the left and right eyes of the occupant 6 (the line-of-sight start point) and the line-of-sight direction.
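As a rough illustration of the eye-position detection described above, the sketch below finds eye regions in a captured frame with OpenCV's bundled Haar cascade and converts pixel coordinates to approximate 3D coordinates by assuming a fixed, known distance from the camera to the driver's face. The cascade, the camera intrinsics, and the fixed-depth assumption are all illustrative choices; the patent does not prescribe any particular detection method.

```python
# Illustrative sketch only: detect eye positions in an in-vehicle camera frame
# and estimate rough 3D coordinates by assuming a fixed camera-to-face distance.
import cv2

# Assumed camera intrinsics and nominal driver head distance (not from the patent).
FX, FY = 900.0, 900.0          # focal lengths in pixels
CX, CY = 640.0, 360.0          # principal point for a 1280x720 sensor
HEAD_DISTANCE_M = 0.7          # assumed distance from camera to the driver's eyes

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_positions(frame):
    """Return a list of approximate (x, y, z) eye positions in metres,
    expressed in the camera coordinate system."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    positions = []
    for (x, y, w, h) in eyes:
        u, v = x + w / 2.0, y + h / 2.0        # eye centre in pixel coordinates
        # Back-project through a pinhole model at the assumed depth.
        z = HEAD_DISTANCE_M
        positions.append(((u - CX) * z / FX, (v - CY) * z / FY, z))
    return positions
```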
  • FIG. 4 is a block diagram showing the navigation device 3 according to the present embodiment.
  • As shown in FIG. 4, the navigation device 3 includes a current position detection unit 13 that detects the current position of the vehicle 2 in which the navigation device 3 is mounted, a data recording unit 14 that records various data, a navigation ECU 15 that performs various kinds of arithmetic processing based on input information, an operation unit 16 that receives operations from the user, a liquid crystal display 17 that displays to the user a map around the vehicle and facility information, a speaker 18 that outputs voice guidance for route guidance, a DVD drive 19 that reads DVDs as storage media, and a communication module 20 for communicating with an information center such as a VICS (registered trademark: Vehicle Information and Communication System) center. The navigation device 3 is also connected, via an in-vehicle network such as CAN, to the front display 4, the front camera 11, the in-vehicle camera 12, and the like described above.
  • The current position detection unit 13 includes a GPS 21, a vehicle speed sensor 22, a steering sensor 23, a gyro sensor 24, and the like, and can detect the current position, heading, and traveling speed of the vehicle, the current time, and the like.
  • In particular, the vehicle speed sensor 22 is a sensor for detecting the moving distance and speed of the vehicle; it generates pulses in accordance with the rotation of the drive wheels of the vehicle and outputs a pulse signal to the navigation ECU 15, which counts the generated pulses to calculate the rotational speed of the drive wheels and the moving distance. The navigation device 3 need not include all four types of sensors and may include only one or several of them.
  • The data recording unit 14 includes a hard disk (not shown) as an external storage device and recording medium, and a recording head (not shown), which is a driver for reading the map information DB 31 and predetermined programs recorded on the hard disk and for writing predetermined data to the hard disk.
  • The data recording unit 14 may have a flash memory, a memory card, or an optical disk such as a CD or DVD instead of the hard disk. The map information DB 31 may also be stored on an external server and acquired by the navigation device 3 through communication.
  • The map information DB 31 is a storage means that stores, for example, link data 32 regarding roads (links), node data 33 regarding node points, point data 34 regarding points such as facilities, intersection data regarding each intersection, map display data for displaying a map, search data for searching for a route, search data for searching for a point, and the like.
  • As the link data 32, for example, a link ID identifying the link, end node information identifying the nodes located at the ends of the link, the road type of the road formed by the link, and the like are stored. As the node data 33, a node ID identifying the node, the position coordinates of the node, connection destination node information identifying the nodes to which the node is connected via links, and the like are stored. As the point data 34, various kinds of information about facilities that can be set as a destination are stored, for example an ID identifying the facility, the facility name, its position coordinates, genre, address, and the like.
  • The navigation ECU (electronic control unit) 15 is an electronic control unit that controls the entire navigation device 3, and includes a CPU 41 as an arithmetic device and control device, and internal storage devices such as a RAM 42 used as working memory when the CPU 41 performs various arithmetic processes, a ROM 43 storing programs including the driving support processing program (FIG. 5) described later, and a flash memory 44 for storing programs.
  • The navigation ECU 15 has various means as processing algorithms. For example, the position detection means detects the positions of both of the user's eyes, and the display position determination means determines the display position of the image on the front display 4 based on the positions of the user's eyes detected by the position detection means.
  • The operation unit 16 is operated, for example, when inputting a departure point as the travel start point and a destination as the travel end point, and has a plurality of operation switches (not shown) such as various keys and buttons. The navigation ECU 15 performs control so as to execute the corresponding operations based on the switch signals output when the switches are pressed. The operation unit 16 may have a touch panel provided on the front surface of the liquid crystal display 17, and may also have a microphone and a voice recognition device.
  • On the liquid crystal display 17, a map image including roads, traffic information, operation guidance, operation menus, key guidance, the guide route from the departure point to the destination, guide information along the guide route, news, weather forecasts, the time, e-mail, TV programs, and the like are displayed. The liquid crystal display 17 may be omitted if the front display 4 is also used to display the map image and the like.
  • The speaker 18 outputs, based on instructions from the navigation ECU 15, voice guidance for guiding travel along the guide route and guidance about traffic information.
  • The DVD drive 19 is a drive that can read data recorded on a recording medium such as a DVD or CD; based on the read data, it reproduces music or video, updates the map information DB 31, and so on. A card slot for reading and writing a memory card may be provided instead of the DVD drive 19.
  • The communication module 20 is a communication device for receiving traffic information, including congestion information, regulation information, and traffic accident information, transmitted from a traffic information center such as a VICS center or a probe center. For example, a mobile phone or a DCM corresponds to the communication module 20.
  • FIG. 5 is a flowchart of the driving support processing program according to the present embodiment.
  • The driving support processing program is executed after the ACC (accessory) power supply of the vehicle is turned on, and provides various information to the occupants by superimposing the image displayed on the front display 4 on the surrounding environment of the vehicle.
  • The programs shown in the flowcharts of FIGS. 5 and 8 below are stored in the RAM 42 or the ROM 43 of the navigation device 3 and are executed by the CPU 41.
  • The information provided via the front display 4 includes, for example, warnings about objects (for example, other vehicles, pedestrians, guide signs), warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.), the lane markings of the lane in which the vehicle travels, advertisement images, facility information, map images, traffic information, news, weather forecasts, the time, the screen of a connected smartphone, TV programs, and the like.
  • First, in step (hereinafter abbreviated as S) 1, the CPU 41 acquires the current position information of the host vehicle, the map information around the host vehicle, and the guide route set by the navigation device 3.
  • The current position of the host vehicle is detected using, for example, the GPS 21 and the vehicle speed sensor 22. The map information around the host vehicle is acquired from the map information DB 31 or an external server, and includes information identifying the shapes of roads and lane sections.
  • Next, the CPU 41 performs image processing such as binarization on the captured image taken by the front camera 11 to detect the road surface of the road on which the vehicle is traveling, which is the target on which the virtual image of the guide arrow is to be superimposed.
  • The CPU 41 then specifies, as three-dimensional position coordinates, the specific position of the road surface on which the virtual image of the guide arrow is to be superimposed (hereinafter, the superimposition target road surface). For example, the road surface of the road up to 100 m ahead of the current position of the vehicle along the guide route (including the road surface of the road containing the guidance intersection, if there is one) is set as the superimposition target road surface.
  • The position of the superimposition target road surface may be specified based on the captured image taken by the front camera 11, or may be specified using a sensor such as a millimeter-wave radar. If the navigation device 3 can acquire map information including three-dimensional information, the position of the superimposition target road surface can also be specified based on the map information and the current position of the vehicle.
  • The position of the superimposition target road surface is continuously specified until the driving support processing program ends; for example, if it is specified for each frame of the image captured by the front camera 11, the current position of the superimposition target road surface can be tracked continuously.
  • Next, the CPU 41 determines whether or not to provide information about the host vehicle via the front display 4. The information about the host vehicle includes, for example, the current vehicle speed, the shift position, the remaining energy level, and the like. The provision of this information can be switched ON and OFF by an operation of a vehicle occupant, or may be switched ON or OFF automatically based on the surrounding conditions or the state of the vehicle.
  • Next, in S5, the CPU 41 performs the display range determination processing (FIG. 8) described later. In the display range determination processing, the range in which the guide image is projected onto the windshield 5 (that is, the range in which the guide image is displayed on the front display 4) is determined.
  • In the present embodiment, two types of images are displayed: an image of a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3 (hereinafter, the first guide image) and an image presenting information about the vehicle (hereinafter, the second guide image). Therefore, in S5, the display range of each of the first guide image and the second guide image is determined.
  • The first guide image is an image that needs to be kept superimposed on a specific object (specifically, the superimposition target road surface) included in the surrounding environment viewed from the vehicle. Since the accuracy of the superimposition position matters, the first guide image is displayed as only one of the right-eye image and the left-eye image on the front display 4 in order to prevent the image from being seen double due to the parallax between the two eyes (that is, the first guide image is viewed by only one of the right eye and the left eye). Therefore, in S5, the display range of the first guide image within the selected one of the right-eye image and the left-eye image is determined as described later.
  • The second guide image is an image that does not need to be kept superimposed on a specific object included in the surrounding environment viewed from the vehicle. Since the accuracy of the superimposition position matters less than for the first guide image, the second guide image is displayed as both the right-eye image and the left-eye image on the front display 4 so that it can be viewed more comfortably (that is, the second guide image is viewed by both the right eye and the left eye). Therefore, in S5, the display range of the second guide image within both the right-eye image and the left-eye image is determined as described later.
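The rule just described, namely that an image which must stay registered to a specific object is shown to a single eye while an image that needs no precise registration is shown to both eyes, can be captured in a small policy function. The data class and names below are illustrative assumptions, not terms from the patent.

```python
# Illustrative sketch: decide which eye image(s) a guide image is drawn into.
from dataclasses import dataclass

@dataclass
class GuideImage:
    name: str
    must_track_object: bool   # True if the image must stay superimposed on a specific object

def target_eyes(image: GuideImage, selected_eye: str) -> tuple:
    """Return the eye images the guide image is drawn into ('left' and/or 'right')."""
    if image.must_track_object:
        return (selected_eye,)            # first guide image: one eye only
    return ("left", "right")              # second guide image: both eyes

print(target_eyes(GuideImage("guide arrow", True), "left"))    # ('left',)
print(target_eyes(GuideImage("vehicle speed", False), "left")) # ('left', 'right')
```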
  • Subsequently, in S6, the CPU 41 transmits a control signal to the front display 4 and displays the first guide image on the front display 4 in the display range, determined in S5, within the right-eye image or the left-eye image. The second guide image is displayed in the display ranges, determined in S5, within both the right-eye image and the left-eye image.
  • As a result, the virtual images of the first guide image and the second guide image are viewed by the occupant of the vehicle superimposed on the surrounding environment in front of the vehicle. In particular, the virtual image of the first guide image is viewed by the occupant of the vehicle as overlapping the superimposition target road surface.
  • FIG. 6 shows an example of the contents of the left-eye image and the right-eye image displayed on the front display 4 in S6, and of the virtual images of the first guide image and the second guide image viewed by the occupant of the vehicle as a result of viewing the left-eye image and the right-eye image.
  • In the example of FIG. 6, the first guide image 55 is displayed in the left-eye image 56, while the second guide image 57 is displayed in both the left-eye image 56 and the right-eye image 58.
  • In the left-eye image 56 and the right-eye image 58, the display areas in which the first guide image 55 and the second guide image 57 are not displayed are black images or images with zero brightness. Therefore, the region of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is a black image or an image with zero brightness, so that even when the front display 4 is viewed with both eyes, only one eye views the first guide image.
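A minimal sketch of the frame composition described above: the first guide image is drawn only into the buffer for the selected eye, the second guide image into both buffers, and every other pixel is left black (zero brightness), so the unselected eye sees nothing where the first guide image sits. The buffer layout, resolution, and helper names are assumptions for illustration.

```python
# Illustrative sketch: build the left-eye and right-eye frames sent to the
# front display. Regions with no guide image stay black (zero brightness),
# so the first guide image is visible to only the selected eye.
import numpy as np

WIDTH, HEIGHT = 800, 480   # assumed display resolution

def compose_frames(first_image, first_rect, second_image, second_rect, selected_eye):
    """first_rect / second_rect are (x, y) top-left positions determined by the
    display range determination processing; selected_eye is 'left' or 'right'."""
    left = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)    # all-black background
    right = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)

    def paste(frame, image, pos):
        x, y = pos
        h, w = image.shape[:2]
        frame[y:y + h, x:x + w] = image

    # First guide image: only the frame for the selected eye.
    paste(left if selected_eye == "left" else right, first_image, first_rect)

    # Second guide image: both frames, at a fixed position (e.g. lower centre).
    paste(left, second_image, second_rect)
    paste(right, second_image, second_rect)
    return left, right
```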
  • As a result, the virtual image of the first guide image 55 superimposed on the surrounding environment is viewed with the left eye through the windshield 5 of the vehicle, while the virtual image of the second guide image 57 is viewed with both eyes.
  • The virtual image of the first guide image 55 is viewed superimposed on the surrounding environment, in particular on the superimposition target road surface 59 in front of the vehicle.
  • FIG. 6 shows an example of the first guide image 55 displayed when the guide route turns right at the guidance intersection ahead; in this case the first guide image 55 is an image of an arrow indicating a right turn at the guidance intersection.
  • As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at the guidance intersection without moving the line of sight away from the front in the traveling direction. Moreover, since the virtual image of the first guide image 55 is viewed by only the left eye of the occupant of the vehicle, the phenomenon in which the image is seen double due to the parallax between the two eyes can be prevented.
  • The first guide image 55 is basically displayed while the guide route is set (a straight-ahead arrow is displayed when traveling straight), but it may instead be displayed only when the vehicle approaches the guidance intersection.
  • The virtual image of the second guide image 57 is viewed superimposed on the surrounding environment near the center of the lower edge of the windshield 5 of the vehicle. Unlike the first guide image, the second guide image 57 is not superimposed on a specific object but is viewed fixed at the same position on the windshield 5.
  • In the example of FIG. 6, the second guide image 57 is an image showing the current vehicle speed of the vehicle. As a result, the occupant can grasp the current vehicle speed without moving the line of sight away from the front in the traveling direction. Moreover, since the virtual image of the second guide image 57 is viewed by both eyes of the occupant, there is no sense of discomfort compared with viewing it with one eye.
  • The display of the second guide image 57 is basically performed at all times while the provision of information about the host vehicle using the front display 4 is turned ON.
  • The display range of the first guide image 55 is newly determined in S5 and updated as needed. That is, when the surrounding environment viewed from the vehicle changes, the display position and display size of the first guide image 55 change accordingly; in other words, the first guide image 55 is an image associated with the position on which it is superimposed in the surrounding environment.
  • On the other hand, since the second guide image 57 does not need to be superimposed on a specific object, it is basically displayed at a fixed position on the windshield 5 (for example, near the center of the lower edge). That is, the display position of the second guide image 57 is not linked to changes in the surrounding environment viewed from the host vehicle, and the display range initially determined in S5 basically remains unchanged; in other words, the second guide image 57 is an image that is not associated with a position on which it is superimposed in the surrounding environment.
  • Next, in S7, the CPU 41 determines whether to switch the eye of the occupant that views the first guide image to the other eye. For example, while the left eye is viewing the image, if the occupant changes posture and the left eye leaves the detection area within which the in-vehicle camera 12 can detect the eye positions of the occupant 6, if the left eye approaches the edge of the detection area within a predetermined distance (for example, within 5 cm), or if the fatigue level of the left eye exceeds a threshold value, it is determined that the eye viewing the first guide image should be switched to the right eye.
  • Likewise, while the right eye is viewing the image, if the right eye leaves the detection area, approaches its edge within the predetermined distance (for example, within 5 cm), or its fatigue level exceeds the threshold value, it is determined that the eye viewing the first guide image should be switched to the left eye.
  • The degree of eye fatigue is estimated from the viewing time of the first guide image. For example, it is determined that the fatigue level of the left eye exceeds the threshold value when the left eye has been viewing the first guide image for a predetermined time (for example, 10 minutes) or longer.
  • The timing for switching the eye that views the first guide image to the other eye is a timing at which the first guide image is hidden. In the present embodiment, the first guide image is basically displayed while the guide route is set, but in a display mode in which display and non-display are repeated at regular intervals (for example, every 1 sec), so the switch can be made while the image is hidden.
  • The timing for switching the eye that views the first guide image to the other eye may also be a timing at which the display mode of the first guide image changes. Since the first guide image is a guide arrow indicating the traveling direction of the vehicle along the guide route, the timing of changing its display mode may be, besides the timing of changing the interval of the guide arrows, the timing of changing the shape of the guide arrows, the timing of changing the display color of the guide arrows, or the like.
  • If it is determined that the eye of the occupant viewing the first guide image should be switched to the other eye (S7: YES), the eye viewing the first guide image is switched to the other eye (S8). Thereafter, the display range of the first guide image 55 is determined based on the position of the eye to which viewing has been switched, and the display range is updated.
  • That is, if the first guide image 55 has been displayed in the left-eye image 56, it is thereafter displayed in the right-eye image 58; conversely, if it has been displayed in the right-eye image 58, it is thereafter displayed in the left-eye image 56. As a result, the first guide image 55 is viewed by the occupant at the same position before and after the eye that views it is switched.
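The switching conditions above, namely the viewing eye leaving or nearing the edge of the detection area or its accumulated viewing time exceeding a fatigue threshold, can be summarised in a small decision helper. This is a sketch under assumed data structures; the 5 cm margin and the 10-minute threshold are the example values given in the description.

```python
# Illustrative sketch of the S7 decision: should the eye viewing the first
# guide image be switched to the other eye?
EDGE_MARGIN_M = 0.05        # 5 cm, example value from the description
FATIGUE_LIMIT_S = 600.0     # 10 minutes, example value from the description

def should_switch_eye(eye_pos, detection_area, viewing_time_s):
    """eye_pos: (x, y, z) of the eye currently viewing the first guide image,
    or None if it is no longer detected. detection_area: ((x_min, y_min),
    (x_max, y_max)) of the in-vehicle camera's detection range in the same
    coordinates (depth ignored for simplicity)."""
    if eye_pos is None:
        return True                                   # eye left the detection area
    (x_min, y_min), (x_max, y_max) = detection_area
    x, y, _ = eye_pos
    near_edge = (
        x - x_min < EDGE_MARGIN_M or x_max - x < EDGE_MARGIN_M or
        y - y_min < EDGE_MARGIN_M or y_max - y < EDGE_MARGIN_M
    )
    fatigued = viewing_time_s >= FATIGUE_LIMIT_S
    return near_edge or fatigued

# The actual switch would then be carried out while the guide arrow is in its
# periodic non-display interval, as described above.
```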
  • Subsequently, in S9, the display range determination processing (FIG. 8) described later is performed in the same manner as in S5. However, in S9, only the first guide image is targeted and only the display range of the first guide image is determined.
  • Next, in S10, the CPU 41 transmits a control signal to the front display 4 and displays the first guide image on the front display 4 in the display range, determined in S9, within the right-eye image or the left-eye image.
  • As a result, the virtual image of the first guide image is viewed by the occupant of the vehicle superimposed on the surrounding environment in front of the vehicle; in particular, it is viewed as overlapping the superimposition target road surface.
  • FIG. 7 shows an example of the contents of the left-eye image and the right-eye image displayed on the front display 4 in S10, and of the virtual image of the first guide image viewed by the occupant of the vehicle as a result of viewing the left-eye image and the right-eye image.
  • In the example of FIG. 7, the first guide image 55 is displayed in the left-eye image 56. In the left-eye image 56, the display area in which the first guide image 55 is not displayed, and the entire right-eye image 58, are black images or images with zero brightness. Therefore, the region of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is a black image or an image with zero brightness, so that even when the front display 4 is viewed with both eyes, only one eye views the first guide image.
  • As a result, the virtual image of the first guide image 55 superimposed on the surrounding environment is viewed with the left eye through the windshield 5 of the vehicle. The virtual image of the first guide image 55 is viewed superimposed on the surrounding environment, in particular on the superimposition target road surface 59 in front of the vehicle.
  • FIG. 7 shows an example of the first guide image 55 displayed when the guide route turns right at the guidance intersection ahead; in this case the first guide image 55 is an image of an arrow indicating a right turn at the guidance intersection.
  • As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at the guidance intersection without moving the line of sight away from the front in the traveling direction. Moreover, since the virtual image of the first guide image 55 is viewed by only the left eye of the occupant, the phenomenon in which the image is seen double due to the parallax between the two eyes can be prevented.
  • The first guide image 55 is basically displayed while the guide route is set (a straight-ahead arrow is displayed when traveling straight), but it may instead be displayed only when the vehicle approaches the guidance intersection. Thereafter, the process returns to S7.
  • FIG. 8 is a flowchart of a sub-process program of the display range determination process.
  • First, in S11, the CPU 41 detects the positions of the left and right eyes of the occupant of the vehicle (the line-of-sight start point) based on the captured image taken by the in-vehicle camera 12. The detected eye positions are specified as three-dimensional position coordinates.
  • Next, in S12, the CPU 41 selects either the right eye or the left eye as the eye of the occupant that views the first guide image. For example, the selection is made according to the following criteria (1) to (4), where criterion (1) is given the highest priority and any one of criteria (2) to (4) may be adopted (a sketch of this selection logic is given after the list).
  • (1) If only one eye is included in the detection area within which the in-vehicle camera 12 can detect the eye positions of the occupant 6, the eye that is included is selected.
  • (2) If both eyes are included in the detection area and the dominant eye of the occupant can be determined, the dominant eye of the occupant is selected.
  • (3) The fatigue levels of the left and right eyes are detected from the image captured by the in-vehicle camera 12, and the eye with the lower fatigue level is selected.
  • (4) If both eyes are included in the detection area, the eye closer to the center of the detection area is selected.
  • The eye of the occupant selected in S12 to view the first guide image need not remain fixed thereafter and may be switched to the other eye in the subsequent processing of S8; however, the eye that views the first guide image may also be kept fixed.
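The selection criteria (1) to (4) can be read as a priority rule: criterion (1) always applies first, and one of (2) to (4) serves as the tie-breaker when both eyes are detected. The sketch below assumes simple dictionaries for the detected eyes; none of the names or structures are prescribed by the patent.

```python
# Illustrative sketch of the S12 eye selection: criterion (1) first, then one
# configurable tie-breaker corresponding to criteria (2)-(4).
def select_eye(left, right, detection_center, dominant_eye=None, tiebreak="center"):
    """left / right: dicts like {"pos": (x, y), "in_area": bool, "fatigue": float},
    or None if that eye was not detected. Returns 'left' or 'right'."""
    left_ok = left is not None and left["in_area"]
    right_ok = right is not None and right["in_area"]

    # (1) If only one eye lies inside the detection area, select that eye.
    if left_ok != right_ok:
        return "left" if left_ok else "right"
    if not (left_ok and right_ok):
        return "right"                       # neither eye usable; arbitrary fallback

    # Both eyes are inside the detection area: apply one of the other criteria.
    if tiebreak == "dominant" and dominant_eye is not None:          # (2)
        return dominant_eye
    if tiebreak == "fatigue":                                        # (3)
        return "left" if left["fatigue"] <= right["fatigue"] else "right"

    # (4) Default: the eye closer to the centre of the detection area.
    def sq_dist(eye):
        dx = eye["pos"][0] - detection_center[0]
        dy = eye["pos"][1] - detection_center[1]
        return dx * dx + dy * dy
    return "left" if sq_dist(left) <= sq_dist(right) else "right"
```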
  • Next, the CPU 41 determines whether or not the display of the front display 4 is ON. The display of the front display 4 can be switched ON and OFF by an operation of a vehicle occupant, or may be switched ON or OFF automatically based on the surrounding conditions or the state of the vehicle.
  • Next, in S14, the CPU 41 acquires the position coordinates of the windshield 5 onto which the image is projected by the front display 4. The position coordinates of the windshield 5 are specified as three-dimensional position coordinates.
  • Then, in S15, based on the position coordinates of the superimposition target road surface detected in S3, the position coordinates of the eyes of the occupant of the vehicle (the line-of-sight start point) detected in S11, and the position coordinates of the windshield 5 acquired in S14, the CPU 41 calculates the position coordinates at which the first guide image and the second guide image are to be displayed on the windshield 5.
  • The display position of the first guide image is set to a position at which the virtual image of the first guide image is viewed by the vehicle occupant as overlapping the superimposition target road surface, as shown in FIGS. 8 and 9; the display position of the second guide image is a fixed position on the windshield 5 (for example, near the center of the lower edge).
  • Thereafter, in S16, the CPU 41 determines the projection ranges of the first guide image and the second guide image on the windshield 5 based on the position coordinates calculated in S15, and from the determined projection ranges also determines the display ranges of the first guide image and the second guide image on the front display 4. As described above, the front display 4 of the present embodiment can display a right-eye image viewed only by the right eye of the occupant 6 and a left-eye image viewed only by the left eye of the occupant 6. For the first guide image, the display range within the image corresponding to the eye selected in S12 (the right-eye image if the right eye views it) is specified from among the left-eye image and the right-eye image; for the second guide image, the display ranges within both the left-eye image and the right-eye image are specified.
  • FIG. 9 is a diagram illustrating the details of the processing for determining the display range of the first guide image on the front display 4.
  • As shown in FIG. 9, straight lines connecting the line-of-sight start point S and the superimposition target road surface 59 are drawn, and the range in which those straight lines cross the windshield 5 is determined as the projection range of the first guide image on the windshield 5.
  • The display range of the first guide image on the front display 4 is then determined based on the visual angle from the line-of-sight start point S to the projection range and on the inclination angle of the windshield 5.
  • Since the first guide image is viewed with only one eye, the line-of-sight start point S is a single point, and the processing for determining the display range of the first guide image on the front display 4 can therefore be simplified compared with the case where the image is viewed with both eyes.
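Geometrically, the determination in FIG. 9 reduces to intersecting the straight line from the single line-of-sight start point S through a point on the superimposition target road surface with the plane of the windshield. The sketch below does this for one point; the coordinate frame, plane representation, and example values are assumptions for illustration only.

```python
# Illustrative sketch: project a point on the superimposition target road surface
# onto the windshield plane along the line of sight from the single start point S.
import numpy as np

def project_onto_windshield(sight_start, road_point, plane_point, plane_normal):
    """Intersect the line from sight_start (the eye) through road_point with the
    windshield plane given by a point on it and its (tilted) normal vector.
    All inputs are 3D vectors in the same vehicle coordinate frame."""
    s = np.asarray(sight_start, dtype=float)
    p = np.asarray(road_point, dtype=float)
    q = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    direction = p - s
    t = np.dot(q - s, n) / np.dot(direction, n)   # parameter of the intersection
    return s + t * direction                       # point on the windshield

# Example with assumed coordinates (metres): eye behind the wheel, road point
# 30 m ahead on the ground, windshield approximated as a tilted plane.
eye = (0.0, 1.2, 0.0)
road = (0.0, 0.0, 30.0)
ws_point = (0.0, 1.0, 1.0)
ws_normal = (0.0, 0.4, -1.0)      # tilted backwards, roughly facing the driver
print(project_onto_windshield(eye, road, ws_point, ws_normal))
```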
  • As described above in detail, the superimposed image display device 1 according to the present embodiment has the front display 4 capable of displaying a right-eye image viewed only by the user's right eye and a left-eye image viewed only by the user's left eye, detects the positions of both eyes of the vehicle occupant (S11), and determines the display position of the guide image on the front display 4 based on the detected positions of both eyes (S16). In particular, by displaying the first guide image as either the right-eye image or the left-eye image, only one eye of the occupant views the first guide image when it is viewed superimposed on the surrounding environment around the vehicle, so the phenomenon in which the first guide image is seen double due to the parallax between the two eyes can be prevented. Further, because the positions of both eyes of the occupant can be detected, the first guide image can be viewed superimposed at an appropriate position regardless of the current positions of the occupant's eyes and of which eye views the image.
  • The present invention is not limited to the above embodiment, and it goes without saying that various improvements and modifications are possible without departing from the gist of the present invention.
  • For example, in the present embodiment, the virtual image is generated in front of the windshield 5 of the vehicle 2 by the front display 4, but the virtual image may be generated in front of a window other than the windshield 5. The object on which the image is reflected by the front display 4 may be a visor (combiner) installed around the windshield 5 instead of the windshield 5 itself.
  • In the present embodiment, the front display 4 is used as the means for displaying the image superimposed on the surrounding environment, but a windshield display (WSD) that displays an image on the windshield 5 may be used instead.
  • In the present embodiment, the first guide image is displayed as either the right-eye image or the left-eye image, while the second guide image is displayed as both the right-eye image and the left-eye image; however, the second guide image may also be displayed as either the right-eye image or the left-eye image. In that case, the first guide image and the second guide image may be viewed with the same eye or with different eyes (for example, the first guide image with the left eye and the second guide image with the right eye).
  • In the present embodiment, the first guide image is an image of a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3, but other images may be used. For example, a warning image for an object to be warned of (for example, another vehicle, a pedestrian, or a guide sign), an image of the lane markings of the lane in which the vehicle travels, or the like may be used.
  • In the present embodiment, the first guide image is an image that needs to be kept superimposed on a specific object (for example, the road surface) included in the surrounding environment viewed from the vehicle, but it may instead be an image that does not need to be kept superimposed on a specific object (for example, a smartphone screen image, a map image, etc.).
  • In the present embodiment, the processing of the driving support processing program (FIG. 5) is executed by the navigation ECU 15 of the navigation device 3, but the executing entity can be changed as appropriate. For example, it may be executed by a control unit of the front display 4, a vehicle control ECU, or another vehicle-mounted device.
  • When the control unit of the front display 4 executes the processing, the superimposed image display device according to the present invention may be configured by the front display 4 alone.
  • the superimposed image display device can also have the following configuration, and in that case, the following effects are exhibited.
  • the first configuration is as follows.
  • A superimposed image display device (1) that causes an image displayed on an image display surface (4) to be viewed superimposed on the surrounding environment around a vehicle (2) by having a user (6) view, with both eyes, the image display surface, which is mounted on the vehicle and capable of displaying a right-eye image (58) visible only to the user's right eye and a left-eye image (56) visible only to the user's left eye,
  • the device detecting the positions of both eyes of the user,
  • wherein, when a guide image (55) to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
  • According to the superimposed image display device having the above configuration, the guide image is viewed by only one eye of the user, so the phenomenon in which the guide image is seen double due to binocular parallax can be prevented. Further, by making the positions of both eyes of the user detectable, the guide image can be viewed superimposed at an appropriate position regardless of the current positions of the user's eyes and of which eye views the image.
  • the second configuration is as follows.
  • The guide image includes a first guide image (55) displayed as either the right-eye image (58) or the left-eye image (56), and a second guide image (57) displayed as both the right-eye image and the left-eye image.
  • According to the superimposed image display device having the above configuration, having the first guide image viewed with only one eye prevents the phenomenon in which the guide image is seen double due to binocular parallax,
  • while the second guide image can be viewed with both eyes without discomfort.
  • the third configuration is as follows.
  • The first guide image (55) is an image that needs to be kept superimposed on a specific object (59) included in the surrounding environment viewed from the vehicle (2),
  • and the second guide image (57) is an image that does not need to be kept superimposed on a specific object included in the surrounding environment viewed from the vehicle.
  • According to the superimposed image display device having the above configuration, the first guide image, which requires accuracy of the superimposition position, is viewed with only one eye, so the phenomenon in which the guide image is seen double due to binocular parallax can be prevented, while the second guide image, which does not require accuracy of the superimposition position, can be viewed with both eyes.
  • the fourth configuration is as follows.
  • When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56),
  • the area of the other image corresponding to the guide image displayed in the one image is a black image.
  • the fifth configuration is as follows.
  • When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56),
  • the area of the other image corresponding to the guide image displayed in the one image is an image with zero luminance.
  • According to the superimposed image display device having the above configurations, even when the image display surface is viewed with both eyes of the user, the guide image can be viewed with only one eye.
  • the sixth configuration is as follows.
  • When the guide image (55) is displayed on the image display surface (4),
  • the device has eye selecting means (41) for selecting the eye of the user (6) that is to view the guide image,
  • and the guide image is displayed as the right-eye image (58) or the left-eye image (56), whichever corresponds to the eye selected by the eye selecting means.
  • According to the superimposed image display device having the above configuration, when the guide image is viewed superimposed on the surrounding environment of the vehicle, the guide image can be viewed only by the selected one of the user's eyes.
  • the seventh configuration is as follows.
  • The eye selecting means (41) preferentially selects an eye included in the detection range of the position detecting means (41).
  • According to the superimposed image display device having the above configuration, the guide image is viewed with an eye whose current position can be detected on the device side; as a result, the guide image can be viewed superimposed at an appropriate position.
  • the eighth configuration is as follows.
  • The eye selecting means (41) switches the user's eye to be selected as the target for viewing the guide image based on the positional relationship of the user's eyes with respect to the detection range of the position detecting means (41).
  • According to the superimposed image display device having the above configuration, even when the user's eyes move relative to the detection range, the guide image can continue to be viewed with an eye whose position can be detected, so the guide image can continue to be viewed superimposed at an appropriate position.
  • the ninth configuration is as follows.
  • The device has fatigue degree detecting means (41) for detecting the degree of fatigue of the user's eyes, and the eye selecting means (41) switches the user's eye to be selected as the target for viewing the guide image (55) based on the detected degree of fatigue.
  • According to the superimposed image display device having the above configuration, even if fatigue accumulates in the eye viewing the guide image, the eye used for viewing can be switched to the other eye, so the guide image can continue to be viewed with only one eye without increasing the burden on the user.
  • the tenth configuration is as follows.
  • The eye selecting means (41) switches the user's eye to be selected as the target for viewing the guide image at a timing when the guide image (55) is hidden. According to the superimposed image display device having the above configuration, even when the eye used for viewing the guide image is switched to the other eye, the user does not feel discomfort in the guide image being viewed.
  • the eleventh configuration is as follows.
  • The eye selecting means (41) switches the user's eye to be selected as the target for viewing the guide image at a timing when the display mode of the guide image (55) is changed. According to the superimposed image display device having the above configuration, even when the eye used for viewing the guide image is switched to the other eye, the user does not feel discomfort in the guide image being viewed.
  • the twelfth configuration is as follows.
  • The display position determining means (41) determines the display position of the guide image on the image display surface (4) based on the position at which the guide image (55) is to be superimposed in the surrounding environment and the position of the eye selected by the eye selecting means (41).
  • According to the superimposed image display device having the above configuration, the processing for determining the display position of the guide image can be simplified compared with the case where the guide image is viewed with both eyes. Further, regardless of the positions of the user's eyes, the guide image can reliably be viewed with only the selected eye of the user.
  • the thirteenth configuration is as follows. The image displayed on the image display surface (4) is reflected on the windshield (5) of the vehicle (2) and viewed by the user (6), so that a virtual image of the image displayed on the image display surface is viewed in the surrounding environment around the vehicle. According to the superimposed image display device having the above configuration, the guide image can be viewed superimposed on the surrounding environment of the vehicle by means of the virtual image of the image.
  • the fourteenth configuration is as follows. By having the user (6) view the image display surface (4) through polarizing glasses or liquid crystal shutter glasses, the right-eye image (58) to be viewed only by the user's right eye and the left-eye image (56) to be viewed only by the user's left eye can each be displayed on the image display surface. According to the superimposed image display device having the above configuration, the guide image can be viewed with only one eye of the user simply by having the user wear the polarizing glasses or the liquid crystal shutter glasses.

Abstract

Provided are a superimposed-image display device and a computer program that make it possible to prevent a guide image from being seen double due to binocular parallax, and to cause the guide image to be viewed superimposed at an appropriate position regardless of the current eye positions of the user. The superimposed-image display device is provided with a front display 4 capable of displaying a right-eye image visible only to the right eye of the user and a left-eye image visible only to the left eye of the user, and is configured such that: the positions of both eyes of a vehicle occupant are detected; the display position of a guide image on the front display 4 is determined on the basis of the detected positions of the occupant's eyes; and a first guide image in particular is displayed as either the right-eye image or the left-eye image, thereby causing the first guide image displayed on the front display 4 to be viewed superimposed on the surrounding environment around the vehicle.

Description

Superimposed image display device and computer program
The present invention relates to a superimposed image display device and a computer program that provide driving support for a vehicle.
Conventionally, various means have been used to provide a vehicle occupant with information for supporting vehicle travel, such as route guidance and warnings about obstacles. Examples include display on a liquid crystal display installed in the vehicle and voice output from a speaker. In recent years, as one such information providing means, there are devices that provide information by displaying an image superimposed on the surrounding environment (scenery, real view) of the vehicle, such as a head-up display (hereinafter referred to as HUD) or a window shield display (hereinafter referred to as WSD).
With a HUD or WSD as described above, various kinds of information can be provided to the occupant without the occupant turning his or her line of sight away from the road ahead. Furthermore, when the occupant is made to view an image superimposed on the surrounding environment, the image is sometimes shown to only one eye in addition to being shown to both eyes. For example, Japanese Patent Laid-Open No. 2015-194709 discloses that, when generating a virtual image to be viewed by the user, the virtual image is generated by having both eyes view the image when the user's stereoscopic viewing distance is less than 50 m, and by having only one eye view the image when the stereoscopic viewing distance is 50 m or more. This makes it possible to reduce the burden on the user and to generate a virtual image with good visibility.
JP-A-2015-194709 (page 5)
However, since the device of Patent Document 1 does not include means for detecting the positions of the occupant's eyes, the exact positions of the occupant's left and right eyes cannot be identified on the device side. Depending on the positions of the occupant's eyes, the virtual image may not be viewed at the correct position, or the virtual image itself may not be visible at all. This is a particularly serious problem when the image is viewed with only one eye (when only the right-eye image or only the left-eye image is displayed), because the range of eye positions from which the virtual image can be viewed becomes narrower.
The present invention has been made to solve the above conventional problems, and an object of the present invention is to provide a superimposed image display device and a computer program that prevent the phenomenon in which a guide image is seen double due to binocular parallax by having only one eye of the user view the guide image, and that allow the guide image to be viewed superimposed at an appropriate position regardless of the current positions of the user's eyes.
In order to achieve the above object, a superimposed image display device according to the present invention is mounted on a vehicle and causes an image displayed on an image display surface to be viewed superimposed on the surrounding environment around the vehicle by having the user view, with both eyes, the image display surface, which is capable of displaying a right-eye image to be viewed only by the user's right eye and a left-eye image to be viewed only by the user's left eye. The device includes position detecting means for detecting the positions of both eyes of the user, and display position determining means for determining the display position of an image on the image display surface based on the positions of both eyes of the user detected by the position detecting means, and when a guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
 Here, "causing an image to be viewed superimposed on the surrounding environment" means that the image is viewed at least overlapping somewhere in the surrounding environment (including everything visible to the user, such as the sky and buildings, in addition to the road surface and obstacles), regardless of whether a state in which the image is superimposed on a specific object (for example, the road surface or an obstacle) in the surrounding environment is maintained.
A computer program according to the present invention is a program for supporting travel of a vehicle. Specifically, it is a computer program that causes a superimposed image display device, which is mounted on a vehicle and causes an image displayed on an image display surface capable of displaying a right-eye image to be viewed only by the user's right eye and a left-eye image to be viewed only by the user's left eye to be viewed superimposed on the surrounding environment around the vehicle by having the user view the image display surface with both eyes, to function as position detecting means for detecting the positions of both eyes of the user and as display position determining means for determining the display position of an image on the image display surface based on the positions of both eyes of the user detected by the position detecting means, wherein, when a guide image to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
According to the superimposed image display device and the computer program of the present invention having the above configurations, when a guide image is viewed superimposed on the surrounding environment of the vehicle, having only one eye of the user view the guide image prevents the phenomenon in which the guide image is seen double due to binocular parallax. Further, because the positions of both eyes of the user can be detected, the guide image can be viewed superimposed at an appropriate position regardless of the current positions of the user's eyes and of which eye views the image.
FIG. 1 is a schematic configuration diagram of a superimposed image display device according to the present embodiment. FIG. 2 is a diagram showing the structure of a front display according to the present embodiment. FIG. 3 is a diagram showing how the right-eye image and the left-eye image of the front display according to the present embodiment are displayed. FIG. 4 is a block diagram showing a navigation device according to the present embodiment. FIG. 5 is a flowchart of a driving support processing program according to the present embodiment. FIG. 6 is a diagram showing an example of virtual images of a first guide image and a second guide image viewed by a vehicle occupant. FIG. 7 is a diagram showing an example of a virtual image of the first guide image viewed by a vehicle occupant. FIG. 8 is a flowchart of a sub-processing program of display range determination processing. FIG. 9 is a diagram explaining a method of determining the display range of the first guide image on the front display.
Hereinafter, an embodiment of a superimposed image display device according to the present invention will be described in detail with reference to the drawings. First, the schematic configuration of the superimposed image display device 1 according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a schematic configuration diagram of the superimposed image display device 1 according to the present embodiment.
As shown in FIG. 1, the superimposed image display device 1 basically has a navigation device 3 mounted on a vehicle 2, and a front display 4 also mounted on the vehicle 2 and connected to the navigation device 3. As described later, the front display 4 functions together with the windshield 5 of the vehicle 2 as a head-up display, and serves as information providing means that provides various kinds of information to an occupant 6 of the vehicle 2.
Here, the navigation device 3 has functions of searching for a recommended route to a destination, displaying a map image around the current position of the vehicle 2 based on map data acquired from a server or stored in memory, and providing, together with the front display 4, travel guidance along a set guide route, warnings about obstacles, and the like. The navigation device 3 does not need to have all of these functions; the present invention can be configured as long as it has at least the function of providing travel guidance along the guide route, warnings about obstacles, and the like. Details of the structure of the navigation device 3 will be described later.
Meanwhile, the front display 4 is a liquid crystal display installed inside the dashboard 7 of the vehicle 2 and has a function of displaying an image on an image display surface provided on its front face. As the backlight, for example, a CCFL (cold cathode fluorescent lamp) or a white LED is used. Instead of a liquid crystal display, the front display 4 may be an organic EL display or a combination of a liquid crystal projector and a screen.
The front display 4 functions together with the windshield 5 of the vehicle 2 as a head-up display, and is configured such that an image output from the front display 4 is reflected by the windshield 5 in front of the driver's seat and viewed by the occupant 6 of the vehicle 2. The images displayed on the front display 4 include information about the vehicle 2 and various kinds of information used to support driving by the occupant 6, for example: warnings for the occupant 6 about objects to be warned about (other vehicles, pedestrians, guide signs); the guide route set by the navigation device 3 and guidance information based on the guide route (such as arrows indicating right or left turns); warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.); the lane markings of the lane in which the vehicle travels; the current vehicle speed; the shift position; the remaining energy; advertisement images; facility information; map images; traffic information; news; weather forecasts; the time; the screen of a connected smartphone; and television programs.
The front display 4 of the present embodiment is a liquid crystal display provided with a parallax barrier or a lenticular lens, and can display an image to be viewed only by the right eye of the occupant 6 (hereinafter referred to as the right-eye image) and an image to be viewed only by the left eye of the occupant 6 (hereinafter referred to as the left-eye image). Therefore, if the above warnings, guidance information, and the like are displayed only as the right-eye image, that information can be viewed only by the right eye. Likewise, if they are displayed only as the left-eye image, that information can be viewed only by the left eye. Furthermore, if they are displayed as both the right-eye image and the left-eye image, that information can be viewed by both eyes.
For example, in a front display 4 using a parallax barrier, as shown in FIG. 2, a parallax barrier 9 having slits formed in a grid pattern is arranged in front of the image display area 8 of the front display 4. Within the image display area 8, an image displayed using the areas (pixels) 8A that are visible from the right eye of the occupant 6 through the slits of the parallax barrier 9 is an image visible only to the right eye (that is, the right-eye image). On the other hand, an image displayed using the areas (pixels) 8B that are visible from the left eye of the occupant 6 through the slits of the parallax barrier 9 is an image visible only to the left eye (that is, the left-eye image).
In the present embodiment, a camera is arranged inside the vehicle and the positions of the left and right eyes of the occupant 6 can each be detected, so the areas 8A for displaying the right-eye image and the areas 8B for displaying the left-eye image can also be corrected according to the positions of the occupant's eyes. For example, when it is detected that the position of the left eye of the occupant 6 has moved from L1 to R1, or that the position of the right eye of the occupant 6 has moved from R1 to L1, the areas 8A for displaying the right-eye image and the areas 8B for displaying the left-eye image are switched. Alternatively, as shown in FIG. 3, when the areas 8A and 8B each consist of a plurality of pixels in the horizontal direction, the areas need not be switched all at once; they can instead be shifted one pixel at a time according to the change in the positions of the user's eyes. This allows the user's right eye to correctly view the right-eye image and the user's left eye to correctly view the left-eye image regardless of the positions of the user's eyes.
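Although the embodiment does not give an algorithm for this correction, the assignment of interleaved pixel columns can be pictured with the following minimal sketch; the two-column interleaving period, the millimetres-per-column factor, and all names are illustrative assumptions rather than values from the embodiment.

```python
# Minimal sketch: assigning interleaved pixel columns to the right-eye and
# left-eye images of a parallax-barrier display based on a detected lateral
# eye offset. The 2-column interleaving period and the millimetres-per-column
# factor are illustrative assumptions, not values from the embodiment.

def column_assignment(num_columns: int, eye_offset_mm: float,
                      mm_per_column: float = 10.0) -> list[str]:
    """Return, for each display column, which eye ('R' or 'L') should see it.

    eye_offset_mm is the lateral displacement of the occupant's eyes from a
    reference position; every mm_per_column of movement shifts the interleaving
    pattern by one column instead of swapping all columns at once.
    """
    shift = int(round(eye_offset_mm / mm_per_column))  # columns to shift
    assignment = []
    for col in range(num_columns):
        # With no shift, even columns are visible to the right eye through the
        # barrier slits and odd columns to the left eye; the shift slides this
        # pattern sideways as the head moves.
        assignment.append('R' if (col + shift) % 2 == 0 else 'L')
    return assignment


if __name__ == "__main__":
    print(column_assignment(8, eye_offset_mm=0.0))   # ['R', 'L', 'R', 'L', ...]
    print(column_assignment(8, eye_offset_mm=10.0))  # pattern shifted by one column
```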
As the means for displaying the images, an organic EL display or a combination of a liquid crystal projector and a screen may be used instead of a liquid crystal display, but the means is desirably one capable of displaying a right-eye image and a left-eye image separately.
Polarizing glasses or liquid crystal shutter glasses may also be used as the means for displaying the right-eye image and the left-eye image. Specifically, the occupant 6 wears polarizing glasses or liquid crystal shutter glasses and views the front display 4 through them. The front display 4 displays images in a format corresponding to the glasses worn (for example, with liquid crystal shutter glasses, the right-eye image and the left-eye image are displayed alternately), so that the right-eye image and the left-eye image can each be displayed. Furthermore, besides the above examples, any display means capable of displaying a right-eye image and a left-eye image separately, such as a head-mounted display (HMD), is applicable to the present application.
When the occupant 6 views the image displayed on the front display 4 as reflected by the windshield 5, the image is configured to be viewed by the occupant 6 as a virtual image 10 located not at the position of the windshield 5 but at a position far beyond the windshield 5. The virtual image 10 is displayed superimposed on the surrounding environment (scenery, real view) in front of the vehicle, and can also be displayed superimposed on an arbitrary object located in front of the vehicle (a road surface, a building, an object to be warned about, or the like).
Here, the position at which the virtual image 10 is generated, more specifically the distance L from the occupant 6 to the virtual image 10 (hereinafter referred to as the imaging distance), is determined by the position of the front display 4. For example, the imaging distance L is determined by the distance (optical path length) along the optical path from the position where the image is displayed on the front display 4 to the windshield 5. For example, the optical path length is set so that the imaging distance L is 1.5 m.
In the present embodiment, the front display 4 is used as the means for displaying an image superimposed on the surrounding environment in front of the vehicle, but other means may be used. For example, a window shield display (WSD) that displays video on the windshield 5 may be used. With a WSD, video may be projected from a projector onto the windshield 5 used as a screen, or the windshield 5 may be a transmissive liquid crystal display. An image displayed on the windshield 5 by the WSD is, like an image from the front display 4, an image superimposed on the surrounding environment in front of the vehicle.
A front camera 11 is installed above the front bumper of the vehicle, on the back of the rear-view mirror, or the like. The front camera 11 is an imaging device having a camera using a solid-state image sensor such as a CCD, and is installed with its optical axis directed forward in the traveling direction of the vehicle. By performing image processing on the images captured by the front camera 11, the state of the forward environment viewed by the occupant 6 through the windshield (that is, the environment on which the virtual image 10 is superimposed) and the like are detected. A sensor such as a millimeter-wave radar may be used instead of the front camera 11.
An in-vehicle camera 12 is installed on the upper surface of the instrument panel of the vehicle. The in-vehicle camera 12 is an imaging device having a camera using a solid-state image sensor such as a CCD, and is installed with its optical axis directed toward the driver's seat. The range in which an occupant's face is generally expected to be located inside the vehicle is set as the detection range (the imaging range of the in-vehicle camera 12), and the face of the occupant 6 sitting in the driver's seat is imaged. By performing image processing on the images captured by the in-vehicle camera 12, the positions of the left and right eyes of the occupant 6 (the line-of-sight start points) and the line-of-sight direction are detected.
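The embodiment only states that the eye positions and line-of-sight direction are obtained by image processing on the in-vehicle camera images; the following is a hedged sketch of one conventional way to obtain left and right eye positions, using OpenCV Haar-cascade eye detection. The camera index, the cascade file, and the left/right assignment by image x-coordinate are assumptions, not details from the embodiment.

```python
# Hedged sketch: detecting the occupant's left and right eye positions in a
# camera frame with OpenCV Haar cascades. This particular method, the camera
# index 0, and the left/right assignment by x-coordinate are assumptions.
import cv2

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_positions(frame):
    """Return ((left_x, left_y), (right_x, right_y)) in image coordinates,
    or None if two eyes are not found (e.g. the occupant has moved outside
    the detection range of the in-vehicle camera)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    # Take the first two detections and use their centres, ordered by x.
    centres = sorted(((x + w // 2, y + h // 2) for x, y, w, h in eyes[:2]))
    # In a driver-facing camera image the occupant's right eye appears on the
    # left side of the frame, so the smaller x is treated as the right eye.
    right_eye, left_eye = centres[0], centres[1]
    return left_eye, right_eye

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)      # in-vehicle camera (assumed device index)
    ok, frame = cap.read()
    if ok:
        print(detect_eye_positions(frame))
    cap.release()
```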
Next, the schematic configuration of the navigation device 3 constituting the superimposed image display device 1 will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the navigation device 3 according to the present embodiment.
As shown in FIG. 4, the navigation device 3 according to the present embodiment has: a current position detection unit 13 that detects the current position of the vehicle 2 in which the navigation device 3 is mounted; a data recording unit 14 in which various data are recorded; a navigation ECU 15 that performs various kinds of arithmetic processing based on input information; an operation unit 16 that receives operations from the user; a liquid crystal display 17 that displays to the user a map of the vehicle's surroundings and facility information; a speaker 18 that outputs voice guidance for route guidance; a DVD drive 19 that reads a DVD serving as a storage medium; and a communication module 20 that communicates with an information center such as a VICS (registered trademark: Vehicle Information and Communication System) center. The navigation device 3 is also connected, via an in-vehicle network such as CAN, to the front display 4, the front camera 11, the in-vehicle camera 12, and the like described above.
Each component of the navigation device 3 will be described in order below.
 The current position detection unit 13 includes a GPS 21, a vehicle speed sensor 22, a steering sensor 23, a gyro sensor 24, and the like, and can detect the current position, heading, and traveling speed of the vehicle, the current time, and so on. In particular, the vehicle speed sensor 22 is a sensor for detecting the travel distance and speed of the vehicle; it generates pulses in accordance with the rotation of the drive wheels of the vehicle and outputs a pulse signal to the navigation ECU 15. The navigation ECU 15 then counts the generated pulses to calculate the rotational speed of the drive wheels and the travel distance. The navigation device 3 does not need to include all four types of sensors; it may include only one or more of them.
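As an illustration of the pulse counting described above, the following minimal sketch derives travel distance and speed from a pulse count; the pulses-per-revolution value and wheel circumference are assumptions for the example, not values from the embodiment.

```python
# Minimal sketch: deriving travel distance and speed from wheel-speed-sensor
# pulses, as the navigation ECU does by counting pulses. The pulses-per-
# revolution and wheel circumference below are illustrative assumptions.
PULSES_PER_REVOLUTION = 4        # pulses emitted per drive-wheel revolution (assumed)
WHEEL_CIRCUMFERENCE_M = 1.9      # metres travelled per revolution (assumed)

def distance_from_pulses(pulse_count: int) -> float:
    """Travel distance in metres for a given number of counted pulses."""
    revolutions = pulse_count / PULSES_PER_REVOLUTION
    return revolutions * WHEEL_CIRCUMFERENCE_M

def speed_from_pulses(pulse_count: int, interval_s: float) -> float:
    """Average speed in km/h over the interval in which the pulses were counted."""
    return distance_from_pulses(pulse_count) / interval_s * 3.6

if __name__ == "__main__":
    # Example: 42 pulses counted over 1 second.
    print(f"{distance_from_pulses(42):.1f} m, {speed_from_pulses(42, 1.0):.1f} km/h")
```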
The data recording unit 14 includes a hard disk (not shown) as an external storage device and recording medium, and a recording head (not shown), which is a driver for reading the map information DB 31, predetermined programs, and the like recorded on the hard disk and for writing predetermined data to the hard disk. Instead of a hard disk, the data recording unit 14 may have a flash memory, a memory card, or an optical disc such as a CD or DVD. The map information DB 31 may also be stored on an external server and acquired by the navigation device 3 through communication.
Here, the map information DB 31 is storage means in which are stored, for example, link data 32 regarding roads (links), node data 33 regarding node points, point data 34 regarding points such as facilities, intersection data regarding each intersection, map display data for displaying a map, search data for route searches, retrieval data for point searches, and the like.
As the link data 32, for example, a link ID identifying the link, end node information identifying the nodes located at the ends of the link, the road type of the road formed by the link, and the like are stored. As the node data 33, a node ID identifying the node, the position coordinates of the node, connection destination node information identifying the nodes to which the node is connected via links, and the like are stored. As the point data 34, various kinds of information about facilities that can be set as destinations are stored, for example an ID identifying the facility, the facility name, position coordinates, genre, address, and so on.
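To make the structure of these records concrete, the following minimal sketch expresses the link, node, and point data as data classes; the field names and types are assumptions chosen to mirror the description above, not a definition from the patent.

```python
# Minimal sketch of the map information DB records described above (link data,
# node data, point data). Field names and types are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class LinkData:
    link_id: int
    end_node_ids: tuple[int, int]   # nodes located at the ends of the link
    road_type: str                  # road type of the road the link forms

@dataclass
class NodeData:
    node_id: int
    position: tuple[float, float]   # position coordinates of the node
    connected_node_ids: list[int] = field(default_factory=list)

@dataclass
class PointData:
    facility_id: int
    name: str
    position: tuple[float, float]
    genre: str
    address: str

if __name__ == "__main__":
    link = LinkData(link_id=1, end_node_ids=(10, 11), road_type="national road")
    node = NodeData(node_id=10, position=(35.0, 137.0), connected_node_ids=[11])
    poi = PointData(101, "Example Facility", (35.0, 137.0), "restaurant", "1-1 Example St.")
    print(link, node, poi, sep="\n")
```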
Meanwhile, the navigation ECU (electronic control unit) 15 is an electronic control unit that controls the entire navigation device 3, and includes a CPU 41 serving as an arithmetic device and control device, as well as internal storage devices such as a RAM 42 that is used as working memory when the CPU 41 performs various kinds of arithmetic processing and that stores route data and the like when a route has been searched, a ROM 43 in which a control program and the driving support processing program (FIG. 5) described later are recorded, and a flash memory 44 that stores programs read from the ROM 43. The navigation ECU 15 has various means as processing algorithms. For example, the position detecting means detects the positions of both eyes of the user. The display position determining means determines the display position of an image on the front display 4 based on the positions of both eyes of the user detected by the position detecting means.
The operation unit 16 is operated when entering a departure point as the travel start point, a destination as the travel end point, and so on, and has a plurality of operation switches (not shown) such as various keys and buttons. The navigation ECU 15 performs control to execute the corresponding operations based on the switch signals output when the switches are pressed. The operation unit 16 may also have a touch panel provided on the front surface of the liquid crystal display 17, or a microphone and a voice recognition device.
On the liquid crystal display 17, a map image including roads, traffic information, operation guidance, operation menus, key guidance, the guide route from the departure point to the destination, guidance information along the guide route, news, weather forecasts, the time, e-mail, television programs, and the like are displayed. In the present embodiment, since the front display 4 is provided as the information display means, the liquid crystal display 17 may be omitted if the map image and the like are displayed on the front display 4.
The speaker 18 outputs voice guidance for traveling along the guide route and guidance on traffic information based on instructions from the navigation ECU 15.
The DVD drive 19 is a drive capable of reading data recorded on a recording medium such as a DVD or CD. Based on the read data, music and video are played back, the map information DB 31 is updated, and so on. A card slot for reading and writing memory cards may be provided instead of the DVD drive 19.
The communication module 20 is a communication device for receiving traffic information consisting of congestion information, regulation information, traffic accident information, and the like transmitted from a traffic information center such as a VICS center or a probe center; it corresponds to, for example, a mobile phone or a DCM.
Next, the driving support processing program executed in the navigation device 3 of the superimposed image display device 1 having the above configuration will be described with reference to FIG. 5. FIG. 5 is a flowchart of the driving support processing program according to the present embodiment. The driving support processing program is executed after the ACC power supply (accessory power supply) of the vehicle is turned on, and provides various kinds of information to the vehicle occupant by causing the image displayed on the front display 4 to be viewed superimposed on the surrounding environment around the vehicle. The programs shown in the flowcharts of FIG. 5 and FIG. 8 below are stored in the RAM 42 and the ROM 43 of the navigation device 3 and are executed by the CPU 41.
In the following description, an example is explained in which the front display 4 is used to provide a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3 and information about the host vehicle (for example, the current vehicle speed, shift position, and remaining energy). However, guidance and information other than the above can also be provided, for example: warnings for the occupant 6 of the vehicle 2 about objects to be warned about (for example, other vehicles, pedestrians, guide signs); warnings displayed on the road surface (rear-end collision warnings, speed limits, etc.); the lane markings of the lane in which the vehicle travels; advertisement images; facility information; map images; traffic information; news; weather forecasts; the time; the screen of a connected smartphone; television programs; and so on.
First, in the driving support processing program, in step (hereinafter abbreviated as S) 1, the CPU 41 acquires the current position information of the host vehicle, map information around the host vehicle, and the guide route set in the navigation device 3. The current position of the host vehicle is detected using, for example, the GPS 21 and the vehicle speed sensor 22. When the host vehicle travels on a road with multiple lanes, it is desirable to also identify the lane in which the host vehicle is traveling. The map information around the host vehicle is acquired from the map information DB 31 or from an external server, and includes information identifying road shapes and lane divisions.
Next, in S2, the CPU 41 performs image processing such as binarization on the image captured by the front camera 11, thereby detecting the road surface of the road on which the host vehicle is traveling, on which the virtual image of the guide arrow is to be superimposed.
Subsequently, in S3, the CPU 41 specifies, in three-dimensional position coordinates, the specific position of the road surface on which the virtual image of the guide arrow is to be superimposed (hereinafter referred to as the superimposition target road surface). For example, the road surface of the road from the current position of the vehicle to 100 m ahead along the guide route (including the guidance intersection if there is one) is set as the superimposition target road surface. The position of the superimposition target road surface may be specified based on the image captured by the front camera 11, or may be specified using a sensor such as a millimeter-wave radar. Further, when the navigation device 3 can acquire map information including three-dimensional information, the position of the superimposition target road surface can also be specified based on the map information and the current position of the vehicle.
The position of the superimposition target road surface continues to be specified thereafter until the driving support processing program ends. For example, if the position of the superimposition target road surface is specified for each frame of the images captured by the front camera 11, the current position of the superimposition target road surface can be specified continuously.
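As one way to picture the specification of the superimposition target road surface in S3, the following sketch selects the portion of the guide route within 100 m ahead of the current position; the flat-road assumption (z = 0), the point spacing, and all names are assumptions for illustration only.

```python
# Illustrative sketch of S3: picking the stretch of guide route up to 100 m
# ahead of the vehicle as the superimposition target road surface, expressed
# as 3-D coordinates. A flat road (z = 0) and a polyline guide route are
# assumed; distances are measured along the polyline.
import math

def superimposition_target(route_xy, current_xy, ahead_m=100.0):
    """Return 3-D points (x, y, 0.0) of the guide route within ahead_m metres
    of the current position, measured along the route polyline."""
    target, travelled = [], 0.0
    prev = current_xy
    for point in route_xy:
        step = math.dist(prev, point)
        if travelled + step > ahead_m:
            break
        travelled += step
        target.append((point[0], point[1], 0.0))
        prev = point
    return target

if __name__ == "__main__":
    route = [(0.0, 10.0 * i) for i in range(1, 20)]   # route points every 10 m ahead
    print(superimposition_target(route, current_xy=(0.0, 0.0)))  # first ~100 m
```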
Next, in S4, the CPU 41 determines whether or not to provide information about the host vehicle via the front display 4. Information about the host vehicle includes, for example, the current vehicle speed, the shift position, and the remaining energy. Provision of host vehicle information can be switched on or off by an operation of the vehicle occupant, or may be switched on or off automatically based on the surrounding conditions or the state of the vehicle.
If it is determined that information about the host vehicle is to be provided via the front display 4 (S4: YES), the process proceeds to S5. If, on the other hand, it is determined that information about the host vehicle is not to be provided via the front display 4 (S4: NO), the process proceeds to S9.
Thereafter, in S5, the CPU 41 performs the display range determination processing (FIG. 8) described later. In the display range determination processing, the range in which the guide image is projected onto the windshield 5 (that is, the range in which the guide image is displayed on the front display 4) is determined. From S5 onward, two types of images are displayed: an image of a guide arrow indicating the traveling direction of the vehicle along the guide route set by the navigation device 3 (hereinafter referred to as the first guide image) and an image providing information about the host vehicle (hereinafter referred to as the second guide image). Therefore, in S5, the display ranges of both the first guide image and the second guide image are determined.
Here, the first guide image is an image that needs to be kept superimposed on a specific object (specifically, the superimposition target road surface) included in the surrounding environment viewed from the vehicle. In addition, in order to prevent the image from being seen double due to binocular parallax, the first guide image is displayed on the front display 4 in only one selected image, either the right-eye image or the left-eye image (that is, the first guide image is viewed by only one of the right eye and the left eye). Therefore, in S5, the display range of the first guide image in the selected one of the right-eye image and the left-eye image is determined, as described later.
On the other hand, the second guide image is an image that does not need to be kept superimposed on a specific object included in the surrounding environment viewed from the vehicle. Since the accuracy of the superimposition position matters less for the second guide image than for the first guide image, the second guide image is displayed on the front display 4 in both the right-eye image and the left-eye image so that it can be viewed more naturally (that is, the second guide image is viewed by both the right eye and the left eye). Therefore, in S5, the display ranges of the second guide image in both the right-eye image and the left-eye image are determined, as described later.
Subsequently, in S6, the CPU 41 transmits a control signal to the front display 4 and causes the front display 4 to display the first guide image in the display range of the right-eye image or the left-eye image determined in S5. The second guide image is displayed in the display ranges of both the right-eye image and the left-eye image determined in S5. As a result, the virtual images of the first guide image and the second guide image are viewed by the vehicle occupant superimposed on the surrounding environment in front of the vehicle. In particular, the virtual image of the first guide image is viewed by the vehicle occupant overlapping the superimposition target road surface.
Here, FIG. 6 shows an example of the contents of the left-eye image and the right-eye image displayed on the front display 4 in S6, and of the virtual images of the first guide image and the second guide image viewed by the vehicle occupant as a result of viewing the left-eye image and the right-eye image.
In the example shown in FIG. 6, the first guide image 55 is displayed in the left-eye image 56, and the second guide image 57 is displayed in both the left-eye image 56 and the right-eye image 58. In the left-eye image 56 and the right-eye image 58, the display areas in which neither the first guide image 55 nor the second guide image 57 is displayed are black images or zero-luminance images. Therefore, the area of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is a black image or a zero-luminance image, and even when the front display 4 is viewed with both eyes, the first guide image can be viewed with only one eye. As a result of the vehicle occupant 6 viewing the left-eye image 56 with the left eye and the right-eye image 58 with the right eye, the virtual image of the first guide image 55 superimposed on the surrounding environment is viewed with the left eye through the windshield 5 of the vehicle, and the virtual image of the second guide image 57 is viewed with both eyes.
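As a concrete illustration of how such a pair of eye images could be composed, with the first guide image drawn only into the selected eye's buffer, the second guide image into both, and all remaining pixels kept black (zero luminance), here is a minimal NumPy sketch; the resolution, rectangle coordinates, and grey levels are assumptions, not values from the embodiment.

```python
# Minimal sketch of composing the left-eye and right-eye frames as in S6:
# the first guide image is drawn only into the selected eye's buffer, the
# second guide image into both, and all remaining pixels stay black (zero
# luminance), so the other eye sees nothing where the first guide image is.
# Resolution, rectangle coordinates and grey levels are illustrative assumptions.
import numpy as np

HEIGHT, WIDTH = 480, 800

def compose_eye_frames(selected_eye: str,
                       first_rect: tuple[int, int, int, int],
                       second_rect: tuple[int, int, int, int]):
    """Return (left_frame, right_frame) as uint8 greyscale buffers.
    Rectangles are (x, y, w, h) display ranges determined in S5."""
    left = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)    # black / zero luminance
    right = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

    def draw(buf, rect, value):
        x, y, w, h = rect
        buf[y:y + h, x:x + w] = value

    # First guide image: only the selected eye's buffer.
    target = left if selected_eye == "left" else right
    draw(target, first_rect, 255)
    # Second guide image: both buffers.
    draw(left, second_rect, 200)
    draw(right, second_rect, 200)
    return left, right

if __name__ == "__main__":
    l, r = compose_eye_frames("left", first_rect=(300, 200, 120, 80),
                              second_rect=(350, 420, 100, 40))
    # The right-eye frame stays black where the first guide image was drawn.
    print(l[240, 360], r[240, 360])   # 255 0
```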
Here, the virtual image of the first guide image 55 is viewed superimposed, within the surrounding environment, in particular on the superimposition target road surface 59 in front of the vehicle. FIG. 6 is an example of the first guide image 55 displayed when the guide route turns right at the guidance intersection ahead, and the first guide image 55 is an image of an arrow indicating a right turn at the guidance intersection. As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at that intersection without moving his or her line of sight from the road ahead. In addition, since the virtual image of the first guide image 55 is viewed with only the left eye of the vehicle occupant, the phenomenon in which the image is seen double due to binocular parallax can be prevented. The first guide image 55 is basically displayed while a guide route is set (a straight-ahead arrow is displayed when traveling straight), but it may instead be displayed only when approaching a guidance intersection.
 On the other hand, the virtual image of the second guide image 57 is seen superimposed on the surrounding environment near the center of the lower edge of the windshield 5. Unlike the first guide image, the second guide image 57 is not superimposed on a specific object but is seen fixed at the same position on the windshield 5. In the example shown in FIG. 6, the second guide image 57 is an image showing the current vehicle speed. As a result, the occupant can grasp the current vehicle speed without moving the line of sight away from the road ahead. In addition, since the virtual image of the second guide image 57 is seen with both eyes, it causes no discomfort compared with viewing it with one eye. The second guide image 57 is basically displayed at all times while information provision using the front display 4 is turned on.
 Since the first guide image 55 must remain superimposed on the superimposition target road surface 59 as seen by the occupant, whenever the vehicle moves (that is, whenever the position and size of the surrounding environment seen from the vehicle change), the display range of the first guide image 55 is newly determined in S5 and updated. That is, when the surrounding environment seen from the vehicle changes, the display position and display size of the first guide image 55 change accordingly. In other words, the first guide image 55 is an image associated with a position at which it is superimposed in the surrounding environment.
 On the other hand, the second guide image 57 does not need to be superimposed on a specific object, so it is basically displayed at a fixed position on the windshield 5 (for example, near the center of the lower edge). That is, the display position of the second guide image 57 is not linked to changes in the surrounding environment seen from the vehicle, and the display range initially determined in S5 basically does not change. In other words, the second guide image 57 is an image that is not associated with a superimposition position in the surrounding environment. An illustrative data model contrasting the two image types is sketched below.
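 The contrast between the environment-anchored first guide image and the screen-fixed second guide image could be modelled, for illustration, as in the following sketch; the class and field names are assumptions for this example only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GuideImage:
    """Illustrative model of the two guide-image types.

    world_anchor: 3-D point in the surrounding environment that the image must
                  stay locked to (first guide image 55); None for images fixed
                  on the windshield (second guide image 57).
    screen_pos:   fixed display position used when world_anchor is None.
    """
    sprite_id: str
    world_anchor: Optional[Tuple[float, float, float]] = None
    screen_pos: Optional[Tuple[int, int]] = None

    @property
    def needs_reprojection(self) -> bool:
        # Only environment-anchored images have their display range
        # re-determined (S5) as the vehicle moves.
        return self.world_anchor is not None


# Example: right-turn arrow locked to the road surface vs. a speed readout
# fixed near the centre of the lower edge of the windshield.
arrow = GuideImage("turn_right_arrow", world_anchor=(1.5, 0.0, 35.0))
speed = GuideImage("speed_readout", screen_pos=(640, 680))
```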
 Next, in S7, the CPU 41 determines whether the eye of the occupant viewing the first guide image should be switched to the other eye. For example, when the left eye is being used, the CPU 41 determines that the viewing eye should be switched to the right eye if the occupant changes posture and the left eye moves out of the detection area in which the in-vehicle camera 12 can detect the positions of the eyes of the occupant 6, if the left eye comes within a predetermined distance (for example, 5 cm) of the edge of the detection area, or if the fatigue level of the left eye exceeds a threshold. Likewise, when the right eye is being used, the CPU 41 determines that the viewing eye should be switched to the left eye if the right eye moves out of the detection area, comes within the predetermined distance (for example, 5 cm) of its edge, or if the fatigue level of the right eye exceeds the threshold. The degree of eye fatigue is estimated from the time for which the first guide image has been viewed; for example, the fatigue level of the left eye is judged to exceed the threshold when the left eye has been viewing the first guide image for a predetermined time (for example, 10 minutes) or longer.
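 A minimal sketch of the switching decision of S7 is given below, assuming the example thresholds stated above (a 5 cm margin from the detection-area edge and 10 minutes of viewing time as a fatigue proxy); the function name and parameters are illustrative, not part of the embodiment.

```python
def should_switch_eye(eye_in_detection_area: bool,
                      distance_to_area_edge_cm: float,
                      viewing_time_sec: float,
                      edge_margin_cm: float = 5.0,
                      fatigue_limit_sec: float = 600.0) -> bool:
    """Decide whether the eye viewing the first guide image should be switched
    to the other eye (S7)."""
    if not eye_in_detection_area:                     # eye left the detection area
        return True
    if distance_to_area_edge_cm <= edge_margin_cm:    # too close to the area edge
        return True
    if viewing_time_sec >= fatigue_limit_sec:         # estimated fatigue over threshold
        return True
    return False
```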
 The eye viewing the first guide image is switched to the other eye at a moment when the first guide image is hidden. The first guide image is basically displayed while a guidance route is set, but it is shown in a display mode that alternates between display and non-display at regular intervals (for example, every 1 sec). Switching the viewing eye at a moment when the first guide image is hidden makes it possible to change the viewing eye without causing the occupant any discomfort.
 Alternatively, the eye viewing the first guide image may be switched at a moment when the display mode of the first guide image changes. For example, when the first guide image is a guide arrow indicating the traveling direction of the vehicle along the guidance route as shown in FIG. 6, the spacing of the guide arrows can be changed between guidance for a right or left turn and guidance for traveling straight ahead; specifically, it is desirable to make the arrow spacing wider when guiding the vehicle straight ahead than when guiding a right or left turn. Switching the viewing eye at the moment the arrow spacing changes, that is, at the moment the display mode of the first guide image changes, also makes it possible to change the viewing eye without causing discomfort. Besides a change in arrow spacing, a change in arrow shape or arrow display color may also be treated as a change in display mode at which the switch is performed.
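 The timing gate described in the last two paragraphs might be sketched as follows, assuming a 1 sec blink period; the even/odd phase convention is an assumption of this example, not part of the embodiment.

```python
def can_switch_now(time_since_start_sec: float,
                   blink_period_sec: float = 1.0,
                   display_mode_changed: bool = False) -> bool:
    """Allow the eye switch only while the first guide image is hidden
    (the image alternates between shown and hidden every blink_period_sec)
    or exactly when its display mode changes."""
    phase = int(time_since_start_sec // blink_period_sec)
    currently_hidden = (phase % 2 == 1)   # odd intervals assumed to be hidden
    return currently_hidden or display_mode_changed


# Example: combined with should_switch_eye() above, the switch is executed
# only when both the trigger condition and the timing gate are satisfied.
```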
 When it is determined that the eye viewing the first guide image should be switched to the other eye (S7: YES), the viewing eye is switched (S8). For example, if the left eye had been selected, the right eye is selected from then on as the eye that views the first guide image. Thereafter, the display range of the first guide image 55 is determined in S5 based on the position of the newly selected eye, and the display range is updated. If the first guide image 55 had been displayed in the left-eye image 56, it is subsequently displayed in the right-eye image 58; conversely, if it had been displayed in the right-eye image 58, it is subsequently displayed in the left-eye image 56. Before and after the switch, the occupant sees the first guide image 55 at the same position.
 If it is determined that the viewing eye should not be switched to the other eye (S7: NO), the current display mode is continued.
 On the other hand, if it is determined in the judgment processing of S4 that information about the host vehicle is not to be provided via the front display 4 (S4: NO), the display range determination process described later (FIG. 8) is performed in S9, as in S5. In S9, however, only the first guide image is a display target, and only the display range of the first guide image is determined.
 Then, in S10, the CPU 41 transmits a control signal to the front display 4 and causes the front display 4 to display the first guide image in the display range, determined in S9, of either the right-eye image or the left-eye image. As a result, the virtual image of the first guide image is visually recognized by the vehicle occupant superimposed on the surrounding environment ahead of the vehicle. In particular, the virtual image of the first guide image is seen overlapping the superimposition target road surface when viewed by the occupant.
 Here, FIG. 7 shows the contents of the left-eye image and the right-eye image displayed on the front display 4 in S10, and an example of the virtual image of the first guide image visually recognized by the vehicle occupant as a result of viewing the left-eye image and the right-eye image.
 In the example shown in FIG. 7, the first guide image 55 is displayed in the left-eye image 56. The display area of the left-eye image 56 in which the first guide image 55 is not displayed, and the entire right-eye image 58, are rendered as a black image or an image with zero luminance. Accordingly, the area of the right-eye image 58 corresponding to the first guide image 55 displayed in the left-eye image 56 is a black image or a zero-luminance image, so that even when the occupant views the front display 4 with both eyes, the first guide image is visible to only one eye. When the vehicle occupant 6 views the left-eye image 56 with the left eye and the right-eye image 58 with the right eye, the virtual image of the first guide image 55 superimposed on the surrounding environment is seen through the windshield 5 with the left eye.
 Here, the virtual image of the first guide image 55 is seen superimposed on the surrounding environment, specifically on the superimposition target road surface 59 ahead of the vehicle. FIG. 7 shows an example of the first guide image 55 displayed when the guidance route turns right at the guidance intersection ahead; the first guide image 55 is an arrow image indicating a right turn at the guidance intersection. As a result, the occupant can grasp the position of the guidance intersection and the traveling direction of the vehicle at that intersection without moving the line of sight away from the road ahead. Since the virtual image of the first guide image 55 is seen only with the occupant's left eye, the image is prevented from being seen double due to binocular parallax. The first guide image 55 is basically displayed while a guidance route is set (a straight-ahead arrow is displayed when traveling straight), but it may instead be displayed only when the vehicle approaches the guidance intersection. Thereafter, the process proceeds to S7.
 Next, the display range determination process executed as a sub-process in S5 and S9 is described with reference to FIG. 8. FIG. 8 is a flowchart of the sub-process program of the display range determination process.
 First, in S11, the CPU 41 detects the positions of the left and right eyes of the vehicle occupant (sight line start points) based on the image captured by the in-vehicle camera 12. The detected eye positions are specified as three-dimensional position coordinates.
 Next, in S12, the CPU 41 selects either the right eye or the left eye as the eye of the occupant that will view the first guide image. The selection is made, for example, according to the following criteria (1) to (4); criterion (1) has the highest priority, and any one of criteria (2) to (4) may be adopted (one possible implementation is sketched after the note below).
 (1) If only one eye is included in the detection area in which the in-vehicle camera 12 can detect the positions of the eyes of the occupant 6, that eye is selected.
 (2) If both eyes are included in the detection area and the occupant's dominant eye can be determined, the dominant eye is selected.
 (3) If both eyes are included in the detection area, the fatigue levels of the left and right eyes are detected from the image captured by the in-vehicle camera 12, and the less fatigued eye is selected.
 (4) If both eyes are included in the detection area, the eye closer to the center of the detection area is selected.
 The eye selected in S12 to view the first guide image is not necessarily fixed from then on; it may be switched to the other eye in the subsequent processing of S8. However, the viewing eye may instead be kept fixed.
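 One possible reading of selection criteria (1) to (4) is sketched below; in practice only one of criteria (2) to (4) would be adopted, and the function signature, fallback, and parameter names are assumptions of this example.

```python
from typing import Optional


def select_eye(left_in_area: bool, right_in_area: bool,
               dominant_eye: Optional[str] = None,
               left_fatigue: Optional[float] = None,
               right_fatigue: Optional[float] = None,
               left_dist_to_center: Optional[float] = None,
               right_dist_to_center: Optional[float] = None) -> Optional[str]:
    """Pick the eye that will view the first guide image (S12).
    Returns 'left', 'right', or None when neither eye is detected."""
    # (1) Only one eye inside the camera's detection area.
    if left_in_area != right_in_area:
        return "left" if left_in_area else "right"
    if not (left_in_area and right_in_area):
        return None
    # (2) Dominant eye, if it can be determined.
    if dominant_eye in ("left", "right"):
        return dominant_eye
    # (3) Less fatigued eye, estimated from the camera image.
    if left_fatigue is not None and right_fatigue is not None:
        return "left" if left_fatigue <= right_fatigue else "right"
    # (4) Eye closer to the centre of the detection area.
    if left_dist_to_center is not None and right_dist_to_center is not None:
        return "left" if left_dist_to_center <= right_dist_to_center else "right"
    return "left"  # arbitrary fallback, not specified in the text
```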
 Subsequently, in S13, the CPU 41 determines whether the display of the front display 4 is ON. The display of the front display 4 can be switched ON or OFF by an operation of the vehicle occupant, or it may be switched ON or OFF automatically based on the surrounding conditions or the state of the vehicle.
 If it is determined that the display of the front display 4 is ON (S13: YES), the process proceeds to S14. On the other hand, if it is determined that the display of the front display 4 is OFF (S13: NO), the process ends without displaying the virtual images of the first guide image and the second guide image on the front display 4.
 In S14, the CPU 41 acquires the position coordinates of the windshield 5 onto which images are projected by the front display 4. The position of the windshield 5 is specified as three-dimensional position coordinates.
 Next, in S15, the CPU 41 calculates the position coordinates at which the first guide image and the second guide image are to be displayed on the windshield 5, based on the position coordinates of the superimposition target road surface detected in S3, the position coordinates of the occupant's eyes (sight line start points) detected in S11, and the position coordinates of the windshield 5 acquired in S14. Specifically, the display position of the first guide image is a position at which its virtual image is seen by the vehicle occupant overlapping the superimposition target road surface, as shown in FIGS. 8 and 9, while the display position of the second guide image is a fixed position on the windshield 5 (for example, near the center of the lower edge), as shown in FIG. 8.
 Subsequently, in S16, the CPU 41 determines the projection ranges of the first guide image and the second guide image on the windshield 5 based on the position coordinates calculated in S15, and further determines the display ranges of the first guide image and the second guide image on the front display 4 from the determined projection ranges. The front display 4 of the present embodiment can display a right-eye image visible only to the right eye of the occupant 6 and a left-eye image visible only to the left eye of the occupant 6. For the first guide image, the display range is specified in whichever of the left-eye image and the right-eye image corresponds to the eye selected in S12 (the right-eye image if the right eye is to view it). For the second guide image, the display ranges are specified in both the left-eye image and the right-eye image.
 For example, FIG. 9 illustrates the details of the process of determining the display range of the first guide image on the front display 4. As shown in FIG. 9, when the position of the eye selected in S12 among the occupant's left and right eyes is taken as the sight line start point S, the projection range of the first guide image on the windshield 5 is determined by connecting the sight line start point S and the superimposition target road surface 59 with a straight line. Furthermore, the display range of the first guide image on the front display 4 is determined based on the viewing angle from the sight line start point S to the projection range and the inclination angle of the windshield 5.
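 For illustration, the intersection of the sight line from the sight line start point S with the windshield could be computed as below, under the simplifying assumption that the windshield is modelled as a flat plane; the coordinate convention and the numbers in the usage example are assumptions of this sketch.

```python
import numpy as np


def project_onto_windshield(eye_pos, road_point, glass_point, glass_normal):
    """Find where the sight line from the selected eye (sight line start point S)
    to a point on the superimposition target road surface 59 crosses the
    windshield, modelled here as a plane. All inputs are 3-D coordinates in the
    vehicle frame. Returns one point of the projection range."""
    eye = np.asarray(eye_pos, dtype=float)
    road = np.asarray(road_point, dtype=float)
    p0 = np.asarray(glass_point, dtype=float)     # any point on the windshield plane
    n = np.asarray(glass_normal, dtype=float)
    d = road - eye                                # direction of the sight line
    denom = n.dot(d)
    if abs(denom) < 1e-9:
        raise ValueError("sight line is parallel to the windshield plane")
    t = n.dot(p0 - eye) / denom
    return eye + t * d


# Example (y up, z forward): eye 1.2 m above the floor, road point 20 m ahead
# on the ground, windshield plane through (0, 1.0, 1.5) tilted back ~30 degrees.
hit = project_onto_windshield(eye_pos=(0.0, 1.2, 0.0),
                              road_point=(0.5, 0.0, 20.0),
                              glass_point=(0.0, 1.0, 1.5),
                              glass_normal=(0.0, 0.5, -0.866))
print(hit)
```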
 When the first guide image is viewed with one eye, the sight line start point S is a single point, so the processing for determining the display range of the first guide image on the front display 4 can be simplified compared with the case where the image is viewed with both eyes.
 Thereafter, the process proceeds to S6 or S10, and the virtual images of the first guide image and the second guide image are displayed based on the determined projection ranges and display ranges. The processing of S1 to S8 is repeated until the display of the front display 4 is turned off.
 As described above in detail, according to the superimposed image display device and the computer program executed by it in the present embodiment, the front display 4 can display a right-eye image visible only to the user's right eye and a left-eye image visible only to the user's left eye; the positions of both eyes of the vehicle occupant are detected (S11); the display position of the guide image on the front display 4 is determined based on the detected positions of both eyes (S16); and, in particular, the first guide image is displayed as either the right-eye image or the left-eye image, so that the first guide image displayed on the front display 4 is seen superimposed on the surrounding environment around the vehicle (S6, S10). Because only one of the occupant's eyes views the first guide image, the first guide image is prevented from being seen double due to binocular parallax. Moreover, since the positions of both eyes can be detected, the first guide image can be superimposed at an appropriate position regardless of the occupant's current eye positions or which eye is used for viewing.
 It should be noted that the present invention is not limited to the above embodiment, and various improvements and modifications are of course possible without departing from the gist of the present invention.
 For example, in the above embodiment, the front display 4 generates a virtual image ahead of the windshield 5 of the vehicle 2, but the virtual image may instead be generated ahead of a window other than the windshield 5. The surface on which the image is reflected by the front display 4 may also be a visor (combiner) installed around the windshield 5 rather than the windshield 5 itself.
 In the present embodiment, the front display 4 is used as the means for displaying an image superimposed on the surrounding environment, but a windshield display (WSD) that displays an image directly on the windshield 5 may be used instead.
 In the present embodiment, the first guide image is displayed as either the right-eye image or the left-eye image, and the second guide image is displayed as both the right-eye image and the left-eye image; however, the second guide image may also be displayed as only the right-eye image or the left-eye image. In that case, the first guide image and the second guide image may be viewed with the same eye or with different eyes (for example, the first guide image with the left eye and the second guide image with the right eye).
 In the present embodiment, the first guide image is an image of a guide arrow indicating the traveling direction of the vehicle along the guidance route set by the navigation device 3, but other images may be used, for example a warning image for an object about which the occupant should be warned (such as another vehicle, a pedestrian, or a guide sign), or an image of the lane markings of the lane in which the vehicle travels.
 In the present embodiment, the first guide image is an image that must remain superimposed on a specific object (for example, the road surface) included in the surrounding environment seen from the vehicle, but it may instead be an image for which maintaining superimposition on a specific object is unnecessary (for example, a smartphone screen or a map image).
 In the present embodiment, the processing of the driving support processing program (FIG. 5) is executed by the navigation ECU 15 of the navigation device 3, but the executing entity can be changed as appropriate; for example, the control unit of the front display 4, a vehicle control ECU, or another on-board device may execute it. When the control unit of the front display 4 executes it, the superimposed image display device according to the present invention can be configured by the front display 4 alone.
 Although embodiments of the superimposed image display device according to the present invention have been described above, the superimposed image display device may also have the following configurations, in which case the following effects are obtained.
 For example, the first configuration is as follows.
 A superimposed image display device (1) mounted on a vehicle (2), which causes an image display surface (4) capable of displaying a right-eye image (58) visible only to the right eye of a user (6) and a left-eye image (56) visible only to the user's left eye to be viewed with both of the user's eyes so that an image displayed on the image display surface is seen superimposed on the surrounding environment around the vehicle, the device comprising: position detection means (41) for detecting the positions of both of the user's eyes; and display position determination means (41) for determining the display position of an image on the image display surface based on the positions of both eyes detected by the position detection means, wherein, when a guide image (55) to be presented to the user is displayed on the image display surface, it is displayed as either the right-eye image or the left-eye image.
 According to the superimposed image display device having the above configuration, when a guide image is viewed superimposed on the surrounding environment of the vehicle, letting only one of the user's eyes view the guide image prevents the guide image from being seen double due to binocular parallax. In addition, since the positions of both of the user's eyes can be detected, the guide image can be superimposed at an appropriate position regardless of the user's current eye positions or which eye is used for viewing.
 The second configuration is as follows.
 The guide image includes a first guide image (55) displayed as either the right-eye image (58) or the left-eye image (56), and a second guide image (57) displayed as both the right-eye image and the left-eye image.
 According to the superimposed image display device having the above configuration, the first guide image is viewed with only one eye, which prevents it from being seen double due to binocular parallax, while the second guide image is viewed with both eyes and therefore without discomfort.
 The third configuration is as follows.
 The first guide image (55) is an image that must remain superimposed on a specific object (59) included in the surrounding environment seen from the vehicle (2), and the second guide image (57) is an image for which maintaining superimposition on a specific object included in the surrounding environment seen from the vehicle is unnecessary.
 According to the superimposed image display device having the above configuration, the first guide image, for which accuracy of the superimposition position is required, is viewed with only one eye so that it is not seen double due to binocular parallax, while the second guide image, for which accuracy of the superimposition position is not required, is viewed with both eyes and therefore without discomfort.
 The fourth configuration is as follows.
 When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56), the area of the other image corresponding to the guide image displayed in the one image is a black image.
 According to the superimposed image display device having the above configuration, even when the user views the image display surface with both eyes, the guide image is visible to only one eye.
 The fifth configuration is as follows.
 When the guide image (55) is displayed as either the right-eye image (58) or the left-eye image (56), the area of the other image corresponding to the guide image displayed in the one image is an image with zero luminance.
 According to the superimposed image display device having the above configuration, even when the user views the image display surface with both eyes, the guide image is visible to only one eye.
 The sixth configuration is as follows.
 The device has eye selection means (41) for selecting the eye of the user (6) that is to view the guide image when the guide image (55) is displayed on the image display surface (4), and the guide image is displayed in whichever of the right-eye image (58) and the left-eye image (56) corresponds to the eye selected by the eye selection means.
 According to the superimposed image display device having the above configuration, when the guide image is viewed superimposed on the surrounding environment of the vehicle, it can be viewed with only the selected one of the user's eyes.
 The seventh configuration is as follows.
 The eye selection means (41) preferentially selects an eye included in the detection range of the position detection means (41).
 According to the superimposed image display device having the above configuration, the guide image is viewed with an eye whose current position the device can detect, so that the guide image can be superimposed at an appropriate position.
 The eighth configuration is as follows.
 The eye selection means (41) switches the eye of the user selected as the one that views the guide image, based on the positional relationship of the user's eyes with respect to the detection range of the position detection means (41).
 According to the superimposed image display device having the above configuration, even if the eye viewing the guide image is about to move out of the range in which the device can detect its current position because of a change in the user's posture, switching the viewing eye to the other eye makes it possible to continue letting only one eye view the guide image.
 The ninth configuration is as follows.
 The device has fatigue detection means (41) for detecting the degree of fatigue of each of the user's eyes, and the eye selection means (41) switches the eye of the user selected as the one that views the guide image (55) based on the degree of eye fatigue.
 According to the superimposed image display device having the above configuration, even when fatigue accumulates in the eye viewing the guide image, switching the viewing eye to the other eye makes it possible to continue letting only one eye view the guide image without placing a large burden on the user.
 The tenth configuration is as follows.
 The eye selection means (41) switches the eye of the user selected as the one that views the guide image at a moment when the guide image (55) is hidden.
 According to the superimposed image display device having the above configuration, even if the viewing eye is switched to the other eye, the guide image seen by the user does not appear unnatural.
 The eleventh configuration is as follows.
 The eye selection means (41) switches the eye of the user selected as the one that views the guide image at a moment when the display mode of the guide image (55) is changed.
 According to the superimposed image display device having the above configuration, even if the viewing eye is switched to the other eye, the guide image seen by the user does not appear unnatural.
 The twelfth configuration is as follows.
 The display position determination means (41) determines the display position of the guide image on the image display surface (4) based on the position at which the guide image (55) is to be superimposed in the surrounding environment and the position of the eye selected by the eye selection means (41).
 According to the superimposed image display device having the above configuration, the processing for determining the display position of the guide image can be simplified compared with the case where the guide image is viewed with both eyes, and the guide image can be reliably viewed with only the selected one of the user's eyes regardless of the positions of the user's eyes.
 The thirteenth configuration is as follows.
 The image displayed on the image display surface (4) is reflected by the windshield (5) of the vehicle (2) and viewed by the user (6), whereby a virtual image of the displayed image is seen superimposed on the surrounding environment around the vehicle.
 According to the superimposed image display device having the above configuration, the guide image can be viewed superimposed on the surrounding environment of the vehicle as a virtual image.
 The fourteenth configuration is as follows.
 By having the user (6) view the image display surface (4) through polarized glasses or liquid crystal shutter glasses worn by the user, the right-eye image (58) visible only to the user's right eye and the left-eye image (56) visible only to the user's left eye can each be displayed on the image display surface.
 According to the superimposed image display device having the above configuration, having the user wear polarized glasses or liquid crystal shutter glasses makes it possible for only one of the user's eyes to view the guide image.
 1 Superimposed image display device
 2 Vehicle
 3 Navigation device
 4 Front display
 5 Windshield
 6 Occupant
 7 Dashboard
 41 CPU
 42 RAM
 43 ROM
 55 First guide image
 56 Left-eye image
 57 Second guide image
 58 Right-eye image
 59 Superimposition target road surface

Claims (15)

  1.  A superimposed image display device mounted on a vehicle, which causes an image display surface capable of displaying a right-eye image visible only to a user's right eye and a left-eye image visible only to the user's left eye to be viewed with both of the user's eyes so that an image displayed on the image display surface is seen superimposed on the surrounding environment around the vehicle, the device comprising:
     position detection means for detecting the positions of both of the user's eyes; and
     display position determination means for determining the display position of an image on the image display surface based on the positions of both eyes detected by the position detection means,
     wherein, when a guide image to be presented to the user is displayed on the image display surface, the guide image is displayed as either the right-eye image or the left-eye image.
  2.  The superimposed image display device according to claim 1, wherein the guide image includes:
     a first guide image displayed as either the right-eye image or the left-eye image; and
     a second guide image displayed as both the right-eye image and the left-eye image.
  3.  The superimposed image display device according to claim 2, wherein the first guide image is an image that must remain superimposed on a specific object included in the surrounding environment seen from the vehicle, and the second guide image is an image for which maintaining superimposition on a specific object included in the surrounding environment seen from the vehicle is unnecessary.
  4.  The superimposed image display device according to any one of claims 1 to 3, wherein, when the guide image is displayed as either the right-eye image or the left-eye image, the area of the other image corresponding to the guide image displayed in the one image is a black image.
  5.  The superimposed image display device according to any one of claims 1 to 3, wherein, when the guide image is displayed as either the right-eye image or the left-eye image, the area of the other image corresponding to the guide image displayed in the one image is an image with zero luminance.
  6.  The superimposed image display device according to any one of claims 1 to 5, further comprising eye selection means for selecting the eye of the user that is to view the guide image when the guide image is displayed on the image display surface, wherein the guide image is displayed in whichever of the right-eye image and the left-eye image corresponds to the eye selected by the eye selection means.
  7.  The superimposed image display device according to claim 6, wherein the eye selection means preferentially selects an eye included in the detection range of the position detection means.
  8.  The superimposed image display device according to claim 6 or 7, wherein the eye selection means switches the eye of the user selected as the one that views the guide image, based on the positional relationship of the user's eyes with respect to the detection range of the position detection means.
  9.  The superimposed image display device according to claim 6 or 7, further comprising fatigue detection means for detecting the degree of fatigue of each of the user's eyes, wherein the eye selection means switches the eye of the user selected as the one that views the guide image based on the degree of eye fatigue.
  10.  The superimposed image display device according to claim 8 or 9, wherein the eye selection means switches the eye of the user selected as the one that views the guide image at a moment when the guide image is hidden.
  11.  The superimposed image display device according to claim 8 or 9, wherein the eye selection means switches the eye of the user selected as the one that views the guide image at a moment when the display mode of the guide image is changed.
  12.  The superimposed image display device according to any one of claims 6 to 11, wherein the display position determination means determines the display position of the guide image on the image display surface based on the position at which the guide image is to be superimposed in the surrounding environment and the position of the eye selected by the eye selection means.
  13.  The superimposed image display device according to any one of claims 1 to 12, wherein the image displayed on the image display surface is reflected by a windshield of the vehicle and viewed by the user, whereby a virtual image of the image displayed on the image display surface is seen superimposed on the surrounding environment around the vehicle.
  14.  The superimposed image display device according to any one of claims 1 to 13, wherein, by having the user view the image display surface through polarized glasses or liquid crystal shutter glasses worn by the user, the right-eye image visible only to the user's right eye and the left-eye image visible only to the user's left eye can each be displayed on the image display surface.
  15.  A computer program for causing a superimposed image display device, which is mounted on a vehicle and causes an image display surface capable of displaying a right-eye image visible only to a user's right eye and a left-eye image visible only to the user's left eye to be viewed with both of the user's eyes so that an image displayed on the image display surface is seen superimposed on the surrounding environment around the vehicle, to function as:
     position detection means for detecting the positions of both of the user's eyes; and
     display position determination means for determining the display position of an image on the image display surface based on the positions of both eyes detected by the position detection means,
     wherein, when a guide image to be presented to the user is displayed on the image display surface, the guide image is displayed as either the right-eye image or the left-eye image.
PCT/JP2019/023606 2018-10-25 2019-06-14 Superimposed-image display device, and computer program WO2020084827A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018200551A JP2020067576A (en) 2018-10-25 2018-10-25 Superimposed image display device and computer program
JP2018-200551 2018-10-25

Publications (1)

Publication Number Publication Date
WO2020084827A1 true WO2020084827A1 (en) 2020-04-30

Family

ID=70331506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/023606 WO2020084827A1 (en) 2018-10-25 2019-06-14 Superimposed-image display device, and computer program

Country Status (2)

Country Link
JP (1) JP2020067576A (en)
WO (1) WO2020084827A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007199295A (en) * 2006-01-25 2007-08-09 Epson Imaging Devices Corp Display
JP2007310285A (en) * 2006-05-22 2007-11-29 Denso Corp Display device
JP2010072596A (en) * 2008-09-22 2010-04-02 Toshiba Corp Display device and mobile body
JP2012169726A (en) * 2011-02-10 2012-09-06 Seiko Epson Corp Head-mounted display device and control method therefor
JP2012211959A (en) * 2011-03-30 2012-11-01 Brother Ind Ltd Head-mounted display
JP2015194709A (en) * 2014-03-28 2015-11-05 パナソニックIpマネジメント株式会社 image display device
JP2017185988A (en) * 2016-04-01 2017-10-12 株式会社デンソー Device for vehicle, program for vehicle, and filter design program

Also Published As

Publication number Publication date
JP2020067576A (en) 2020-04-30

Similar Documents

Publication Publication Date Title
WO2018066711A1 (en) Travel assistance device and computer program
WO2019097763A1 (en) Superposed-image display device and computer program
JP6516642B2 (en) Electronic device, image display method and image display program
WO2014208165A1 (en) Head-up display device
WO2014208164A1 (en) Head-up display device
JP2017094882A (en) Virtual image generation system, virtual image generation method and computer program
WO2019097762A1 (en) Superimposed-image display device and computer program
JP5327025B2 (en) Vehicle travel guidance device, vehicle travel guidance method, and computer program
US20140043466A1 (en) Environment image display apparatus for transport machine
JP2014120111A (en) Travel support system, travel support method, and computer program
JP2019116229A (en) Display system
JP2021039085A (en) Superimposed image display device, superimposed image drawing method, and computer program
JP4908779B2 (en) Navigation device, image display method, and image display program
JP2019056884A (en) Superimposed image display device
JP6825433B2 (en) Virtual image display device and computer program
JP2014120114A (en) Travel support system, travel support method, and computer program
WO2022209439A1 (en) Virtual image display device
JP2018173399A (en) Display device and computer program
WO2019208365A1 (en) Information display device
JP6287351B2 (en) Head-up display device
JP2023012793A (en) Superimposed image display device
JP2014120113A (en) Travel support system, travel support method, and computer program
JP6485310B2 (en) Information providing system, information providing method, and computer program
JP2009098501A (en) Visual information display device and visual information display method
JP6805974B2 (en) Driving support device and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19877042

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19877042

Country of ref document: EP

Kind code of ref document: A1