WO2023228752A1 - Image display device - Google Patents

Image display device

Info

Publication number
WO2023228752A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
observer
parallax
display device
image display
Prior art date
Application number
PCT/JP2023/017619
Other languages
French (fr)
Japanese (ja)
Inventor
Ryoichi Takeuchi (竹内 凌一)
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date
Filing date
Publication date
Application filed by Kyocera Corporation (京セラ株式会社)
Publication of WO2023228752A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/29Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area inside the vehicle, e.g. for viewing passengers or cargo
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to an image display device.
  • the present disclosure particularly relates to an image display device that can display captured images of the surroundings of a vehicle on a shielding part, thereby making the driver perceive that the scenery outside the vehicle is connected.
  • Conventionally, a technique is known in which graphics superimposed on the real environment are displayed in a vehicle or the like (see, for example, Patent Document 1). There is also a known transparency technique that uses retroreflective projection to let an observer perceive an object as if it were transparent. In this transparency technique, a computer creates a left-eye image and a right-eye image, perceived by the observer's left and right eyes respectively, based on images captured by a camera installed, for example, outside the vehicle.
  • the left-eye image and the right-eye image are projected onto a shielding part that blocks the line of sight, such as a vehicle pillar or the dashboard. The driver, as the observer, perceives the image of the scenery outside the vehicle as a three-dimensional image, so that the parts of the scenery that would otherwise be blind spots appear transparent and continuous with the scenery of the outside world.
  • An image display device includes: a first imaging unit that captures an image of the surroundings of an observer and outputs first image data; an image data processing unit that, based on the first image data, generates a first image of a range corresponding to a shielding part that blocks the observer's field of view as seen by one of the observer's left eye and right eye, and a second image of the range corresponding to the shielding part as seen by the other of the observer's left eye and right eye; an image display unit that displays the first image and the second image on the shielding part; and a line-of-sight detection unit that detects the focus position of the observer's line of sight. The second image has parallax with respect to the first image.
  • the image display section displays a parallax-free image, which is an image without parallax, on the shielding section, instead of the first image and the second image, based on the focus position of the observer's line of sight.
  • An image display device includes: an imaging unit that captures an image of an observer and the surroundings of the observer and outputs the image data obtained; an image display unit that, based on the image data, causes the observer to perceive, on a shielding part that blocks the observer's field of view, a parallax image reflecting the parallax between the observer's left eye and right eye; and a line-of-sight detection unit that detects the focus position of the observer's line of sight.
  • the image display section allows the observer to perceive a parallax-free image that does not reflect parallax between the left eye and right eye of the observer when the focus position of the observer's line of sight is on the shielding section.
  • FIG. 1 is a plan view schematically showing an example of an image display device according to an embodiment of the present disclosure.
  • FIG. 2 is a partially enlarged sectional view showing the configuration of the screen and the diffuser plate.
  • FIG. 3 is a plan view schematically showing a vehicle equipped with an image display device.
  • FIG. 4 is a side view schematically showing a vehicle equipped with an image display device.
  • FIG. 5 is a block diagram showing the configuration of the image display device.
  • FIG. 6 is a flowchart for explaining the operation of the control section.
  • FIG. 7 is a diagram showing a state where no parallax image is displayed on the shielding section.
  • FIG. 8 is a diagram showing a state in which a parallax image is displayed on the shielding part and can be perceived correctly.
  • FIG. 9 is a diagram illustrating a state in which a parallax image is displayed on a shielding part and cannot be perceived correctly.
  • FIG. 10 is a diagram for explaining the relationship between the observer's line of sight, the convergence angle, and the shielding part.
  • FIG. 11 is a flowchart of a process for generating a first image and a second image or an image without parallax.
  • FIG. 12 is a diagram illustrating an image without parallax.
  • FIG. 1 is a plan view schematically showing an image display device 1 according to the present embodiment.
  • FIG. 2 is a partially enlarged sectional view showing the configuration of the retroreflective screen 11 and the diffuser plate 16 provided in the image display device 1.
  • FIG. 3 is a plan view schematically showing a vehicle 2 on which the image display device 1 is mounted.
  • FIG. 4 is a side view schematically showing a vehicle 2 on which the image display device 1 is mounted.
  • the image display device 1 includes an imaging section, an image data processing device 33 (see FIG. 5) including a first image processing section 8 and a second image processing section 9, an image display section, and a line of sight detection section.
  • the image data processing device 33 is sometimes referred to as an image data processing section.
  • the imaging unit images the observer and the surroundings of the observer, and outputs the image data obtained by capturing the image.
  • the imaging unit includes a front exterior camera 3a, a rear exterior camera 3b, and an interior camera 6.
  • the front exterior camera 3a and the rear exterior camera 3b capture images of the surrounding scenery of the vehicle 2.
  • the external cameras including the front external camera 3a and the rear external camera 3b may be referred to as a first imaging section.
  • image data obtained by imaging the surroundings of the observer by the first imaging unit may be referred to as first image data.
  • the in-vehicle camera 6 images the driver 5.
  • the image data from the in-vehicle camera 6 is used to detect the positions of the left eye EL and right eye ER of the driver 5, the observer seated in the driver's seat 4 of the vehicle 2, and to detect the line of sight of the driver 5 from the eye positions and the pupil positions.
  • the in-vehicle camera 6 may be referred to as a second imaging section.
  • the image data obtained by photographing the observer by the second imaging unit may be referred to as second image data.
  • the imaging unit may include only one of the front exterior camera 3a and the rear exterior camera 3b.
  • the first image processing section 8 of the image data processing section generates, based on the image data output from the front exterior camera 3a and the rear exterior camera 3b, a first image of the range corresponding to the shielding part 7 as seen by one of the left eye EL and the right eye ER.
  • the shielding part 7 is an object that blocks the observer's field of view when the observer looks outside the vehicle with the left eye EL and right eye ER.
  • the first image may be an image in the range corresponding to the shielding part 7 that is seen by the right eye ER.
  • the second image processing section 9 of the image data processing section generates, based on the image data output from the front exterior camera 3a and the rear exterior camera 3b, a second image of the range corresponding to the shielding part 7 as seen by the other of the left eye EL and the right eye ER.
  • the image of the range corresponding to the shielding part 7 as seen by the left eye EL may be set as the second image.
  • the second image has parallax with respect to the first image.
  • conversely, the first image has parallax with respect to the second image. Therefore, even if the first image and the second image contain the same object, the position and shape of the object differ between the two images according to the parallax, because the object is viewed from different viewpoints.
  • the range corresponding to the shielding part 7 that can be seen by the left eye EL refers to the range that would be visible by the left eye EL if the shielding part 7 did not exist.
  • the range corresponding to the shielding part 7 that can be seen by the right eye ER refers to the range that would be visible to the right eye ER if the shielding part 7 did not exist.
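The per-eye ranges described here follow from simple projective geometry: each lateral edge of the shielding part is projected from the eye position onto the distant scene, and the interval between the projected edges is what that eye's image must reproduce. The following one-dimensional sketch illustrates this; the function name and the planar-scene simplification are illustrative assumptions, not taken from the disclosure.

```python
def occluded_interval(eye_y, occ_left, occ_right, d_occ, d_scene):
    """Project the lateral edges of a shielding part (at distance d_occ
    from the eye, with edges occ_left..occ_right) onto a scene plane at
    distance d_scene, by similar triangles.  The returned interval is
    the part of the scene hidden from this eye."""
    scale = d_scene / d_occ
    return (eye_y + (occ_left - eye_y) * scale,
            eye_y + (occ_right - eye_y) * scale)
```

Because the two eyes sit at different lateral positions, the same shielding part hides a slightly different scene interval from each eye, which is exactly why the first image and the second image differ by a parallax.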
  • the image display section displays the first image and the second image on the shielding section 7.
  • the image display unit may display a parallax image, which is an image in which the first image and the second image are mixed, on the shielding unit 7.
  • the image display unit displays the parallax image, thereby allowing the viewer to perceive an image that reflects the parallax.
  • the image display section includes a display device 10.
  • examples of the shielding part 7 include a dashboard, a door, a pillar, and the back seat 23.
  • the line of sight detection unit detects the focus position of the observer.
  • the line of sight detection unit may detect the line of sight of the observer based on image data of the observer.
  • image data captured by the observer is output from the in-vehicle camera 6.
  • the line of sight detection unit also detects the convergence angle of the observer.
  • the image data processing unit determines that the observer cannot correctly perceive the parallax image, in which the first image and the second image are mixed, when the detected line of sight and convergence angle of the observer satisfy a certain condition. The image data processing section then generates a parallax-free image, and the image display section displays the parallax-free image on the shielding section 7.
  • a parallax-free image is an image that does not reflect the viewer's parallax.
  • the convergence angle is the angle formed at the fixation point between the lines of sight of the left and right eyes.
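For a fixation point centered between the eyes, the convergence angle can be written in terms of the interpupillary distance and the focus distance. A minimal sketch under that symmetric-fixation simplification (the function name is an illustrative assumption, not from the disclosure):

```python
import math

def convergence_angle(ipd, focus_distance):
    """Convergence angle (radians) for a fixation point centered between
    the eyes: theta = 2 * atan(ipd / (2 * d)).  The angle grows as the
    focus distance shrinks and tends to zero for distant gaze."""
    return 2.0 * math.atan(ipd / (2.0 * focus_distance))
```

With a typical interpupillary distance of about 64 mm, focusing on a pillar 0.8 m away yields roughly 4.6 degrees, while gazing 30 m ahead yields about 0.12 degrees, so a threshold between the two cleanly separates near focus from distant gaze.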
  • the display device 10 includes a retroreflective screen 11 and a diffuser plate 16 provided close to the retroreflective screen 11.
  • the diffuser plate 16 may be attached and laminated to the surface of the retroreflective screen 11 on the side facing the viewer.
  • the display device 10 may include a dashboard display device 10a provided on the dashboard, a right side pillar display device 10b provided on the right side pillar, a left side pillar display device 10c provided on the left side pillar, and a back seat display device 10d provided on the back seat 23 of the rear seat 22.
  • the display device 10 also includes, as a projection section, a first projection section that projects the first image onto the retroreflective screen 11 and a second projection section that projects the second image onto the retroreflective screen 11.
  • the projection section (right side pillar projection section 12) of the right side pillar display device 10b attached to the right side pillar includes a first projection section 12R that projects a first image and a second projection section 12L that projects a second image.
  • the display devices 10a, 10b, 10c, and 10d are flexible, and are bonded to each shielding part 7 with adhesive or the like while being curved to follow the contours of each shielding part 7. Since each projection section has the same configuration, the right side pillar projection section 12 will be described in detail as an example.
  • the first projection unit 12R may include a liquid crystal display device 13R that displays the first image, and a first projection lens 14R that projects the image light of the first image emitted from the liquid crystal display device 13R onto the retroreflective screen 11.
  • the second projection unit 12L may include a liquid crystal display device 13L that displays the second image, and a second projection lens 14L that projects the image light of the second image emitted from the liquid crystal display device 13L onto the retroreflective screen 11.
  • Each of the liquid crystal display devices 13R and 13L may include a transmissive liquid crystal display element and a backlight device that emits light to the back surface of the liquid crystal display element.
  • an LED light emitting display device may be used instead of the liquid crystal display device.
  • Each projection lens 14R, 14L may be configured by a combination of a plurality of lenses, respectively, so that the first image and the second image are formed on the retroreflective screen 11 with parallax.
  • although the driver 5 is illustrated as the observer, the observer may also be a passenger sitting in the front passenger seat.
  • the first projection unit 12R may be arranged, for example, on the right side of the headrest so that its exit pupil is at the same height as and near the right eye ER of the observer.
  • the second projection section 12L may be arranged, for example, on the left side of the headrest so that its exit pupil is at the same height as and near the left eye EL of the observer.
  • the back seat projection section 24 and the dashboard projection section 25 are configured similarly to the right side pillar projection section 12. The back seat projection section 24 projects an image corresponding to the range covered by the back seat 23 onto the back seat display device 10d, and the dashboard projection unit 25 projects an image corresponding to the range covered by the dashboard onto the dashboard display device 10a.
  • the dashboard projection unit 25 may be attached to the center of the ceiling of the vehicle 2, for example.
  • the back seat projection section 24 may be attached, for example, to the upper part of the backrest of the driver's seat 4.
  • the retroreflective screen 11 has retroreflectivity and reflects incident light in the direction of incidence.
  • the image light of the first image and the image light of the second image emitted from the first projection lens 14R and the second projection lens 14L are reflected by the retroreflective screen 11 back toward the first projection lens 14R and the second projection lens 14L, respectively. Therefore, the image light of the first image and the image light of the second image, which overlap (are mixed) on the retroreflective screen 11, are perceived separately at the observer's position.
  • a diffuser plate 16 is disposed on the viewer-side surface of the retroreflective screen 11.
  • the diffusing plate 16 has a diffusing ability to direct the light reflected by the retroreflective screen 11 to both eyes of the viewer.
  • the dashboard display device 10a may use a diffuser plate 16 with a large diffusing power in the vertical direction and a small diffusing power in the horizontal direction.
  • for the other display devices as well, it is preferable that the diffusing power in the left-right direction be smaller than the diffusing power in the vertical direction.
  • the diffuser plate 16 may be, for example, a holographic optical element bonded onto the reflective surface of the retroreflective screen 11.
  • the retroreflective screen 11 may have a configuration in which a plurality of minute glass beads 11a having a diameter of, for example, 20 ⁇ m or more and 100 ⁇ m or less are arranged on a reflective film 11b.
  • the image light projected onto the retroreflective screen 11 enters the glass beads 11a, is refracted on the surface of the glass beads 11a, reaches the back surface on the reflective film 11b side, and is reflected by the reflective film 11b.
  • the light reflected by the reflective film 11b is refracted again at the back surface of the glass bead 11a, reaches the surface of the glass bead 11a, and exits along an optical path parallel to the incident light, offset from the incident path by a minute distance smaller than the diameter of the glass bead 11a; retroreflection is thereby achieved.
  • the image light of the first image for the right eye ER and the image light of the second image for the left eye EL, which overlap on the retroreflective screen 11, are separated at the observer's position and enter the right eye ER and the left eye EL separately.
  • the driver 5 who is an observer can perceive a three-dimensional image from the mixed image of the image light of the first image and the image light of the second image.
  • a mixed image of the image light of the first image and the image light of the second image reflecting the parallax of the observer is called a parallax image.
  • calculations may be executed in a coordinate system based on the vehicle body.
  • a coordinate system in which the vehicle length direction is the X axis, the vehicle width direction is the Y axis, and the vehicle height direction is the Z axis is used, and the positions of the left eye EL and right eye ER of the driver 5 are determined by the coordinates in this coordinate system.
  • the positions of both eyes of the driver 5 are determined in a coordinate system based on the vehicle body and the calculations for retroreflective projection are performed, so that continuity between the scenery outside the vehicle and the image displayed on the shielding part 7 can be maintained. Furthermore, by detecting the positions of both eyes of the driver 5, the displayed image can flexibly follow differences in the driver's body shape and posture.
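The eye positions measured by the in-vehicle camera can be brought into the vehicle-body frame (X: vehicle length, Y: vehicle width, Z: vehicle height) with an ordinary rigid transform obtained from the camera's mounting calibration. A minimal sketch with hypothetical names; the disclosure does not specify the calibration procedure:

```python
def to_vehicle_frame(p_cam, rotation, translation):
    """p_vehicle = R @ p_cam + t: express a point measured in the
    in-vehicle camera's frame in the vehicle-body coordinate system.
    `rotation` is a 3x3 matrix (row tuples) and `translation` a
    3-vector, both taken from the camera's known mounting pose."""
    return tuple(
        sum(rotation[i][j] * p_cam[j] for j in range(3)) + translation[i]
        for i in range(3)
    )
```

Once both eye positions are in this common frame, the projection of the outside scenery onto the shielding part can be computed consistently for any driver posture.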
  • the front exterior camera 3a installed at the front of the vehicle 2 and the rear exterior camera 3b installed at the rear of the vehicle 2 may be cameras equipped with fisheye lenses. By using a fisheye lens, it is possible to image the scenery outside the vehicle over a wide range of solid angles.
  • the number of cameras 3 outside the vehicle is not limited, and may be one, for example, or three or more.
  • the installation location of the vehicle exterior camera 3 is not limited as long as it is possible to image the exterior of the vehicle. That is, the vehicle exterior camera 3 may be installed outside the vehicle or may be installed inside the vehicle.
  • the in-vehicle camera 6 is installed at a position where it can image the driver 5, such as a position adjacent to a room mirror.
  • the vehicle exterior cameras 3 and the in-vehicle camera 6 may be, for example, CCD cameras, but are not limited to a specific type of camera.
  • a line-of-sight detection unit is configured including the in-vehicle camera 6, the line-of-sight recognition device 31, and the in-vehicle camera control device 37.
  • FIG. 5 is a block diagram showing the configuration of the image display device 1.
  • Image data of images captured by the front exterior camera 3a and the rear exterior camera 3b are sent to the exterior camera control device 35.
  • the vehicle exterior camera control device 35 constitutes a part of the image display device 1.
  • the vehicle exterior camera control device 35 performs necessary signal processing (analog-to-digital conversion, for example) on the image data and outputs it to the image data processing device 33.
  • the image display device 1 includes a seating sensor 36 that detects whether or not the driver 5 is seated.
  • the seating sensor 36 is provided in the driver's seat 4.
  • the seating sensor 36 may be constituted by a known load sensor or limit switch.
  • the seating sensor 36 installed in the driver's seat 4 detects that the driver 5 is seated.
  • the detection result of the seating sensor 36 is sent to the line-of-sight recognition device 31, and the line-of-sight detection unit starts measuring the positions of the driver's 5 eyes.
  • the line of sight recognition device 31 extracts the positions of the left eye EL and right eye ER and the pupil position of the driver 5 from the captured image of the in-vehicle camera 6 through image recognition processing, and calculates the convergence angle and line of sight of the driver 5.
  • a known method may be used to calculate the vergence angle and line of sight.
  • the line of sight recognition device 31 may treat the eyeball as a sphere and calculate the convergence angle and line of sight using the deviation of the position of the pupil from the reference position (eyeball angle).
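Treating the eyeball as a sphere, the pupil lies on the sphere's surface, so a lateral pupil offset s from the straight-ahead reference corresponds to a horizontal gaze rotation of asin(s / r) for eyeball radius r. A minimal sketch under that assumption (the 12 mm default radius is a typical anatomical value and the function names are illustrative, not from the disclosure):

```python
import math

def gaze_yaw(pupil_offset, eyeball_radius=12.0):
    """Horizontal gaze angle (radians) from the pupil's lateral offset,
    both in the same length unit: offset = r * sin(yaw)."""
    return math.asin(pupil_offset / eyeball_radius)

def convergence_from_yaws(yaw_left, yaw_right):
    """Convergence angle as the difference between the two horizontal
    gaze angles, measured in a common sense (e.g. positive = leftward);
    the yaws differ when the lines of sight meet at a finite distance."""
    return abs(yaw_left - yaw_right)
```

In practice the same construction is applied per eye, and the pair of gaze rays also gives the focus position as their (approximate) intersection.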
  • the calculation result of the line of sight recognition device 31 is output to the image data processing device 33.
  • the image data processing device 33 executes image processing to make the shielding portion 7 appear transparent, and the display control device 39 controls the display device 10 based on the result of the image processing.
  • a control unit 50 is configured including an external camera control device 35, an image data processing device 33, an in-vehicle camera control device 37, a line of sight recognition device 31, and a display control device 39.
  • the control unit 50 controls the image display device 1.
  • the control unit 50 is realized by a processor such as an electronic control unit (ECU) as a hardware resource, and a computer-readable program as a software resource.
  • Control unit 50 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that is specialized for specific processing.
  • the dedicated processor may include an application specific integrated circuit (ASIC).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the control unit 50 may be either an SoC (System-on-a-Chip) or an SiP (System In-a-Package) in which one or more processors cooperate.
  • the control unit 50 includes a storage unit, and may store various information or programs for operating each component of the image display device 1 in the storage unit.
  • the storage unit may be composed of, for example, a semiconductor memory.
  • FIG. 6 is a flowchart for explaining the operation of the control unit 50.
  • the seating sensor 36 detects that the driver 5 is seated in the driver's seat 4.
  • the vehicle exterior camera control device 35 operates each vehicle exterior camera 3a, 3b to image the surroundings of the vehicle 2 (step S1).
  • the in-vehicle camera control device 37 operates the line-of-sight recognition device 31 and the in-vehicle camera 6, and the in-vehicle camera 6 starts capturing an image.
  • the line of sight recognition device 31 detects the line of sight of the driver 5 based on the image captured by the in-vehicle camera 6 (step S2).
  • the image data processing device 33 cuts out the image to be projected onto the shielding part 7 from the images captured by the vehicle exterior cameras 3, and generates a first image and a second image (step S3).
  • when the observer is in a state where the parallax image cannot be perceived correctly, a parallax-free image, which is an image without parallax, is generated instead of the first image and the second image. Details of the parallax-free image will be described later.
  • the display control device 39 causes the shielding unit 7 to display the first image and the second image (step S4).
  • the display control device 39 causes the right side pillar display device 10b attached to the right side pillar, for example, to display an image that is originally blocked by the right side pillar and is not visible to the driver 5.
  • the parallax-free image is displayed in step S4.
  • when the observer's line of sight and convergence angle satisfy certain conditions, the image display device 1 determines that the observer is in a state where the image in which the first image and the second image are mixed cannot be perceived correctly, and displays the parallax-free image on the shielding unit 7.
  • for the observer to perceive the parallax image correctly, the observer's focus must be on the scenery, that is, the observer must be looking into the distance.
  • if the observer instead focuses on the shielding structure itself, the observer may not be able to perceive the projected image correctly.
  • in the image display device 1, when the observer's line of sight is within the range of the shielding unit 7 and the convergence angle detected by the line-of-sight detection unit is equal to or greater than the threshold, it is determined that the observer cannot perceive the parallax image correctly. In that case, display of the mixed image of the first image and the second image is stopped and the parallax-free image is displayed, so that the driver 5 is not left viewing a double image. This avoids misidentification of objects and can help improve driving comfort. That is, the image display device 1 according to the present embodiment lets the observer perceive the shielding part 7 as if it were transparent, while avoiding situations in which the image cannot be perceived correctly.
  • Such display processing of a parallax-free image is executed by the image data processing device 33 regardless of whether the display target is the left or right side pillar.
  • display processing of the parallax-free image may also be performed with the dashboard, the doors, and the back seat 23 as display targets. It is particularly suitable for the pillars, because objects that were conventionally in blind spots can be perceived as if the pillars were transparent, without being misrecognized.
  • FIG. 7 is a diagram showing a state in which no parallax image (an image in which the first image and the second image are mixed) is displayed on the shielding unit 7.
  • FIG. 8 is a diagram showing a state in which a parallax image is displayed on the shielding unit 7 and can be perceived correctly by the observer.
  • FIG. 9 is a diagram showing a state in which a parallax image is displayed on the shielding unit 7 and the observer cannot perceive it correctly.
  • in the state of FIG. 9, the object 44 is perceived as a double image. In other words, the actual scenery of the outside world does not appear connected, and the image displayed on the right side pillar display device 10b cannot be perceived as a continuous, transparent view.
  • when the image display device 1 detects, based on the focus position of the line of sight, that the observer cannot perceive the parallax image correctly, it displays the parallax-free image instead of the first image and the second image.
  • In step S3 of FIG. 6, the image data processing device 33 determines, based on the detected line of sight and convergence angle, whether the observer is in a state in which he or she cannot perceive the image correctly.
  • FIG. 10 is a diagram for explaining the relationship between the observer's line of sight, the convergence angle, and the shielding part 7. The line of sight and the convergence angle are calculated by the line-of-sight recognition device 31 based on the image captured by the in-vehicle camera 6, as described above. The convergence angle is indicated by θ in FIG. 10. As shown in FIG. 10, the display surface is the surface of the shielding part 7 on which the first image and the second image are displayed. The image data processing device 33 determines whether the observer's line of sight is within the range of the shielding part 7.
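As an illustrative sketch (not part of the original disclosure), the convergence angle can be obtained as the angle between the two gaze direction vectors that a line-of-sight recognizer such as device 31 might output. The vector representation and the helper name are assumptions for illustration only.

```python
import math

def convergence_angle(gaze_right, gaze_left):
    """Angle (radians) between the right-eye and left-eye gaze direction vectors."""
    dot = sum(r * l for r, l in zip(gaze_right, gaze_left))
    norm_r = math.sqrt(sum(c * c for c in gaze_right))
    norm_l = math.sqrt(sum(c * c for c in gaze_left))
    # Clamp to [-1, 1] to avoid math domain errors from floating-point rounding.
    cos_theta = max(-1.0, min(1.0, dot / (norm_r * norm_l)))
    return math.acos(cos_theta)

# Hypothetical example: eyes 65 mm apart, both fixating a point 1 m ahead on
# the midline, so each gaze vector tilts inward by atan(0.0325 / 1.0).
theta = convergence_angle((-0.0325, 1.0, 0.0), (0.0325, 1.0, 0.0))
```

Parallel gaze vectors (fixation at infinity) give an angle of zero; the nearer the fixation point, the larger the angle.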
  • Here, the range of the shielding part 7 is its width on the display surface. If the line of sight falls within the range between the left end 7A and the right end 7B of the shielding part 7, it is determined that the observer is looking at the shielding part 7. It may be determined whether both the line of sight of the right eye ER and that of the left eye EL are within the range of the shielding part 7, and, when at least one of them is not, it may be determined that the observer is not looking at the shielding part 7. As another example, one of the two lines of sight may be used as a representative to determine whether the line of sight is within the range of the shielding part 7.
  • When the line of sight is within the range of the shielding part 7, the image data processing device 33 next determines whether the observer is looking into the distance. When the convergence angle of the observer is equal to or greater than the threshold, the image data processing device 33 determines that the observer is focusing on the display surface, or on something in front of it, and is unable to correctly perceive the image based on the parallax image. When the observer is in such a state, the image data processing device 33 generates a parallax-free image, and the image display unit displays the parallax-free image on the shielding part 7.
  • The above threshold is determined based on the positional relationship between the positions of the observer's eyes and the display surface of the shielding part 7. The threshold may be determined based on an image captured by the in-vehicle camera 6, for example, when the vehicle 2 starts driving.
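One way to realize this geometric relationship, sketched here under assumed figures that do not appear in the original text, is to set the threshold to the convergence angle of an observer fixating a point on the display surface directly ahead: θ = 2·atan((b/2)/d), where b is the interocular distance and d the eye-to-surface distance.

```python
import math

def convergence_threshold(interocular_m, eye_to_surface_m):
    # Convergence angle when fixating the display surface straight ahead:
    # theta = 2 * atan((b / 2) / d). Fixating anything nearer than the
    # surface gives a larger angle; fixating farther gives a smaller one.
    return 2.0 * math.atan((interocular_m / 2.0) / eye_to_surface_m)

# Hypothetical figures: 65 mm interocular distance, pillar surface 0.7 m away.
threshold = convergence_threshold(0.065, 0.7)
# Fixating a distant object (e.g. 10 m away) yields an angle below threshold.
far_angle = convergence_threshold(0.065, 10.0)
```

Both the interocular distance and the eye-to-surface distance could, as the text notes, be measured from the in-vehicle camera 6 image when driving starts.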
  • FIG. 11 is a flowchart for explaining details of the process (corresponding to step S3 in FIG. 6) of generating the first image and the second image or the parallax-free image.
  • The image data processing device 33 generates a parallax-free image (step S13) when the detected line of sight of the observer is within the range of the shielding part 7 (YES in step S11) and the convergence angle is equal to or greater than the threshold (YES in step S12). If the detected line of sight of the observer is not within the range of the shielding part 7 (NO in step S11), or if the convergence angle is smaller than the threshold (NO in step S12), the image data processing device 33 generates the first image and the second image (step S14).
  • the image display unit displays the generated image (“parallax-free image” or “first image and second image”) on the shielding unit 7.
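The decision of steps S11–S14 above can be sketched as follows. This is a minimal illustration, assuming (hypothetically) that the gaze of each eye is reduced to a horizontal coordinate on the display surface and that the shield spans the interval between its left end 7A and right end 7B.

```python
def select_image(gaze_x_right, gaze_x_left, left_edge, right_edge,
                 convergence_angle, threshold):
    """Mirror of flowchart steps S11-S14: decide which image set to display."""
    # Step S11: are both lines of sight within the range of the shield?
    on_shield = (left_edge <= gaze_x_right <= right_edge and
                 left_edge <= gaze_x_left <= right_edge)
    # Step S12: a convergence angle at or above the threshold means the
    # observer is focusing on (or in front of) the display surface.
    if on_shield and convergence_angle >= threshold:
        return "parallax-free image"      # step S13
    return "first and second images"      # step S14
```

As the text notes, a variant could test only one representative eye in step S11 instead of both.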
  • the parallax-free image may be colorless and transparent, or may be a monochrome uniform image.
  • the single color may be an achromatic color such as white or gray.
  • the parallax-free image may be an image outside the vehicle.
  • The parallax-free image may be an image, in the range corresponding to the shielding part 7, that is seen by either the right eye ER or the left eye EL.
  • The parallax-free image may be an image, in the range corresponding to the shielding part 7, that is seen from a virtual viewpoint between the right eye ER and the left eye EL. From the viewpoint of alerting the observer, the parallax-free image may include text information or other content.
  • The text information may be, for example, a warning urging the observer to look far away, a notification urging the observer to focus outside the vehicle, or a notification urging the observer not to focus on the shielding part 7.
  • the first image and the second image may be displayed on the shielding unit 7 as the same image.
  • For example, a parallax-free image including the warning "Check the front outside the vehicle" is displayed on the shielding part 7, prompting the driver 5, who is the observer, to focus on a distant object.
  • As a result, the driver 5 is expected to become able to correctly perceive the image based on the parallax image again.
  • As described above, the image display device 1 allows the observer to perceive the shielding part 7 as if it were transparent, while avoiding a state in which the observer cannot perceive the image correctly.
  • Embodiments according to the present disclosure can also be realized as a method executed by a processor included in an apparatus, as a program, or as a storage medium on which a program is recorded. It is to be understood that these are also encompassed within the scope of the present disclosure.
  • 1 Image display device; 2 Vehicle; 3 External camera; 4 Driver's seat; 5 Driver; 6 In-vehicle camera; 7 Shielding part; 8 First image processing section; 9 Second image processing section; 10 Display device; 11 Retroreflective screen; 11a Glass beads; 11b Reflective film; 12 Projection section; 12L Second projection section; 12R First projection section; 13L, 13R Liquid crystal display device; 14L Second projection lens; 14R First projection lens; 16 Diffusion plate; 22 Rear seat; 23 Back seat; 24 Back seat projection section; 25 Dashboard projection section; 31 Line-of-sight recognition device; 33 Image data processing device; 35 External camera control device; 36 Seating sensor; 37 In-vehicle camera control device; 39 Display control device; 40 Right side pillar; 42 Windshield; 44 Object; 46 Right window glass; 50 Control unit; EL Left eye; ER Right eye

Abstract

Provided is an image display device that can avoid a state of incorrect perception by causing an observer to perceive a shielding portion as being transparent. The image display device comprises: an image data processing unit for generating a first image that is viewed by one of the left eye and the right eye of an observer and is generated in a range corresponding to a shielding portion obstructing the view of the observer and a second image that is viewed by the other of the left eye and the right eye of the observer and is generated in the range corresponding to the shielding portion, the first image being generated on the basis of first image data obtained by capturing an image around the observer by means of a first imaging unit; an image display unit that displays the first image and the second image in the shielding portion; and a sightline detection unit that detects the focus position of the observer. The second image has a parallax with respect to the first image, and the image display unit displays, in the shielding portion, an image having no parallax instead of the first image and the second image on the basis of the focus position of the sightline of the observer.

Description

Image display device

Cross-reference of related applications
This application claims priority to Japanese Patent Application No. 2022-087156 (filed on May 27, 2022), the entire disclosure of which is incorporated herein by reference.
The present disclosure relates to an image display device, and particularly to an image display device that, by displaying captured images of a vehicle's surroundings on a shielding part, can make the driver perceive the scenery outside the vehicle as connected.
Conventionally, a technique is known in which graphics superimposed on a real environment are displayed in a vehicle or the like (see, for example, Patent Document 1). There is also a known transparency technique that uses retroreflective projection technology to let an observer perceive an image as if an object were transparent. In this transparency technique, a computer creates a left-eye image and a right-eye image, perceived by the observer's left and right eyes respectively, based on images captured by, for example, a camera installed outside the vehicle. The left-eye image and the right-eye image are projected onto a shielding part that blocks the line of sight, such as a vehicle pillar or dashboard, and the driver, as the observer, perceives the captured image of the scenery outside the vehicle as a stereoscopic image, so that the blind-spot portion appears transparent and connected to the scenery of the outside world.
Patent Document 1: Japanese Patent Application Publication No. 2015-194473
 An image display device according to an embodiment of the present disclosure includes:
 an image data processing unit that, based on first image data obtained by a first imaging unit capturing images of the surroundings of an observer, generates a first image, in a range corresponding to a shielding part that blocks the observer's field of view, to be seen by one of the observer's left eye and right eye, and a second image, in the range corresponding to the shielding part, to be seen by the other of the observer's left eye and right eye;
 an image display unit that displays the first image and the second image on the shielding part; and
 a line-of-sight detection unit that detects the focus position of the observer,
 wherein the second image has parallax with respect to the first image, and
 the image display unit displays on the shielding part, based on the focus position of the observer's line of sight, a parallax-free image, which is an image having no parallax, instead of the first image and the second image.
 An image display device according to another embodiment of the present disclosure includes:
 an imaging unit that captures images of an observer and the observer's surroundings and outputs the image data obtained;
 an image display unit that, based on the image data, causes the observer to perceive, on a shielding part that blocks the observer's field of view, a parallax image reflecting the parallax between the observer's left eye and right eye; and
 a line-of-sight detection unit that detects the focus position of the observer's line of sight,
 wherein, when the focus position of the observer's line of sight is on the shielding part, the image display unit causes the observer to perceive a parallax-free image that does not reflect the parallax between the observer's left eye and right eye.
FIG. 1 is a plan view schematically showing an example of an image display device according to an embodiment of the present disclosure.
FIG. 2 is a partially enlarged cross-sectional view showing the configuration of the screen and the diffuser plate.
FIG. 3 is a plan view schematically showing a vehicle equipped with the image display device.
FIG. 4 is a side view schematically showing a vehicle equipped with the image display device.
FIG. 5 is a block diagram showing the configuration of the image display device.
FIG. 6 is a flowchart for explaining the operation of the control unit.
FIG. 7 is a diagram showing a state in which no parallax image is displayed on the shielding part.
FIG. 8 is a diagram showing a state in which a parallax image is displayed on the shielding part and is perceived correctly.
FIG. 9 is a diagram showing a state in which a parallax image is displayed on the shielding part and cannot be perceived correctly.
FIG. 10 is a diagram for explaining the relationship between the observer's line of sight, the convergence angle, and the shielding part.
FIG. 11 is a flowchart of the process of generating the first image and the second image or the parallax-free image.
FIG. 12 is a diagram illustrating a parallax-free image.
An image display device according to an embodiment of the present disclosure will be described below with reference to the drawings. In the drawings, the same or corresponding parts are denoted by the same reference numerals. In the following description of the embodiment, description of the same or corresponding parts is omitted or simplified as appropriate.
FIG. 1 is a plan view schematically showing an image display device 1 according to the present embodiment. FIG. 2 is a partially enlarged cross-sectional view showing the configuration of the retroreflective screen 11 and the diffuser plate 16 provided in the image display device 1. FIG. 3 is a plan view schematically showing a vehicle 2 on which the image display device 1 is mounted. FIG. 4 is a side view schematically showing the vehicle 2 on which the image display device 1 is mounted.
The image display device 1 includes an imaging unit, an image data processing device 33 (see FIG. 5) including a first image processing section 8 and a second image processing section 9, an image display unit, and a line-of-sight detection unit. The image data processing device 33 is sometimes referred to as an image data processing unit.
The imaging unit images the observer and the observer's surroundings and outputs the image data obtained. In this embodiment, the imaging unit includes a front exterior camera 3a, a rear exterior camera 3b, and an in-vehicle camera 6. The front exterior camera 3a and the rear exterior camera 3b capture the scenery around the vehicle 2. Hereinafter, the exterior cameras including the front exterior camera 3a and the rear exterior camera 3b may be referred to as a first imaging unit, and the image data obtained by the first imaging unit imaging the observer's surroundings may be referred to as first image data. The in-vehicle camera 6 images the driver 5. The image data of the in-vehicle camera 6 is used to detect the positions of the left eye EL and the right eye ER of the driver 5, the observer seated in the driver's seat 4 of the vehicle 2, and to detect the driver 5's line of sight from the positions of the eyes and pupils. Hereinafter, the in-vehicle camera 6 may be referred to as a second imaging unit, and the image data obtained by the second imaging unit imaging the observer may be referred to as second image data. The imaging unit may include only one of the front exterior camera 3a and the rear exterior camera 3b.
The first image processing section 8 of the image data processing unit generates, based on the image data output from the front exterior camera 3a and the rear exterior camera 3b, a first image of the range corresponding to the shielding part 7 as seen by one of the left eye EL and the right eye ER. Here, the shielding part 7 is an object that blocks the observer's line of sight, that is, an object that blocks the observer's view when looking outside the vehicle with the left eye EL and the right eye ER. For example, the image of the range corresponding to the shielding part 7 seen by the right eye ER may be used as the first image.
The second image processing section 9 of the image data processing unit generates, based on the image data output from the front exterior camera 3a and the rear exterior camera 3b, a second image of the range corresponding to the shielding part 7 as seen by the other of the left eye EL and the right eye ER. For example, the image of the range corresponding to the shielding part 7 seen by the left eye EL may be used as the second image. The second image has parallax with respect to the first image, and likewise the first image has parallax with respect to the second image. Therefore, even for the same object contained in both the first image and the second image, the position and shape of the object in each image differ according to the parallax, because the viewpoints from which the object is seen differ. Here, the range corresponding to the shielding part 7 as seen by the left eye EL refers to the range that would be visible to the left eye EL if the shielding part 7 were absent; similarly, the range corresponding to the shielding part 7 as seen by the right eye ER refers to the range that would be visible to the right eye ER if the shielding part 7 were absent.
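The "range corresponding to the shielding part" for one eye can be illustrated with a simple ray construction, sketched below under assumptions not in the original text: the geometry is reduced to one horizontal dimension, the shield is represented by its two edges at a fixed depth, and the outside scene is approximated by a plane at a larger depth.

```python
def occluded_interval(eye_x, eye_z, edge_left_x, edge_right_x, shield_z, scene_z):
    """For one eye, the horizontal interval of a distant scene plane hidden by
    the shield -- i.e. the range that would be visible if the shield were absent.
    Coordinates are hypothetical: x across the vehicle, z forward."""
    def project(edge_x):
        # Extend the ray eye -> shield edge until it reaches the scene plane.
        t = (scene_z - eye_z) / (shield_z - eye_z)
        return eye_x + t * (edge_x - eye_x)
    return project(edge_left_x), project(edge_right_x)

# Right eye at x = 0.03 m; shield edges at x = 0.5 and 0.7 m, 0.8 m ahead;
# scene approximated by a plane 10 m ahead.
lo, hi = occluded_interval(0.03, 0.0, 0.5, 0.7, 0.8, 10.0)
```

Running the same construction from the left eye's position yields a shifted interval, which is exactly why the first and second images differ by the parallax described above.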
The image display unit displays the first image and the second image on the shielding part 7. The image display unit may display on the shielding part 7 a parallax image, that is, an image in which the first image and the second image are mixed; by displaying the parallax image, it causes the observer to perceive an image that reflects the parallax. In this embodiment, the image display unit includes the display device 10. Examples of the shielding part 7 in the vehicle 2 include the dashboard, doors, pillars, and back seat 23. By displaying images captured by the exterior camera 3 and image-processed on the shielding part 7, the driver 5, as the observer, can perceive the image as connected with the scenery outside the vehicle.
The line-of-sight detection unit detects the focus position of the observer. The line-of-sight detection unit may detect the observer's line of sight based on image data in which the observer is captured; in this embodiment, that image data is output from the in-vehicle camera 6. As described in detail later, in this embodiment the line-of-sight detection unit also detects the observer's convergence angle. When the detected line of sight and convergence angle of the observer satisfy a certain condition, the image data processing unit determines that the observer is in a state in which he or she cannot correctly perceive the parallax image in which the first image and the second image are mixed. The image data processing unit then generates a parallax-free image, and the image display unit displays the parallax-free image on the shielding part 7. A parallax-free image is an image in which the observer's parallax is not reflected. Here, the convergence angle is the angle at which the lines of sight of the left and right eyes intersect.
As shown in FIGS. 1 and 2, the display device 10 includes a retroreflective screen 11 and a diffuser plate 16 provided close to the retroreflective screen 11. The diffuser plate 16 may be attached and laminated to the surface of the retroreflective screen 11 facing the observer. The display device 10 may include a dashboard display device 10a provided on the dashboard, a right side pillar display device 10b provided on the right side pillar, a left side pillar display device 10c provided on the left side pillar, and a back seat display device 10d provided on the back seat 23 of the rear seat 22.
The display device 10 also includes, as projection sections, a first projection section that projects the first image onto the retroreflective screen 11 and a second projection section that projects the second image onto the retroreflective screen 11. For example, the projection section of the right side pillar display device 10b attached to the right side pillar (right side pillar projection section 12) includes a first projection section 12R that projects the first image and a second projection section 12L that projects the second image. The display devices 10a, 10b, 10c, and 10d are flexible and are bonded to the respective shielding parts 7 with adhesive or the like while being flexibly curved along the contours of each shielding part 7. Since the projection sections have the same configuration, the right side pillar projection section 12 will be described in detail as an example.
The first projection section 12R may include a liquid crystal display device 13R that displays the first image and a first projection lens 14R that projects the image light of the first image emitted from the liquid crystal display device 13R onto the retroreflective screen 11. The second projection section 12L may include a liquid crystal display device 13L that displays the second image and a second projection lens 14L that projects the image light of the second image emitted from the liquid crystal display device 13L onto the retroreflective screen 11. Each of the liquid crystal display devices 13R and 13L may include a transmissive liquid crystal display element and a backlight device that emits light onto the back surface of the element; an LED display device may be used instead of a liquid crystal display device. Each of the projection lenses 14R and 14L may be configured as a combination of a plurality of lenses so that the first image and the second image are formed on the retroreflective screen 11 with parallax between them. Although the driver 5 is given here as an example of the observer, a passenger sitting in the front passenger seat may be the observer. The first projection section 12R may be arranged, for example, on the right side of the headrest so that its exit pupil is at the same height as, and near, the observer's right eye ER. Similarly, the second projection section 12L may be arranged, for example, on the left side of the headrest so that its exit pupil is at the same height as, and near, the observer's left eye EL.
The back seat projection section 24 and the dashboard projection section 25 are configured similarly to the right side pillar projection section 12. The back seat projection section 24 projects onto the back seat display device 10d an image corresponding to the range blocked by the back seat 23, and the dashboard projection section 25 projects onto the dashboard display device 10a an image corresponding to the range blocked by the dashboard.
The dashboard projection section 25 may be attached, for example, to the center of the ceiling of the vehicle 2. The back seat projection section 24 may be attached, for example, to the upper part of the backrest of the driver's seat 4.
The retroreflective screen 11 is retroreflective and reflects incident light back in the direction of incidence. The image light of the first image and of the second image emitted from the first projection lens 14R and the second projection lens 14L is reflected by the retroreflective screen 11 back toward the first projection lens 14R and the second projection lens 14L, respectively. Therefore, the image light of the first image and the image light of the second image, which overlap (are mixed) on the retroreflective screen 11, are perceived separately at the observer's position. In this embodiment, the diffuser plate 16 is disposed on the observer-side surface of the retroreflective screen 11 and has a diffusing capability that directs the light reflected by the retroreflective screen 11 toward both of the observer's eyes. For example, when the dashboard projection section 25 is located above the observer, a diffuser plate 16 whose diffusing power is large in the vertical direction and small in the horizontal direction may be used in the dashboard display device 10a. To suppress image cross-talk, such as the image for the right eye ER entering the left eye EL, and to allow the observer to perceive a clear stereoscopic image, the diffusing power in the horizontal direction is preferably smaller than that in the vertical direction. The diffuser plate 16 may be, for example, a holographic optical element bonded onto the reflective surface of the retroreflective screen 11.
As shown in FIG. 2, the retroreflective screen 11 may have a configuration in which a plurality of minute glass beads 11a with diameters of, for example, 20 μm to 100 μm are arranged on a reflective film 11b. The image light projected onto the retroreflective screen 11 enters a glass bead 11a, is refracted at its surface, reaches the back surface on the reflective film 11b side, and is reflected by the reflective film 11b. The reflected light is refracted again at the back surface of the glass bead 11a, reaches the surface of the bead, and travels along an optical path parallel to the incident light, offset from the incident path by a minute distance no greater than the bead's diameter; retroreflection is thereby achieved.
As described above, the image light of the first image for the right eye ER and the image light of the second image for the left eye EL, which overlap on the retroreflective screen 11, are separated at the observer's position and enter the right eye ER and the left eye EL individually. The driver 5, as the observer, can perceive a stereoscopic image from the mixed image of the image light of the first image and the image light of the second image. This mixed image, which reflects the observer's parallax, is called a parallax image.
Here, in order to make the image displayed on the shielding part 7 appear connected with the scenery outside the vehicle seen through the windshield and the rear window glass, calculations for the retroreflective projection may be performed using a coordinate system based on the vehicle body. For example, a coordinate system may be used in which the vehicle-length direction is the X axis, the vehicle-width direction is the Y axis, and the vehicle-height direction is the Z axis, and the positions of the driver 5's left eye EL and right eye ER may be defined as coordinates in this system. By determining the positions of both of the driver 5's eyes in this vehicle-body coordinate system and performing the projection calculations, continuity can be achieved between the scenery outside the vehicle and the image displayed on the shielding part 7. Moreover, by detecting the positions of the driver 5's eyes, the displayed image can flexibly follow differences in the driver 5's build and posture.
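The core of such a projection calculation can be sketched as follows. This is an illustrative simplification, not the patent's actual algorithm: the display surface is approximated by a plane perpendicular to the X axis of the vehicle-body frame described above, and each scene point is mapped onto that plane along the line from one eye to the point.

```python
def to_display_point(eye, scene_point, display_x):
    """Where a scene point must be drawn on a display plane (x = display_x,
    vehicle-body frame: X length, Y width, Z height) so that, from the given
    eye, it lines up with the real point behind the shield. Running this
    separately for each eye yields two images with the correct parallax."""
    ex, ey, ez = eye
    px, py, pz = scene_point
    t = (display_x - ex) / (px - ex)   # fraction of the ray at the display plane
    return (display_x, ey + t * (py - ey), ez + t * (pz - ez))

# Eye at the origin height 1.2 m; scene point 10 m ahead, 2 m to the side;
# display plane 0.8 m ahead of the eye.
drawn = to_display_point((0.0, 0.0, 1.2), (10.0, 2.0, 1.2), 0.8)
```

Because `t` depends on the eye position, the same scene point maps to slightly different display coordinates for the left and right eyes, which is what makes the displayed image connect seamlessly with the directly visible scenery from each eye's viewpoint.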
 The front exterior camera 3a installed at the front of the vehicle 2 and the rear exterior camera 3b installed at the rear may be cameras fitted with fisheye lenses. By using fisheye lenses, the scenery outside the vehicle can be imaged over a wide solid angle. The number of exterior cameras 3 is not limited; there may be one, for example, or three or more. The installation location of the exterior cameras 3 is also not limited as long as the exterior of the vehicle can be imaged; that is, an exterior camera 3 may be installed outside or inside the vehicle. The in-vehicle camera 6 is installed at a position from which it can image the driver 5, for example a position adjacent to the rearview mirror. The exterior cameras 3 and the in-vehicle camera 6 may be, for example, CCD cameras, but are not limited to a particular type of camera. The in-vehicle camera 6, the line-of-sight recognition device 31, and the in-vehicle camera control device 37 together constitute a line-of-sight detection unit.
 FIG. 5 is a block diagram showing the configuration of the image display device 1. Image data of the images captured by the front exterior camera 3a and the rear exterior camera 3b is sent to the exterior camera control device 35, which constitutes part of the image display device 1. The exterior camera control device 35 performs the necessary signal processing on the image data (analog-to-digital conversion, for example) and outputs the result to the image data processing device 33.
 In the present embodiment, the image display device 1 further includes a seating sensor 36 that detects whether the driver 5 is seated. The seating sensor 36 is provided in the driver's seat 4 and may be constituted by a known load sensor or limit switch.
 When the driver 5 sits in the driver's seat 4, the seating sensor 36 installed in the driver's seat 4 detects that the driver 5 is seated. The detection result of the seating sensor 36 is sent to the line-of-sight recognition device 31, and the line-of-sight detection unit starts measuring, among other things, the positions of both eyes of the driver 5. The line-of-sight recognition device 31 extracts the positions of the left eye EL and the right eye ER and the pupil positions of the driver 5 from the image captured by the in-vehicle camera 6 through image recognition processing, and calculates the convergence angle and the line of sight of the driver 5. A known method may be used to calculate the convergence angle and the line of sight. The line-of-sight recognition device 31 may treat the eyeball as a sphere and calculate the convergence angle and the line of sight from the deviation of the pupil position from a reference position (the eyeball angle). The calculation result of the line-of-sight recognition device 31 is output to the image data processing device 33. The image data processing device 33 executes image processing for making the shielding part 7 be perceived as if it were transparent, and the display control device 39 controls the display device 10 based on the result of that image processing.
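 As one sketch of such a known method (not necessarily the calculation used by the line-of-sight recognition device 31), the convergence angle can be obtained as the angle between the two lines of sight at the fixation point, given the 3D eye positions and the gaze point; all coordinates and values below are hypothetical.

```python
import math

def convergence_angle(left_eye, right_eye, gaze_point):
    """Angle (radians) between the two lines of sight at the gaze point.

    All arguments are 3D points. A larger angle means the fixation
    point is closer to the observer."""
    def unit(frm, to):
        v = [t - f for f, t in zip(frm, to)]
        n = math.sqrt(sum(c * c for c in v))
        return [c / n for c in v]

    u = unit(gaze_point, left_eye)
    w = unit(gaze_point, right_eye)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, w))))
    return math.acos(dot)

# Eyes 64 mm apart fixating a point 1 m straight ahead:
theta = convergence_angle((0.0, 0.032, 0.0), (0.0, -0.032, 0.0), (1.0, 0.0, 0.0))
# theta is approximately 2 * atan(0.032 / 1.0), about 0.064 rad (roughly 3.7 degrees)
```

 In practice the gaze point itself would be estimated from the two eyeball angles; the sketch only shows how the angle follows from the geometry.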
 In the image display device 1, a control unit 50 is constituted by the exterior camera control device 35, the image data processing device 33, the in-vehicle camera control device 37, the line-of-sight recognition device 31, and the display control device 39. The control unit 50 controls the image display device 1. The control unit 50 is realized by a processor such as an electronic control unit (ECU) as a hardware resource and a computer-readable program as a software resource. The control unit 50 may include one or more processors. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). A processor may include a programmable logic device (PLD), and the PLD may include a field-programmable gate array (FPGA). The control unit 50 may be either an SoC (System-on-a-Chip) or an SiP (System-in-a-Package) in which one or more processors cooperate. The control unit 50 may include a storage unit that stores various information, programs for operating each component of the image display device 1, and the like. The storage unit may be constituted by, for example, a semiconductor memory.
 FIG. 6 is a flowchart for explaining the operation of the control unit 50. When the driver 5 sits in the driver's seat 4 and starts driving the vehicle 2 with a start button or the like, the seating sensor 36 detects that the driver 5 is seated in the driver's seat 4. The exterior camera control device 35 operates the exterior cameras 3a and 3b to image the surroundings of the vehicle 2 (step S1). The in-vehicle camera control device 37 activates the line-of-sight recognition device 31 and the in-vehicle camera 6, and imaging by the in-vehicle camera 6 starts. The line-of-sight recognition device 31 detects the line of sight of the driver 5 based on the image captured by the in-vehicle camera 6 (step S2). The image data processing device 33 cuts out the image to be projected onto the shielding part 7 from the images captured by the exterior cameras 3, and generates the first image and the second image (step S3). In step S3, if the observer's line of sight and convergence angle satisfy a certain condition, a parallax-free image, which is an image without parallax, is generated instead of the first image and the second image. Details of the parallax-free image will be described later. When the first image and the second image are generated, the display control device 39 causes them to be displayed on the shielding part 7 (step S4). For example, the display control device 39 causes the right side pillar display device 10b attached to the right side pillar to display the image that would otherwise be blocked by the right side pillar and invisible to the driver 5. If a parallax-free image was generated in step S3, the parallax-free image is displayed in step S4.
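 The sequence of steps S1 to S4 can be summarized in a Python-style sketch. This is structural only: the device objects, method names, and parameters below are placeholders, not the actual interfaces of the devices described above.

```python
def display_cycle(exterior_cams, gaze_recognizer, image_processor, display_ctrl,
                  shield_range, theta_threshold):
    """One pass through steps S1-S4 of FIG. 6 (illustrative sketch only)."""
    surroundings = [cam.capture() for cam in exterior_cams]       # step S1
    gaze, theta = gaze_recognizer.detect()                        # step S2
    on_shield = shield_range[0] <= gaze <= shield_range[1]
    if on_shield and theta >= theta_threshold:                    # step S3
        frames = image_processor.make_no_parallax_image(surroundings)
    else:
        frames = image_processor.make_parallax_pair(surroundings)
    display_ctrl.show(frames)                                     # step S4
```

 In operation this cycle would repeat continuously while the driver is seated, so that the displayed image tracks both the exterior scene and the driver's gaze.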
 Here, when the observer's line of sight and convergence angle satisfy a certain condition, the image display device 1 according to the present embodiment determines that the observer is in a state in which the mixed image of the first image and the second image cannot be perceived correctly, and displays a parallax-free image on the shielding part 7. For the image of the scenery projected onto a pillar, dashboard, or the like to be perceived correctly as continuous with the scenery of the outside world, the observer's eyes must be focused on the scenery, that is, the observer must be looking into the distance. However, when looking at an interior structure such as a pillar or dashboard, the observer may focus on the structure itself and fail to perceive the projected image correctly. The image display device 1 according to the present embodiment determines that the observer is in a state of incorrect perception when the observer's line of sight is within the range of the shielding part 7 and the convergence angle detected by the line-of-sight detection unit is equal to or greater than a threshold. In that state, the device stops displaying the mixed first and second images and displays a parallax-free image instead, which prevents the driver 5, the observer, from seeing a double image (misidentifying an object) and helps improve driving comfort. That is, the image display device 1 according to the present embodiment can make the observer perceive the shielding part 7 as if it were transparent while avoiding states of incorrect perception.
 Such parallax-free image display processing is executed by the image data processing device 33 for either of the left and right side pillars as the display target. The parallax-free image display processing may also be executed with the dashboard, a door, or the back seat 23 as the display target. Because an object that would conventionally be in a blind spot can be perceived as if the obstruction were transparent, without being misidentified, the processing is particularly suitable when a pillar is the display target.
 The parallax-free image display processing is described in detail below with reference to the drawings. FIG. 7 shows a state in which no parallax image is displayed on the shielding part 7. When no parallax image (a mixture of the first image and the second image) is displayed on the right side pillar display device 10b, only the building visible through the windshield 42 and the right window glass 46 can be seen. FIG. 8 shows a state in which a parallax image is displayed on the shielding part 7 and is perceived correctly by the observer. When the parallax image is displayed and the observer is focused to look into the distance (that is, outside the vehicle), the observer perceives an image of the object 44 on the right side pillar display device 10b portion of the right side pillar 40. In contrast, FIG. 9 shows a state in which a parallax image is displayed on the shielding part 7 but cannot be perceived correctly by the observer. When the parallax image is displayed and the observer is focused on the right side pillar 40 itself or on something in front of it, the object 44 is perceived as a double image. In other words, the image does not connect with the actual outside scene, and the image displayed on the right side pillar display device 10b cannot be perceived as a continuous see-through view. Therefore, from the viewpoint of driving comfort, when the image display device 1 detects, based on the focus position of the line of sight, that the observer is in a state of incorrect perception, it displays a parallax-free image instead of the first image and the second image.
 In step S3 of FIG. 6, the image data processing device 33 determines, based on the detected line of sight and convergence angle, whether the observer is in a state of incorrect perception. FIG. 10 is a diagram for explaining the relationship between the observer's line of sight and convergence angle and the shielding part 7. As described above, the line of sight and the convergence angle are calculated by the line-of-sight recognition device 31 based on the image captured by the in-vehicle camera 6. The convergence angle is denoted by θ in FIG. 10. The display surface is the surface of the shielding part 7 on which the first image and the second image are displayed, as shown in FIG. 10. The image data processing device 33 determines whether the observer's line of sight is within the range of the shielding part 7. The range of the shielding part 7 is its width on the display surface: if the line of sight falls within the range bounded by the left end 7A and the right end 7B of the shielding part 7, the observer is determined to be looking at the shielding part 7. Here, the determination of whether the line of sight is within the range of the shielding part 7 may be made for each of the right eye ER and the left eye EL, and the observer may be determined not to be looking at the shielding part 7 when at least one line of sight is outside the range. As another example, one of the lines of sight of the right eye ER and the left eye EL may serve as a representative and be used alone in the determination. When the observer is looking at the shielding part 7, the image data processing device 33 then determines whether the observer is looking into the distance. When the observer's convergence angle is equal to or greater than a threshold, the image data processing device 33 determines that the observer is looking at the display surface or at something in front of it and is in a state in which the image based on the parallax images cannot be perceived correctly. In that case, the image data processing device 33 generates a parallax-free image, and the image display unit displays the parallax-free image on the shielding part 7. The above threshold is determined based on the positional relationship between the positions of both eyes of the observer and the display surface of the shielding part 7. The threshold may be determined based on an image captured by the in-vehicle camera 6, for example when driving of the vehicle 2 starts.
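 One plausible way to derive such a threshold from the eye-to-display geometry, offered here as an illustrative assumption rather than the formula the patent specifies, is the convergence angle of an observer fixating exactly on the display surface:

```python
import math

def convergence_threshold(interpupillary_distance, distance_to_display):
    """Convergence angle (radians) when fixating exactly on the display
    surface. Angles at or above this value imply focus on the surface
    or in front of it; smaller angles imply focus beyond it."""
    return 2.0 * math.atan(interpupillary_distance / (2.0 * distance_to_display))

# Hypothetical values: eyes 64 mm apart, pillar display surface about 0.7 m away.
threshold = convergence_threshold(0.064, 0.7)
# threshold is roughly 0.091 rad (about 5.2 degrees)
```

 Under this sketch, re-measuring the eye positions when driving starts (as described above) amounts to recomputing `distance_to_display` for the current seating posture.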
 FIG. 11 is a flowchart for explaining the details of the processing that generates the first image and the second image or the parallax-free image (corresponding to step S3 in FIG. 6). When the detected line of sight of the observer is within the range of the shielding part 7 (YES in step S11) and the convergence angle is equal to or greater than the threshold (YES in step S12), the image data processing device 33 generates a parallax-free image (step S13). When the detected line of sight of the observer is not within the range of the shielding part 7 (NO in step S11), or the convergence angle is smaller than the threshold (NO in step S12), the image data processing device 33 generates the first image and the second image (step S14). The image display unit displays the generated image (the parallax-free image, or the first image and the second image) on the shielding part 7.
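 The branch structure of steps S11 to S14 reduces to a short conditional. The function and parameter names below are hypothetical, with the gaze position and shield range taken in display-surface coordinates:

```python
def select_generation_mode(gaze_pos, shield_left, shield_right,
                           convergence_angle, threshold):
    """Return which image(s) step S3 should generate, per FIG. 11."""
    if shield_left <= gaze_pos <= shield_right:   # step S11: YES
        if convergence_angle >= threshold:        # step S12: YES
            return "parallax_free"                # step S13
    return "first_and_second"                     # step S14

# Looking at the pillar with a convergence angle above the threshold:
mode = select_generation_mode(0.3, 0.0, 0.5, 0.10, 0.09)
# mode == "parallax_free"
```

 Note that either NO branch (gaze off the shield, or convergence below the threshold) falls through to the normal parallax pair, matching the flowchart.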
 The parallax-free image may be colorless and transparent, or it may be a uniform single-color image; the single color may be an achromatic color such as white or gray. The parallax-free image may be an image of the outside of the vehicle. It may be the image of the range corresponding to the shielding part 7 as seen by either the right eye ER or the left eye EL, or as seen from a virtual viewpoint between the right eye ER and the left eye EL. From the viewpoint of alerting the observer, the parallax-free image may include text information or other content. The text information may be, for example, a warning urging the observer to look into the distance, a notification urging the observer to focus outside the vehicle, or a notification urging the observer not to focus on the shielding part 7. When a parallax-free image is displayed, the first image and the second image may be displayed on the shielding part 7 as an identical image. For example, as shown in FIG. 12, when a parallax-free image including the warning "Check ahead outside the vehicle" is displayed on the shielding part 7, the driver 5, the observer, tries to focus on the distance. When a parallax image is subsequently displayed on the shielding part 7, the driver 5 is then expected to perceive the image based on the parallax image correctly.
 As described above, with the configuration described above, the image display device 1 according to the present embodiment can make the observer perceive the shielding part as if it were transparent while avoiding states of incorrect perception.
 Although embodiments according to the present disclosure have been described based on the drawings and examples, it should be noted that a person skilled in the art could easily make various variations or modifications based on the present disclosure, and that such variations and modifications therefore fall within the scope of the present disclosure. For example, the functions included in each component or step can be rearranged so long as no logical contradiction results, and a plurality of components or steps can be combined into one or divided. Although the embodiments according to the present disclosure have been described with a focus on the device, they can also be realized as a method including the steps executed by each component of the device, as a method or program executed by a processor provided in the device, or as a storage medium on which such a program is recorded. It is to be understood that these are also encompassed within the scope of the present disclosure.
 1 image display device
 2 vehicle
 3 exterior camera
 4 driver's seat
 5 driver
 6 in-vehicle camera
 7 shielding part
 8 first image processing unit
 9 second image processing unit
 10 display device
 11 retroreflective screen
 11a glass beads
 11b reflective film
 12 projection unit
 12L second projection unit
 12R first projection unit
 13L, 13R liquid crystal display device
 14L second projection lens
 14R first projection lens
 16 diffusion plate
 22 rear seat
 23 back seat
 24 back seat projection unit
 25 dashboard projection unit
 31 line-of-sight recognition device
 33 image data processing device
 35 exterior camera control device
 36 seating sensor
 37 in-vehicle camera control device
 39 display control device
 40 right side pillar
 42 windshield
 44 object
 46 right window glass
 50 control unit
 EL left eye
 ER right eye

Claims (10)

  1.  An image display device comprising:
     an image data processing unit configured to generate, based on first image data obtained by a first imaging unit imaging the surroundings of an observer, a first image of a range corresponding to a shielding part that blocks the observer's field of view, the first image being seen by one of the observer's left eye and right eye, and a second image of the range corresponding to the shielding part, the second image being seen by the other of the observer's left eye and right eye;
     an image display unit configured to display the first image and the second image on the shielding part; and
     a line-of-sight detection unit configured to detect the position of the observer's focus,
     wherein the second image has parallax with respect to the first image, and
     the image display unit displays, based on the focus position of the observer's line of sight, a parallax-free image, which is an image without parallax, on the shielding part instead of the first image and the second image.
  2.  The image display device according to claim 1, wherein the image display unit displays the parallax-free image, which is an image without parallax, on the shielding part instead of the first image and the second image when the focus position of the observer's line of sight is on the shielding part.
  3.  The image display device according to claim 1, wherein the line-of-sight detection unit detects the observer's line of sight based on second image data obtained by a second imaging unit imaging the observer, and
     the image display unit displays the parallax-free image on the shielding part when the observer's line of sight is within the range of the shielding part and the observer's convergence angle detected by the line-of-sight detection unit is equal to or greater than a threshold.
  4.  The image display device according to any one of claims 1 to 3, wherein the parallax-free image includes text information.
  5.  The image display device according to claim 4, wherein the text information is a warning urging the observer to look into the distance.
  6.  The image display device according to any one of claims 1 to 3, wherein the image display device is mounted on a vehicle and the shielding part is a pillar.
  7.  The image display device according to any one of claims 1 to 3, wherein the image display unit includes:
     a retroreflective screen provided on the shielding part;
     a first projection unit configured to project the first image onto the retroreflective screen; and
     a second projection unit configured to project the second image onto the retroreflective screen.
  8.  The image display device according to claim 7, wherein, when the parallax-free image is displayed on the image display unit, the first projection unit and the second projection unit project an identical image onto the retroreflective screen.
  9.  The image display device according to claim 7, wherein the image display unit further includes a diffusion plate provided close to the retroreflective screen.
  10.  An image display device comprising:
     an imaging unit configured to image an observer and the surroundings of the observer and to output image data obtained by the imaging;
     an image display unit configured to cause the observer to perceive, at a shielding part that blocks the observer's field of view, a parallax image reflecting the parallax between the observer's left eye and right eye based on the image data; and
     a line-of-sight detection unit configured to detect the focus position of the observer's line of sight,
     wherein the image display unit causes the observer to perceive a parallax-free image that does not reflect the parallax between the observer's left eye and right eye when the focus position of the observer's line of sight is on the shielding part.
PCT/JP2023/017619 2022-05-27 2023-05-10 Image display device WO2023228752A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-087156 2022-05-27
JP2022087156A JP2023174351A (en) 2022-05-27 2022-05-27 image display device

Publications (1)

Publication Number Publication Date
WO2023228752A1 true WO2023228752A1 (en) 2023-11-30

Family

ID=88919152

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017619 WO2023228752A1 (en) 2022-05-27 2023-05-10 Image display device

Country Status (2)

Country Link
JP (1) JP2023174351A (en)
WO (1) WO2023228752A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040196214A1 * 1993-09-14 2004-10-07 Maguire Francis J. Method and apparatus for eye tracking in a vehicle
JP2005515487A * 2002-01-04 2005-05-26 NeuroK LLC 3D image projection using retroreflective screen
JP2015012559A * 2013-07-02 2015-01-19 Denso Corporation Projection type display device
WO2017018122A1 * 2015-07-29 2017-02-02 FUJIFILM Corporation Projection display device and projection control method
JP2017193190A * 2016-04-18 2017-10-26 Sony Corporation Image display device, image display device and mobile body
US20180290593A1 * 2017-04-10 2018-10-11 Hyundai Motor Company Pillar display system for blind spot of vehicle
JP2020091409A * 2018-12-06 2020-06-11 JVCKenwood Corporation Display device, discrimination method and irradiation method
US20200344463A1 * 2017-12-22 2020-10-29 Mirage 3.4D Pty Ltd Camera Projection Technique System and Method


Also Published As

Publication number Publication date
JP2023174351A (en) 2023-12-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23811627

Country of ref document: EP

Kind code of ref document: A1