WO2006118057A1 - Image display device - Google Patents

Image display device

Info

Publication number
WO2006118057A1
WO2006118057A1 (PCT/JP2006/308448)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
display device
observer
line
Prior art date
Application number
PCT/JP2006/308448
Other languages
English (en)
Japanese (ja)
Inventor
Tatsuki Okamoto
Yukio Satoh
Tatsuyuki Kawamura
Yasuyuki Kono
Masatsugu Kidode
Original Assignee
National University Corporation NARA Institute of Science and Technology
Priority date
Filing date
Publication date
Application filed by National University Corporation NARA Institute of Science and Technology
Publication of WO2006118057A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus
    • G02B27/022Viewing apparatus
    • G02B27/024Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparancies
    • G02B27/026Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparancies and a display device, e.g. CRT, LCD, for adding markings or signs or to enhance the contrast of the viewed object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • H04N5/7475Constructional details of television projection apparatus
    • H04N5/7491Constructional details of television projection apparatus of head mounted projectors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the present invention relates to an image display device, and more particularly to an eyeball projection type image display device that displays a virtual image (including a video image) superimposed on a real image.
  • a conventional display device such as an optical see-through display has several problems.
  • a conventional display device has a narrow viewing angle at which a real image can be obtained.
  • the brightness of the real image changes depending on the observer's (user's) line of sight (that is, on the object being watched), whereas the brightness of the virtual image is constant, so the observer feels a sense of incongruity.
  • whereas the distance to the gazing point on the actually observed object (real image) changes according to the observer's line-of-sight direction, the related virtual image always looks the same; therefore, the perspective relationship between the real image and the virtual image cannot be obtained, and a more realistic (verisimilar) mixed reality cannot be achieved.
  • For example, according to a conventional display device proposed in Japanese Patent Laid-Open No. 8-313843 (Patent Document 1), a highly realistic image can be realized by combining and displaying a background image (wide-field image) displayed at low resolution and a high-resolution narrow-field image that moves following the line of sight.
  • However, the above display device always displays the wide-field image and the narrow-field image without regard to their distances from the observer (the eyeball), so the distance between an object included in the wide-field image and an object included in the narrow-field image cannot be accurately recognized. That is, according to the display device of Patent Document 1, it is not possible to express the perspective of an image that depends on differences in distance (depth) along the viewing direction.
  • Patent Document 2 Japanese Patent Application Laid-Open No. 11-313843
  • Patent Document 2 discloses an image display device that synthesizes and displays a first image with a wide field of view and a second fine, high-resolution image.
  • In the above video display device, it is suggested that the virtual image be displayed with a depth of field such that it always appears clear, without depending on the lens effect of the eyeball.
  • However, the depth of field of the virtual image is set arbitrarily by the observer, without being compared with the distance to the target object of the real image obtained by actual observation.
  • An object of the present invention is to provide an image display device that forms a composite image by appropriately fusing (superimposing) a virtual image with a real image and displays the composite image without giving the observer a sense of incongruity.
  • An image display device according to the present invention includes: a gaze direction detection unit that detects the gaze direction of an observer's eyeball; an image capturing unit that is disposed on the optical axis of the observer's eyeball and captures a real image in the gaze direction of the eyeball; a virtual image storage unit that stores a virtual image; and a display unit that displays to the observer a composite image in which the virtual image stored in the virtual image storage unit is superimposed on the real image captured by the image capturing unit.
  • With this configuration, a realistic composite image in which a virtual image is superimposed on a real image can be displayed without a sense of incongruity, thereby realizing a more convincing mixed reality.
  • FIG. 1 is a schematic diagram showing an image display apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing each component unit of the image display device in FIG. 1.
  • FIG. 3 is a schematic view showing a modification of the display unit of FIG.
  • FIG. 4 is a schematic view showing components of the display unit of FIG.
  • FIG. 5 is a schematic diagram showing a process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
  • FIG. 6 is a schematic diagram showing a process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
  • FIG. 7 is a schematic view showing another modification of the display unit of FIG.
  • FIG. 8 is a schematic view showing still another modification of the display unit of FIG.
  • FIG. 9 is a schematic view showing still another modification of the display unit of FIG.
  • FIG. 10 is a schematic view showing a modification of the drive unit in FIG. 1.
  • FIG. 11 is a schematic diagram showing a process of removing the drive unit of FIG. 1 from the line-of-sight direction when a failure occurs in the image display apparatus according to Embodiment 1.
  • FIG. 12 is a schematic diagram showing an image display apparatus according to Embodiment 2 of the present invention.
  • FIG. 13 is a schematic diagram showing an image display apparatus according to Embodiment 3 of the present invention.
  • FIG. 14 is a schematic diagram showing an arrangement position of an eyeball, a real image, and a virtual image when an observer observes a real image and a virtual image at the same time using the image display device of the third embodiment.
  • FIG. 15 is a schematic diagram showing an optical system that realizes Maxwellian vision using a point light source (directional light source) employed in an image display apparatus according to Embodiment 4 of the present invention.
  • FIG. 16 is a schematic diagram showing an optical system using a normal surface light source (diffusive light source).
  • 11: Housing, 12: LCD panel (display unit), 14: First mirror (optical component), 16: Eyeball camera (line-of-sight direction detection unit), 18: Camera for line-of-sight direction shooting (imaging unit), 20: Second mirror, 22: Light source, 23: Infrared light source, 24: Eyepiece, 25: Correction lens, 32: Optical shutter,
  • 62: Control unit, 64: Non-volatile memory (virtual image storage unit), 66: Input unit,
  • LS: Gaze direction
  • EB: Eyeball
  • CB: Calibration signal (light flux)
  • GP: Gazing point
  • RI: Real image
  • VI: Virtual image.
  • FIG. 1 is a schematic diagram showing an image display device 1 according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing each component unit of the image display device 1 of FIG. 1. Note that this image display device is normally used attached to one of the observer's eyes (left or right). In the following embodiments, the image display device attached to the left eye is described unless otherwise specified; it is to be understood that the image display device mounted on the right eye is configured symmetrically with respect to the symmetry plane SS indicated by the one-dot chain line in FIG. 1.
  • An image display device 1 includes a display unit 10, a drive unit 40 that moves the display unit 10, and an information processing unit 60 connected to the display unit 10 and the drive unit 40 by wire or wirelessly so as to control them.
  • The display unit 10 includes a display unit 12 such as a liquid crystal display (LCD) panel disposed inside a housing 11 having a substantially hemispherical outer shape, an optical component 14 such as a semitransparent mirror, and an eyeball photography camera (line-of-sight direction detection unit) 16 for detecting the line-of-sight direction LS of the observer (user) indicated by the broken-line arrow in FIG. 1. Further, the display unit 10 includes a camera (imaging unit) 18 for capturing the real image observed in the line-of-sight direction LS. According to the configuration shown in FIG. 1, the imaging unit 18 is located outside the housing 11 and is disposed in the line-of-sight direction LS.
  • The display unit 12, the optical component 14, the line-of-sight direction detection unit 16, and the imaging unit 18 constituting the display unit 10 are fixed with respect to each other in the housing 11, and move together as the housing 11 moves while maintaining their relative positional relationship.
  • These cameras are preferably configured using a digital image sensor, such as a charge-coupled device (CCD) or a CMOS imaging device, so that the obtained image information can be processed digitally by the information processing unit 60.
  • The drive unit 40 includes a first arm 46 connected at one end to the housing 11 of the display unit 10 via a first pivot part (rotary drive part) 42 and having a second pivot part 44 at its other end, and a second arm 48 slidably attached to the second pivot part 44.
  • The first arm 46 (including the first and second pivot parts 42 and 44) and the second arm 48 can move the display unit 10 to an arbitrary position in response to a command from the information processing unit 60.
  • The information processing unit 60 is a general information control terminal such as a personal computer or a portable computer, and has a control unit 62 such as a central processing unit (CPU), a virtual image storage unit 64 composed of a nonvolatile memory such as a hard disk or flash memory for storing virtual images, and an input unit 66 composed of a keyboard or touch panel.
  • The line-of-sight direction detection unit 16 detects, through the semitransparent mirror 14, the pupil position of the observer's eyeball EB, that is, the observer's line-of-sight direction LS, and transmits the line-of-sight direction data to the control unit 62 of the information processing unit 60.
  • the control unit 62 drives the drive unit 40 to move the display unit 10 so that the imaging unit 18 is arranged in the detected line-of-sight direction LS.
  • Since the series of operations of the line-of-sight direction detection unit 16, the information processing unit 60, and the drive unit 40 is performed continuously while the display unit 10 is mounted on the observer, the imaging unit 18 always captures a real image of the object observed in the observer's gaze direction LS. Real image data captured by the imaging unit 18 is sequentially transmitted to the control unit 62. Similarly, the virtual image data stored in the virtual image storage unit 64 is transmitted to the control unit 62, where it is digitally processed and superimposed on the real image captured by the imaging unit 18 to generate composite image data.
  • The composite image data is transmitted to the display unit 10 and displayed on the display unit 12; the light beam from the display is reflected by the mirror 14 and projected onto the observer's eyeball EB.
  • the control unit 62 adjusts the brightness of the real image and the virtual image to generate composite image data, so that a more natural composite image can be displayed to the observer.
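  • The superimposition performed by the control unit 62 can be pictured as a pixel-wise blend of the captured real image and the stored virtual image. The sketch below is only an illustration, not the patent's actual implementation; the `composite` function, the grayscale pixel representation, and the `alpha` opacity parameter are assumptions introduced here.

```python
def composite(real, virtual, alpha):
    """Superimpose a virtual image onto a real image.

    `real` and `virtual` are same-sized 2D lists of grayscale pixel
    values in [0, 255]; `alpha` is the opacity of the virtual image
    (0 = invisible, 1 = fully opaque). Both names are hypothetical.
    """
    out = []
    for r_row, v_row in zip(real, virtual):
        out.append([round((1 - alpha) * r + alpha * v)
                    for r, v in zip(r_row, v_row)])
    return out

# A 2x2 real image and a virtual overlay, blended half-and-half.
real = [[100, 100], [100, 100]]
virtual = [[200, 0], [0, 200]]
print(composite(real, virtual, 0.5))  # [[150, 50], [50, 150]]
```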
  • The line-of-sight direction detection unit 16 constantly monitors the observer's line-of-sight direction and transmits line-of-sight direction data to the control unit 62. Therefore, when the observer's line of sight moves, the control unit 62 issues a command to the drive unit 40, and the display unit 10 is moved so that the imaging unit 18 captures an image in the new line-of-sight direction LS. At this time, preferably, the drive unit 40 moves the display unit 10 on a spherical surface centered on the observer's eyeball EB, so that the distance from the observer's eyeball EB to the display unit 10 remains constant.
  • In this way, the real image in the line-of-sight direction LS is always captured following the observer's line of sight, and a composite image in which the virtual image is superimposed on it can be displayed to the observer, so a more natural mixed reality can be given to the observer.
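  • The idea of keeping the display unit on a spherical surface centered on the eyeball EB can be illustrated with a small coordinate computation. This is a hypothetical sketch only: the `display_position` function and the azimuth/elevation parametrization of the gaze direction LS are assumptions, not part of the patent.

```python
import math

def display_position(azimuth_deg, elevation_deg, radius):
    """Return the (x, y, z) point at `radius` from the eyeball centre
    along the gaze direction, so the drive unit can move the display
    unit there while keeping its distance to the eye constant."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = radius * math.cos(el) * math.cos(az)
    y = radius * math.cos(el) * math.sin(az)
    z = radius * math.sin(el)
    return (x, y, z)

# Looking straight ahead (0, 0): the display sits on the optical axis.
print(display_position(0, 0, 0.05))  # (0.05, 0.0, 0.0)
```

Whatever gaze angles are supplied, the returned point lies at the same distance from the eyeball centre, which is the property the passage above describes.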
  • The imaging unit 18 of the display unit 10 shown in FIG. 1 is located outside the housing 11 and is disposed in the line-of-sight direction LS, but the invention is not limited to this. That is, in the image display device 1 shown in FIG. 3, the imaging unit 18 is disposed inside the housing 11. In addition to the first mirror 14 that reflects the image displayed on the display unit 12 toward the observer's eyeball EB, a second mirror 20 is provided in the housing 11 to project the real image in the line-of-sight direction LS onto the imaging unit 18. Thus, as in the first embodiment, the real image in the line-of-sight direction is picked up by the imaging unit 18 and superimposed on the virtual image by the control unit 62, and the composite image is displayed on the display unit 12, reflected by the first mirror 14, and shown to the observer.
  • Although the first mirror 14 has been described as being semitransparent, it may be a polarization separation mirror.
  • the second mirror 20 may be a dichroic mirror that reflects visible light and transmits infrared light.
  • The display unit 10 has a light source 22, a transmissive LCD panel (display unit) 12 disposed adjacent to the light source 22, a deflecting beam splitter (optical component) 14 that reflects the light flux from the LCD panel 12 toward the eyeball EB, and an eyepiece 24 that focuses the light beam onto the eyeball EB.
  • the line-of-sight direction detection unit 16 and the imaging unit 18 are assembled so that their vertical center axes are arranged in a straight line.
  • In the display unit 10 configured as described above, white light from the light source 22 is transmitted through the transmissive LCD panel 12, and the light beam including the composite image displayed on the transmissive LCD panel 12 is reflected by the deflecting beam splitter 14 and projected onto the observer's eyeball EB. As a result, the observer can see the composite image displayed on the transmissive LCD panel 12.
  • According to the image display device 1, it is thus possible to follow the observer's line of sight, always capture the real image in the line-of-sight direction, and display it to the observer.
  • Although the line-of-sight direction detection unit 16 and the imaging unit 18 are easy to assemble (fix) so that their vertical central axes lie on a straight line, it is difficult to accurately align that vertical central axis with the line-of-sight direction LS. It is therefore necessary to perform a process (calibration) that reliably aligns the central axis with the line-of-sight direction LS.
  • On the LCD panel 12, an image of a predetermined shape such as a circle is displayed at the center of the image sensor 13 (that is, the center of the imaging unit 18), and the resulting calibration signal (light beam) CB is directed toward the semitransparent mirror 14 (arrow 26a), reflected by the semitransparent mirror 14, and irradiated onto the eyeball EB (arrow 26b).
  • The calibration light beam CB is reflected by the eyeball EB in the line-of-sight direction LS, passes through the semitransparent mirror 14, and is detected by the line-of-sight direction detection unit 16 (arrow 26c). That is, if the center of the line-of-sight direction detection unit 16 (the calibration signal CB of predetermined shape) and the line-of-sight direction LS (the intersection 28 of the one-dot chain lines in FIGS. 6(a) and (b)) match, the light beam (image) of predetermined shape is detected at the intersection 28 of the one-dot chain lines, as shown in FIG. 6(a); but if the central axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS are not accurately aligned, the image of the predetermined shape deviates from the intersection 28 of the one-dot chain lines, as shown in FIG. 6(b).
  • In that case, based on the image detected by the line-of-sight direction detection unit 16 as in FIG. 6(b), the control unit 62 calculates the amount of deviation of the central axis of the imaging unit 18 (that is, of the calibration light beam CB) from the intersection 28 of the one-dot chain lines (that is, from the line-of-sight direction LS). Then, in accordance with this amount of deviation of the central axes of the imaging unit 18 and the line-of-sight direction detection unit 16 with respect to the gaze direction LS, the control unit 62 moves the display unit 10 using the drive unit 40 so that the axes are aligned, as shown in FIG. 6(a).
  • Such an image of a predetermined shape (calibration light beam CB) is preferably projected onto the eyeball EB, while the observer observes a predetermined gazing point, for a duration shorter than the observer can perceive.
  • This makes it possible to calibrate the central axis of the gaze direction detection unit 16 (and the imaging unit 18) to the gaze direction LS without bothering the observer as much as possible.
  • such a calibration process is periodically performed so that the central axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS are always aligned.
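  • The deviation calculation in this calibration process can be sketched as measuring how far the detected calibration spot lies from the sensor centre. This is an illustrative assumption only: the function names, the pixel units, and the one-pixel tolerance are invented here, not taken from the patent.

```python
def calibration_offset(spot_xy, center_xy):
    """Deviation of the detected calibration spot from the sensor
    centre (i.e. from the intersection 28 marking the line-of-sight
    direction LS). Coordinates are in pixels (an assumption)."""
    return (spot_xy[0] - center_xy[0], spot_xy[1] - center_xy[1])

def needs_correction(offset, tol=1.0):
    """True when the axis deviates by more than `tol` pixels, in
    which case the drive unit would be commanded to move the
    display unit."""
    dx, dy = offset
    return (dx * dx + dy * dy) ** 0.5 > tol

off = calibration_offset((322.5, 239.0), (320.0, 240.0))
print(off, needs_correction(off))  # (2.5, -1.0) True
```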
  • To do so, the line-of-sight direction detection unit 16 constantly monitors the line-of-sight direction LS. Specifically, as shown in FIG. 7, the line-of-sight direction detection unit 16 has an infrared light source 23 such as an infrared LED lamp; infrared light from the infrared light source 23 passes through the semitransparent mirror 14, and the pupil position of the eyeball EB, that is, the line-of-sight direction LS, is detected by detecting the infrared light reflected from the observer's eyeball EB.
  • According to this configuration, the line-of-sight direction LS is detected using infrared light that is not perceived by the observer, so the observer's line-of-sight direction LS can be constantly monitored without giving the observer a sense of incongruity.
  • The display unit 10 described above with reference to FIG. 7 includes a line-of-sight direction photographing camera (imaging unit) 18 for capturing the real image observed in the line-of-sight direction LS and a line-of-sight direction detection unit 16 for detecting the observer's line-of-sight direction. However, the present invention is not limited to this; as shown in FIG. 8, the imaging unit 18 and the line-of-sight direction detection unit 16 may be combined, and a single CCD or CMOS imaging device 30 may be used to image both the real image and the pupil of the eyeball EB.
  • The imaging device 30 is generally formed using a silicon substrate, and can be configured to pick up the real image with its front side facing the line-of-sight direction LS while detecting infrared light irradiated from its back side.
  • In this case, an optical shutter 32 such as a liquid crystal shutter is arranged in front of the imaging unit 18; the optical shutter 32 is opened to pick up the real image and closed to detect the line of sight.
  • the image display apparatus 1 can always display the real image in the line-of-sight direction LS to the observer following the line of sight of the observer.
  • the gaze direction detection unit 16 and its peripheral circuit can be shared (omitted). Therefore, the number of parts can be reduced and the display unit 10 can be reduced in size and weight.
  • Although a transmissive LCD panel has been employed as the display unit 12 in the above description, the invention is not limited to this. That is, as shown in FIG. 9, the display unit 10 may include a light source 22 and a display unit 12 such as a reflective LCD panel. In this case, the white light from the light source 22 passes through the deflecting beam splitter 14, and the light beam formed into an image by the reflective LCD panel 12 is reflected by the deflecting beam splitter 14 and irradiated onto the observer's eyeball EB. Thus, the observer can observe the composite image displayed on the reflective LCD panel 12. Since a display unit 10 including a reflective LCD panel 12 can generally achieve higher-definition image quality than one including a transmissive LCD panel, the observer can observe a more precise image using the reflective display unit 10 shown in FIG. 9.
  • Although the drive unit 40 has been described as having the first and second arms 46 and 48, it may have other structures.
  • As shown in FIG. 10, the drive unit 40 is roughly composed of a hollow guide housing 50 formed by a substantially hemispherical outer peripheral wall 56, an inner peripheral wall 57 made of a transparent member, and an end wall 58, and the display unit 10 is slidably disposed in the hollow guide housing 50.
  • In the hollow guide housing 50, a plurality of electromagnets 52 are disposed, while a plurality of permanent magnets 54 are disposed in the display unit 10.
  • The control unit 62 adjusts the magnetic force of each electromagnet 52 in the hollow guide housing 50 (the amount of current flowing through the electromagnet 52) so that the imaging unit 18 is arranged in the detected line-of-sight direction LS, whereby the display unit 10 can be moved along the hollow guide housing 50 (arrow 56).
  • Preferably, when a failure occurs in the display unit 10 or the information processing unit 60 and an appropriate composite image cannot be displayed to the observer, the control unit 62 removes the display unit 10 from the observer's line-of-sight direction.
  • Failures of the display unit 10 and the information processing unit 60 include, but are not limited to, interruption of image data transmission from the line-of-sight direction detection unit 16 and the imaging unit 18 to the control unit 62, malfunction of the display unit 12, power failure, and malfunction of the control unit 62.
  • Removal of the display unit 10 can be realized by any means understood by those skilled in the art. For example, a spring member (not shown) that urges the display unit 10 out of the line of sight is released in response to the failure, moving the first arm 46 from the state of FIG. 11(a) to the state of FIG. 11(b).
  • Embodiment 2 of the image display apparatus according to the present invention will be described below with reference to FIG.
  • In Embodiment 2, the real image is directly observed by the observer, and only the virtual image is projected onto the observer's eyeball EB; in all other respects, the display unit 10 has the same configuration as the image display device 1 of Embodiment 1, so detailed description of the overlapping components is omitted. The same components as those in Embodiment 1 are denoted by the same reference numerals.
  • The display unit 10 generally includes a light source 22 disposed inside the housing 11; a display unit 12 such as a liquid crystal display (LCD) panel; a dichroic mirror 15 that reflects infrared light and transmits visible light; a semitransparent mirror 14; an eyepiece 24 and a correction lens 25 for converging the light flux from the line-of-sight direction LS onto the eyeball EB; a line-of-sight direction detection unit 16 for detecting the observer's line-of-sight direction LS; and an imaging unit 18 for measuring the brightness of the real image observed in the line-of-sight direction LS.
  • In this configuration, the observer can directly see the object in the line-of-sight direction LS through the eyepiece 24, the semitransparent mirror 14, and the correction lens 25. Further, part of the luminous flux from the line-of-sight direction LS is reflected by the semitransparent mirror 14, and its light amount (the actual brightness of the image observed in the line-of-sight direction LS) is detected by the imaging unit 18.
  • The line-of-sight direction detection unit 16 has an infrared light source 23. Infrared light from the infrared light source 23 is reflected by the dichroic mirror 15 and the semitransparent mirror 14 and projected onto the eyeball EB, and the infrared light reflected from the eyeball EB is detected by the detection unit 16. Thus, as in Embodiment 1, the line-of-sight direction detection unit 16 can detect the line-of-sight direction LS of the observer's eyeball EB.
  • Meanwhile, the virtual image stored in the nonvolatile memory 64 of the information processing unit 60 is processed by the control unit 62 and then transmitted to the LCD panel 12 to form an image.
  • the white light from the light source 22 passes through the LCD panel 12, and the light beam including the virtual image passes through the dichroic mirror 15, is reflected by the semitransparent mirror 14, and is projected onto the observer's eyeball EB.
  • the observer can observe the virtual image while directly viewing the real image.
  • The light amount of the real image (the actual brightness of the image observed in the line-of-sight direction LS) is detected by the imaging unit 18, and the light amount data is transmitted to the control unit 62, so the brightness of the virtual image can be adjusted according to the brightness of the real image.
  • the virtual image can be superimposed on the real image more naturally and displayed to the observer.
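  • The brightness adjustment described here can be sketched as applying a gain to the virtual image so that its mean level tracks the measured brightness of the real image. The `match_brightness` function and its mean-based gain rule are illustrative assumptions, not the patent's method.

```python
def match_brightness(virtual, real_mean, target_ratio=1.0):
    """Scale virtual-image pixel values so their mean tracks the
    measured brightness of the real image.

    `virtual` is a 2D list of grayscale values in [0, 255];
    `real_mean` is the mean brightness reported by the imaging
    unit. Names and units are assumptions for illustration.
    """
    v_mean = sum(sum(row) for row in virtual) / (
        len(virtual) * len(virtual[0]))
    gain = target_ratio * real_mean / v_mean if v_mean else 1.0
    return [[min(255, round(p * gain)) for p in row] for row in virtual]

virtual = [[200, 100], [100, 200]]   # mean brightness 150
print(match_brightness(virtual, real_mean=75))  # [[100, 50], [50, 100]]
```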
  • The image display device 3 of Embodiment 3 has the same configuration as the image display devices 1 and 2 described above, except that it has a distance sensor that measures the distance from the eyeball EB to the gazing point; detailed description of the common components is therefore omitted. Components similar to those in Embodiment 1 are denoted by the same reference numerals.
  • A gazing point is generally defined as the point where the object observed by the observer exists; when an object is observed with both eyes, the gazing point GP is also the intersection of the left and right lines of sight LS_L and LS_R. Let L_C be the length of the line segment connecting the centers of the left and right eyeballs EB_L and EB_R, and let θ1 and θ2 be the angles that the line-of-sight directions LS_L and LS_R of the respective eyeballs make with this line segment; θ1 and θ2 can be calculated from the detected line-of-sight directions. The distance L from the eyeballs to the gazing point GP can then be expressed as a function of the distance L_C and the angles θ1 and θ2.
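  • Under the usual triangulation reading of this passage, the distance to the gazing point follows from the law of sines in the triangle formed by the two eyeballs and GP. The sketch below is an illustration under that assumption; the function name, the degree units, and the example values are all introduced here, not taken from the patent.

```python
import math

def gaze_distance(lc, theta1_deg, theta2_deg):
    """Perpendicular distance from the eye baseline to the gazing
    point GP.

    `lc` is the distance between the eyeball centres; `theta1_deg`
    and `theta2_deg` are the angles each line of sight makes with
    the baseline. From the triangle formed by the two eyes and GP:
        d = lc * sin(t1) * sin(t2) / sin(t1 + t2)
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    return lc * math.sin(t1) * math.sin(t2) / math.sin(t1 + t2)

# Symmetric convergence: 65 mm between the eyes, 85 degrees each side.
print(round(gaze_distance(0.065, 85, 85), 3))  # 0.371 (metres)
```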
  • The image display device 3 is configured to reproduce the natural perspective obtained when observing with the naked eye, as described above. That is, in FIG. 14, the observer is gazing at the chair (real image) RI at the gazing point GP, and the distance L from the eyeball EB to the gazing point is calculated as described above. At this time, when forming a composite image by superimposing a virtual image VI (the image of a pigeon in FIG. 14), the control unit 62 processes the real image data obtained from the imaging unit 18 and the virtual image data stored in the virtual image storage unit 64 so that the real image RI is displayed clearly while the virtual image VI is displayed in a blurred manner, thereby forming the composite image.
  • Attribute information X, representing the distance (virtual distance) to the position at which the virtual image VI should appear, is assigned to the virtual image VI.
  • The virtual image VI is stored, together with this attribute information, in the virtual image storage unit 64.
  • The observer may also input the virtual distance X as attribute information using the input unit 66 of the information processing unit 60; that is, each virtual image can be given an arbitrary virtual distance (attribute information) X.
  • In this way, the information processing unit 60 of Embodiment 3 measures the distance to the real image RI (the real distance L) and compares it with the distance to the position at which the virtual image VI should appear (the virtual distance X). If the real distance L and the virtual distance X differ, that is, if the virtual image lies farther from or nearer to the observer than the real image, the real image RI is displayed clearly and the virtual image VI is displayed blurred.
  • In other words, the information processing unit 60 compares the virtual distance X with the real distance L, blurs any virtual image VI that is not located near the real image (gazing point GP) in the line-of-sight direction LS, and superimposes it on the real image RI to form a composite image. The composite image obtained in this way gives the observer a more realistic sense of perspective.
  • Alternatively, the information processing unit 60 may blur only the virtual image VI when the virtual distance X lies outside a predetermined distance d of the real distance L (that is, when X < L - d or X > L + d), superimpose it on the real image RI, and display the result on the LCD panel 12.
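The alternative rule in this paragraph is a simple band test around the measured real distance. A minimal sketch with illustrative names; only the comparison against L - d and L + d comes from the text:

```python
def should_blur(real_l, virtual_x, d):
    """True when the virtual distance X lies outside the band
    [L - d, L + d] around the real distance L, i.e. when the
    virtual image should be rendered blurred."""
    return virtual_x < real_l - d or virtual_x > real_l + d

# A virtual image meant to appear at 2.0 m while the observer fixates
# an object at 1.2 m, with a 0.3 m tolerance band: blur it.
blur = should_blur(1.2, 2.0, 0.3)   # True
```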
  • The distance from the gazing point GP to the eyeball EB may also be measured directly.
  • In that case the perspective of the virtual image VI with respect to the real image RI can still be expressed; that is, the observer can enjoy a realistic image with only one eye, using a display unit 10 of simpler configuration.
  • As described above, the real distance L is compared with the virtual distance X, and a virtual image VI that is not located near the real image (gazing point GP) in the line-of-sight direction LS is blurred before being projected onto the eyeball EB, so that a realistic virtual image VI can be provided to the observer. [0044] Embodiment 4.
  • Embodiment 4 of the image display device according to the present invention will be described below with reference to FIGS.
  • Whereas the light source of the display unit 10 of the image display devices described so far is a surface light source,
  • the image display device 4 of Embodiment 4 includes a point light source in the display unit 10. Since it otherwise has generally the same configuration as the image display devices described so far, detailed description of the overlapping components is omitted, and the same components as in the above embodiments are denoted by the same reference numerals.
  • The light source 22 of the display unit 10 of Embodiment 4, shown in FIG. 15, is configured as a point light source (a light source that emits a directional light beam), and
  • its size w in the direction orthogonal to the line-of-sight direction LS (its width across the line of sight) is set on the order of the pupil diameter a of the eyeball EB. More specifically, the size w of the light source 22 is set to 4 times or less, preferably 2 times or less, of the pupil diameter a; for a typical pupil diameter a of 3 mm,
  • the size w of the light source 22 is set to 12 mm or less, preferably 6 mm or less.
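The sizing condition stated here reduces to a one-line bound. The helper below is an illustrative restatement (names and the helper itself are assumptions, not apparatus from the patent):

```python
def max_source_width(pupil_diameter, factor=4.0):
    """Upper bound on the source width w for the Maxwellian-view
    arrangement: w <= factor * a, with factor 4 as the stated limit
    and factor 2 as the preferred one."""
    return factor * pupil_diameter

a = 3e-3                                 # typical pupil diameter: 3 mm
w_limit = max_source_width(a)            # stated limit: 12 mm
w_preferred = max_source_width(a, 2.0)   # preferred limit: 6 mm
```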
  • The light flux from the light source 22 is diffused uniformly from a limited area (point light source) as shown in FIG. 16,
  • passes through the LCD panel 12, is condensed by the condenser lens 34 so as to focus on the pupil P of the eyeball EB, and forms an image on the retina.
  • As a result, the observer can always see a clear image without depending on the lens effect of the crystalline lens CL of the eyeball EB (Maxwellian view).
  • In other words, the image displayed on the LCD panel 12 can be projected directly (with good fidelity) onto the observer's eyeball EB.
  • By contrast, with an ordinary light source, the observer can perceive a clear image only when the lens CL exerts the appropriate lens effect (lens power).
  • When the optical system realizing the Maxwellian view described in Embodiment 4 is applied to the image display device 1 of Embodiment 1, the composite image in which the virtual image VI is superimposed on the real image RI is displayed on the LCD panel 12, and the observer can see the composite image clearly regardless of the real distance L and the virtual distance X.
  • Likewise, when an optical system realizing the Maxwellian view is used in the image display device 2 of Embodiment 2, the virtual image displayed on the LCD panel 12 can always be recognized clearly without depending on the lens effect of the lens CL of the eyeball EB, regardless of whether the observed object is far or near (that is, regardless of the real distance L and the depth of focus obtained when viewing a real image with the naked eye).
  • In this case, the observer may set an arbitrary value for the virtual distance X as attribute information of the virtual image VI using the input unit 66, and
  • when the real distance L and the virtual distance X differ, the control unit 62 may display the real image RI clearly and the virtual image VI blurred on the LCD panel 12.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An image display device that allows a user to see a realistic-looking composite image without the unnatural sensation caused by a virtual image being superimposed on a real image. The image display device comprises: a line-of-sight detection unit for detecting the line-of-sight direction of an observer's eye; an imaging unit for capturing a real image located in the line-of-sight direction of the observer's eye; a virtual image storage unit for storing a virtual image; a control unit for forming a composite image by superimposing the virtual image stored in the virtual image storage unit on the real image captured by the imaging unit; and a display unit for displaying the composite image to the observer.
PCT/JP2006/308448 2005-04-26 2006-04-21 Image display device WO2006118057A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005128203A JP2006308674A (ja) 2005-04-26 2005-04-26 画像表示装置
JP2005-128203 2005-04-26

Publications (1)

Publication Number Publication Date
WO2006118057A1 true WO2006118057A1 (fr) 2006-11-09

Family

ID=37307857

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/308448 WO2006118057A1 (fr) 2005-04-26 2006-04-21 Dispositif d’affichage d’image

Country Status (2)

Country Link
JP (1) JP2006308674A (fr)
WO (1) WO2006118057A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010085786A (ja) * 2008-09-30 2010-04-15 Brother Ind Ltd Head-mounted display device
JP5295714B2 (ja) * 2008-10-27 2013-09-18 Sony Computer Entertainment Inc. Display device, image processing method, and computer program
JP5420464B2 (ja) * 2010-04-02 2014-02-19 Olympus Corp Display device, electronic apparatus, portable electronic apparatus, mobile phone, and imaging device
JP2015213226A (ja) * 2014-05-02 2015-11-26 Konica Minolta Inc Wearable display and display control program therefor
KR101648021B1 (ko) * 2014-11-28 2016-08-23 Hyundai Motor Co Vehicle having gaze recognition function, control method thereof, and gaze recognition system
WO2016113951A1 (fr) * 2015-01-15 2016-07-21 Sony Interactive Entertainment Inc. Head-mounted display and video display system
JP7207954B2 (ja) * 2018-11-05 2023-01-18 Kyocera Corp Three-dimensional display device, head-up display system, movable body, and program
CN109856796A (zh) * 2018-11-20 2019-06-07 Chengdu Idealsee Technology Co., Ltd. Image source module, waveguide, near-eye display system, and control method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11196351A (ja) * 1997-12-27 1999-07-21 Mr System Kenkyusho:Kk Display device
JP2000059666A (ja) * 1998-08-07 2000-02-25 Victor Co Of Japan Ltd Imaging device
JP2002090688A (ja) * 2000-09-12 2002-03-27 Masahiko Inami Line-of-sight-direction-dependent retinal display device
JP2002271691A (ja) * 2001-03-13 2002-09-20 Canon Inc Image processing method, image processing apparatus, storage medium, and program
JP2004191962A (ja) * 2002-11-29 2004-07-08 Brother Ind Ltd Image display device
JP2006039359A (ja) * 2004-07-29 2006-02-09 Shimadzu Corp Head-mounted display device

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009122550A (ja) * 2007-11-16 2009-06-04 Panasonic Electric Works Co Ltd Retinal projection display device
JP2011232669A (ja) * 2010-04-30 2011-11-17 Casio Comput Co Ltd Display device
EP2590002A1 (fr) * 2011-11-04 2013-05-08 Honeywell International Inc. Steerable near-to-eye display and steerable near-to-eye display system
US8681426B2 (en) 2011-11-04 2014-03-25 Honeywell International Inc. Steerable near-to-eye display and steerable near-to-eye display system
JP2015005972A (ja) * 2013-05-22 2015-01-08 Telepathy Holdings Co Ltd Wearable device having privacy protection function for captured images, control method therefor, and image sharing system
CN110199324B (zh) * 2017-01-31 2023-12-29 Wacom Co Ltd Display device and control method thereof
CN110199324A (zh) 2017-01-31 2019-09-03 Wacom Co Ltd Display device and control method thereof
JP2018173661A (ja) * 2018-07-23 2018-11-08 Asahi Kasei Corp Optical device having spectacle lens, spectacles using the same, and spectacle-type display device
CN109991746A (zh) * 2019-03-08 2019-07-09 Chengdu Idealsee Technology Co., Ltd. Image source module and near-eye display system
US11854444B2 (en) 2019-07-26 2023-12-26 Sony Group Corporation Display device and display method
WO2021020069A1 (fr) * 2019-07-26 2021-02-04 Sony Corp Display device, display method, and program
WO2021181797A1 (fr) * 2020-03-11 2021-09-16 University of Fukui Retinal scanning display device and image display system
JP2021144124A (ja) * 2020-03-11 2021-09-24 University of Fukui Retinal scanning display device and image display system
JPWO2022054740A1 (fr) * 2020-09-09 2022-03-17
JP7123452B2 (ja) 2020-09-09 2022-08-23 QD Laser Inc Image projection device

Also Published As

Publication number Publication date
JP2006308674A (ja) 2006-11-09

Similar Documents

Publication Publication Date Title
WO2006118057A1 (fr) Image display device
US10382748B2 (en) Combining video-based and optic-based augmented reality in a near eye display
EP3330771B1 (fr) Display apparatus and method of displaying using focus and context displays
US9711114B1 (en) Display apparatus and method of displaying using projectors
JP5167545B2 (ja) Viewpoint detection device
JP5858433B2 (ja) Gazing point detection method and gazing point detection device
JP2020506745A (ja) System and method for projection of an augmented reality ophthalmic surgical microscope
JPH0759032 (ja) Image display apparatus
JPH08313843 (ja) Wide-field high-resolution video presentation device using line-of-sight tracking
WO2005063114A1 (fr) Sight line detection device and method, and three-dimensional viewpoint measurement device
TW201843494A (zh) Display system with video see-through
JP4500992B2 (ja) Three-dimensional viewpoint measurement device
CN114503011A (zh) Compact retinal scanning device for tracking movement of the eye pupil and application thereof
JP5484453B2 (ja) Optical apparatus having multiple operation modes
JP2023539962A (ja) System and method for superimposing a virtual image on a real-time image
EP3548956B1 (fr) Imaging system and method of producing context and focus images
KR101941880B1 (ko) Free-focus display device
JP2006011145A (ja) Binocular microscope apparatus
JP3976860B2 (ja) Stereoscopic image capturing device
JP2006053321A (ja) Projection observation device
JP2507913B2 (ja) Eye-movement-following visual presentation device using projection
JP2008227834A (ja) Head-mounted video presentation device, head-mounted imaging device, and head-mounted video device
JP2011158644A (ja) Display device
JP5962062B2 (ja) Automatic focusing method and apparatus
JPH0638142 (ja) Gaze-tracking head-mounted display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06745579

Country of ref document: EP

Kind code of ref document: A1