US20010030715A1 - Stereo image display apparatus - Google Patents
- Publication number
- US20010030715A1 (application US08/865,187)
- Authority
- US
- United States
- Prior art keywords
- image
- right eye
- left eye
- display
- image display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/34—Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- the present invention relates to stereo image display apparatuses capable of displaying a stereo image to the viewer with left eye and right eye images having a binocular parallax and, more particularly, to improvements in such a stereo image display apparatus for alleviating the departure from the natural sense of viewing and the fatigue of the viewer viewing the stereo image.
- FIG. 18 is a perspective view showing a head-mounted display (HMD) 700 as an example of such stereo image display apparatus.
- the illustrated HMD 700 is a binocular stereo display.
- the HMD 700 has a frame 702 , which is mounted on the viewer's head and supports a left and a right display element and also a left and a right enlarging optical system 701 in front of the viewer's left and right eyes.
- a left eye image is displayed to the left eye, while a right eye image is displayed to the right eye, whereby the viewer can view the image as a stereo image.
- the frame 702 has a sensor support 703 supporting a head motion sensor 704 , which is located on the head and detects motion of the head.
- a data processor 720 is connected via a cable 722 to a connector 706 , which is supported on a connector support 705 provided on the frame 702 .
- a loudspeaker 709 for outputting sound is provided for each ear.
- the data processor 720 has operating buttons 720 a which are operable by the user for various operations. With the usual stereo image display apparatus such as the HMD, the viewing distance and the verging distance fail to coincide with each other, as will be described later in detail, thus resulting in a departure from the natural sense of viewing.
- FIGS. 19 ( a ) to 19 ( c ) are views for describing how a left and a right eye images are viewed as stereo image in the stereo image display apparatus.
- These figures show an example of stereo image viewed by the left and right eyes.
- the image includes two objects, i.e., a sphere and a triangle pyramid, the sphere becoming closer to the viewer.
- the left eye and right eye images are changed from those shown in FIG. 19( a ) to those shown in FIG. 19( b ) and then to those shown in FIG. 19( c ).
- the sphere is moved toward the center while being gradually increased in size. This means that the binocular parallax is being gradually increased.
- FIG. 20 shows the way in which the images shown in FIGS. 19 ( a ) to 19 ( c ) are viewed with the two eyes.
- Increasing binocular parallax leads to verging for merging (i.e., reaching, or moving toward, the viewer's state of perceiving a single image on the basis of a plurality of images), so that the viewer's eyeballs are turned inward.
- This rotation of the eyes is called vergence, and the angle of the rotation, as defined in the figure, is called the vergence angle.
- the distance from each eye to the point at which the optical axes of the verging eyeballs intersect is called the parallax distance.
- the parallax distance is equal to the distance between the point of intersection of the main beams of the left and right images and the main plane of the eyepiece optical system.
- the vergence of the eyeballs immediately causes accommodation. With increasing vergence angle, the accommodation tends to be closer. Conversely, with reducing the vergence angle, the accommodation tends to be further apart.
- in the stereo image display, the plane in which the image can be viewed with the best contrast is fixed.
- the distance from this plane to each eyeball is the viewing distance.
- an inconsistency between the viewing distance and the verging distance has heretofore taken place.
- the above phenomenon occurs not only in the HMD but also in various stereo television sets, such as those of shutter switching system, lenticular system, etc.
- the viewing distance of stereo television is the distance from the display surface of the display such as a CRT, to each eyeball of the viewer.
- Japanese Patent Publication Heisei 6-85590 proposes an HMD, in which the viewing distance is varied according to the image motion or the like through mechanical driving of the eyepiece lens.
- Japanese Laid-Open Patent Publication Heisei 3-292093 discloses a method of varying the viewing distance by detecting a point viewed by the viewer and moving the lenses according to depth information at the viewed point. These systems permit the viewing distance and the vergence angle to be made coincident with each other.
- Japanese Laid-Open Patent Publication Heisei 7-167633 shows a method of controlling the optimum viewing point, which permits the viewer to perceive the depth of an object within a desired range, by calculating the point from the binocular parallax of the images such that the point is reproduced on the surface of a stereo image display unit or at a specified distance from that surface.
- a parallax map is calculated from the left and right images by using a correlation matching method, and then the mean value of the parallax of the entire image, or a weighted mean parallax of a central part of the image, is calculated.
- a parallax controller controls the horizontal read timing of the left and right images to cause a parallel movement of the images in the horizontal direction. This method does not require any mechanical drive system, and it is thus possible to prevent an increase in size.
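The prior-art scheme just described (a parallax map obtained by correlation matching, a mean parallax, then a horizontal shift of the images) can be sketched in outline as follows. This is an illustrative reconstruction, not the publication's actual implementation; the function names, block size, and disparity search range are assumptions:

```python
import numpy as np

def mean_parallax(left, right, block=8, max_disp=16):
    """Estimate the mean horizontal parallax between a left and a right
    grayscale image by block matching (sum-of-absolute-differences
    correlation), a stand-in for the correlation matching described."""
    h, w = left.shape
    disps = []
    for y in range(0, h - block, block):
        for x in range(max_disp, w - block - max_disp, block):
            patch = left[y:y + block, x:x + block].astype(int)
            best_sad, best_d = None, 0
            for d in range(-max_disp, max_disp + 1):
                cand = right[y:y + block, x + d:x + d + block].astype(int)
                sad = np.abs(patch - cand).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disps.append(best_d)
    return float(np.mean(disps)) if disps else 0.0

def shift_horizontally(img, shift):
    """Translate an image horizontally by `shift` pixels, padding with
    black -- mimicking a change of the horizontal read timing."""
    out = np.zeros_like(img)
    if shift > 0:
        out[:, shift:] = img[:, :-shift]
    elif shift < 0:
        out[:, :shift] = img[:, -shift:]
    else:
        out[:] = img
    return out
```

Shifting one image by minus the estimated mean parallax moves the whole scene in parallel horizontally, which is what the parallax controller achieves through read-timing control.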
- FIGS. 21 ( a ) to 21 ( c ) are views showing left eye and right eye images displayed in a stereo image display apparatus, which was proposed earlier by the inventor (Japanese Patent Application Heisei 8-28856).
- two objects, i.e., a sphere and a triangular pyramid, are displayed, the sphere becoming closer to the viewer.
- the left eye and right eye images are changed from those shown in FIG. 21( a ) to those shown in FIG. 21( b ) and then to those shown in FIG. 21( c ).
- the parallax of the left eye and right eye images is substantially fixed irrespective of the motion of the sphere toward and away from the viewer.
- FIG. 22 shows the way of viewing of the images shown in FIG. 21 displayed on an HMD with the two eyes.
- the verging distance L with respect to the sphere is unchanged although the image of the sphere is increased as the sphere becomes closer.
- the triangular pyramid, on the other hand, moves away from the viewer although its size is unchanged. In other words, the distance difference between the triangular pyramid and the sphere is increased as in the prior art case. Nevertheless, the verging distance L with respect to the sphere is substantially fixed.
- the viewer perceives the sphere as becoming closer to him or her while the triangular pyramid does not change its position. It is thus possible to provide images with a stereo sense while holding a substantially constant verging distance with respect to the sphere.
- the verging distance L of the sphere in FIG. 22 is made coincident with the viewing distance. More preferably, an eye detector judges whether the viewer is viewing the sphere or the triangular pyramid, and the verging distance of the image being viewed is made substantially constant.
- FIG. 23 is a view for explaining the status of merging of stereo image, which is actually displayed on a left and a right display surface.
- the relation between the binocular parallax and the verging distance L when viewing a stereo image is now considered.
- the horizontal positions X 1 and X 2 of the sphere on the left and right display surfaces when the sphere is viewed to be at a verging distance L and at a horizontal position − H are respectively derived as equations (1) and (2).
- d is the distance from the mid point between a left and a right lens to each lens (the distance being positive for the right eye and negative for the left eye), and ⁇ is the half field angle.
- X 1 and X 2 are prescribed as follows.
- FIG. 24 is a view showing how the horizontal positions X 1 and X 2 in FIG. 23 are normalized. As shown in FIG. 24, the normalization is made by setting the horizontal center value of the display region to “0” and the horizontal length of the display region to “2”. Equation (1) can be derived from the fact that the triangle with points A to C in FIG. 23 as the apices and the triangle with the origin O and points X 1 and C on the left display surface as the apices are similar to each other. Likewise, equation (2) can be derived from the similarity of the triangle with points D, B and E as the apices and the triangle with the origin O and points X 2 and E on the right display surface to each other.
- Equations (1) and (2) can be rearranged into equation (3).
- Equation (3) shows that the verging distance L at which merging is attained is determined, independently of the horizontal position H, once the parallax is determined.
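Since equations (1) to (3) themselves are not reproduced in this text, the independence of L from H can be illustrated with a small geometric sketch consistent with the description: eyes at lateral offsets ±d, screen coordinates normalized so that the half-width corresponds to the half field angle θ. The symbol conventions and numeric values here are assumptions:

```python
import math

def screen_position(H, L, eye_x, theta):
    """Normalized horizontal screen coordinate (center 0, half-width 1)
    of a point at horizontal position H and distance L, as seen from an
    eye at lateral offset eye_x with half field angle theta."""
    return (H - eye_x) / (L * math.tan(theta))

def verging_distance(x1, x2, d, theta):
    """Recover the verging distance L from the normalized parallax
    x1 - x2 (an equation (3)-like form under the assumed conventions)."""
    return 2 * d / ((x1 - x2) * math.tan(theta))

d, theta = 0.032, math.radians(25)     # illustrative values
L, H = 2.0, 0.4
x1 = screen_position(H, L, -d, theta)  # left eye at -d
x2 = screen_position(H, L, +d, theta)  # right eye at +d
# x1 - x2 = 2d / (L tan(theta)): the parallax fixes L regardless of H
```

Because H cancels in x1 − x2, the recovered verging distance depends only on the parallax, matching the statement above.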
- FIG. 25 is a graph showing the correspondence relation between accommodation (i.e., state of focus of the eyes) and vergence.
- the figure shows the permissible range of the vergence, accommodation and parallax (“O Plus E”, Seiri Kogaku, Dec. 15, 1985, p. 103).
- the ordinate is taken for the accommodation (parallax) (D: diopter), and the abscissa is taken for the vergence (vergence angle MW). It will be seen from the graph that vergence is attainable in a short period of time so long as its changes are within 4 diopters.
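Accommodation in diopters is simply the reciprocal of the distance in meters, so the mismatch between viewing distance and verging distance can be expressed in diopters as in the graph. A minimal sketch (the 0.5 D default tolerance is an assumed illustrative value, not a figure from this publication):

```python
def diopters(distance_m):
    """Accommodation (or vergence) demand in diopters: the reciprocal
    of the distance in meters."""
    return 1.0 / distance_m

def within_tolerance(viewing_m, verging_m, max_mismatch_d=0.5):
    """True if the accommodation/vergence mismatch, in diopters, stays
    inside an assumed comfort tolerance (0.5 D is illustrative only)."""
    return abs(diopters(viewing_m) - diopters(verging_m)) <= max_mismatch_d
```

For example, viewing at 0.5 m while verging at 0.6 m is a mismatch of about 0.33 D, while viewing at 0.3 m and verging at 2.0 m is a mismatch of about 2.8 D.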
- FIG. 26 is a schematic view for describing the influence given to a stereo image viewer by the display area frames (i.e., display area edges) in a display apparatus having a right eye and a left eye image display area.
- a right eye and a left eye LCD 11 R and 11 L, with a right eye and a left eye image display area 11 Rd and 11 Ld, respectively, are provided for the right and left eyes 10 R and 10 L. Images on the display areas of the LCDs 11 R and 11 L are perceived by the viewer with the right and left eyes 10 R and 10 L as images formed by a right eye and a left eye eyepiece optical system 12 R and 12 L.
- On the right eye image display area 11 Rd of the right eye LCD 11 R, a right side and a left side image edge (i.e., boundaries between display and non-display areas) are formed as a right and a left edge 11 R rr and 11 R rl , respectively.
- In the case of FIG. 26, like the case described before in connection with FIG. 22, an image is assumed which contains two objects, i.e., a sphere and a triangular pyramid, the sphere becoming closer to the viewer.
- the verging distance L with respect to the sphere is unchanged although the image thereof is increased as the sphere is becoming closer.
- the triangular pyramid, on the other hand, moves away from the viewer although its size is unchanged. That is, the distance difference between the triangular pyramid and the sphere is being increased while the verging distance L with respect to the sphere is substantially fixed.
- the positions of the edge images 11 ir and 11 il , which are merged or verged in a binocular visual field formed by the right and left edges 11 R rr and 11 R rl of the right eye LCD 11 R and the right and left edges 11 L rr and 11 L rl of the left eye LCD 11 L (i.e., the distance between the image of the sphere and the image of each edge), are fixed as shown.
- the stereo image display system shown in FIG. 26 utilizes the fact, illustrated in FIG. 22, that the human eyes are not very sensitive in detecting the absolute distance of an object, although they are sensitive to relative distance changes.
- the system thus permits providing an image with a stereo sense as though the viewer sees the sphere becoming closer to him or her while the position of the triangular pyramid is unchanged, while holding the verging distance L with respect to the sphere substantially constant.
- the distance between the image of the sphere and the image of the edge is fixed.
- FIG. 27 is a schematic view showing a case in which the distances of the right and left edges 11 R rr and 11 R rl of the right eye image display area 11 R d of the right eye LCD 11 R and the right and left edges 11 L rr and 11 L rl of the left eye image display area 11 L d of the left eye LCD 11 L from one another are variable. Increasing the edge-to-edge distance between the two eyes (i.e., between the edges 11 R rl and 11 L rr ) as shown in FIG. 27 gives rise to what is commonly called field struggle when the images produced on the right and left LCDs 11 R and 11 L are to be verged to produce a stereo image.
- the parallax concerning a left and a right image is detected from the correlation between the full frames of the left and right images.
- imposing a restriction on the images for obtaining the correlation therebetween may result in an erroneous judgment.
- the present invention has an object of providing a stereo image display apparatus of the pertaining type, which gives due consideration to the influence exerted on the viewer viewing a stereo image by the edges of the display areas of the display means, i.e., the boundaries between the display and non-display areas of the display means, and which can display stereo images that do not spoil the sense as though the viewer is actually on the site of the image.
- Another object of the present invention is to provide a stereo image display apparatus of the pertaining type, which permits adequate detection of the correlation between a left and a right image in connection with the detection of a parallax concerning these images.
- a stereo image display apparatus comprising: binocular parallax control means for executing a control operation to vary a right eye and a left eye images with a binocular parallax therebetween such that the binocular parallax is substantially fixed in effect; display means capable of displaying the left eye and right eye images on respective predetermined display areas; and shading-off means for shading off in effect edge portions of the display areas of the left eye and right eye images.
- edge portions of the display areas i.e., boundary portions between display and non-display areas
- the display is unnatural as a stereo image display, spoiling the viewer's sense just like the viewer is actually on the site of the image scene.
- edge portions of the display areas are shaded off in effect and made difficult to be clearly recognized.
- the display thus has a natural sense as stereo image display, providing enhanced sense of the viewer just like the viewer is actually on the site of the image scene.
- the shading-off means in the first aspect includes luminance restricting means for restricting the luminance of edge portions of the left eye and right eye display areas such that the luminance is reduced as one goes toward the edges of the display areas.
- the luminance of the display areas is reduced toward the edges thereof.
- the edge portions of the display areas are thus shaded off and made difficult to be verged, making the sense about the distance of the display area edge portions unclear.
- the viewer's sense just like the viewer is actually on the site of the image scene is thus made difficult to be interfered with.
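The luminance restriction of this aspect can be sketched as a mask whose value falls off linearly toward the display area edges. A minimal illustration, assuming a grayscale frame; the margin width and the linear ramp are assumptions, since the patent does not prescribe a specific falloff profile:

```python
import numpy as np

def shade_edges(img, margin=16):
    """Reduce the luminance of a grayscale image linearly toward the
    edges of the display area, leaving the interior untouched."""
    h, w = img.shape
    # distance (in pixels) of each column/row from the nearest edge,
    # clipped so the ramp reaches full luminance after `margin` pixels
    ramp_x = np.minimum(np.minimum(np.arange(w), np.arange(w)[::-1]) / margin, 1.0)
    ramp_y = np.minimum(np.minimum(np.arange(h), np.arange(h)[::-1]) / margin, 1.0)
    mask = np.outer(ramp_y, ramp_x)
    return (img * mask).astype(img.dtype)
```

The darkened border makes the display edges harder to verge on, which is the shading-off effect described.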
- the shading-off means in the first aspect includes resolution restricting optical means for restricting the resolution of edge portions of the left eye and right eye image display areas such that the resolution becomes coarser as one goes toward the edges of the display areas.
- the edge portions of the display areas are made difficult to be resolved.
- the edge portions are thus shaded off and made difficult to be verged, making the distance sense of the display area edge portions unclear.
- the viewer's sense just like the viewer is actually on the site of the image scene is thus made difficult to be interfered with.
- a stereo image display apparatus comprising: display means capable of displaying a left eye and a right eye image with a binocular parallax therebetween on respective predetermined display areas; horizontal display position control means for controlling the horizontal display positions of the left eye and right eye images on the left eye and right eye image display areas in opposite directions, respectively; monochrome display area generating means for making the left and right edges and neighborhoods thereof of the left eye and right eye image display areas to be predetermined monochrome display areas; and monochrome display area width control means for controlling the width of the monochrome display areas, constituted by the left and right edges and neighborhoods thereof of the left eye and right eye image display areas, such as to be increased on the right edge side of the display areas and reduced on the left edge side thereof when the position of the images on the display areas is shifted to the left, and increased on the left edge side of the display areas and reduced on the right edge side thereof when the position of the images is shifted to the right.
- since the widths of the left and right edge portions of the left eye and right eye image display areas can be varied by the monochrome display areas, the verging distance of the edges of (i.e., the boundaries between) the monochrome display areas and the rest of the image display areas is varied in effect. It is thus possible to vary the relative distances of the monochrome display areas and the rest of the image display areas, thus making the display natural as a stereo image display and enhancing the viewer's sense just like the viewer is actually on the site of the image scene.
- the monochrome display area generating means in the fourth aspect regulates the width of the monochrome display area constituted by the left edge and neighborhood thereof of the left eye image display area to be greater than the width of the monochrome display area constituted by the right edge and neighborhood thereof of the same display area, and also regulates the width of the monochrome display area constituted by the right edge and neighborhood thereof of the right eye image display area to be greater than the left edge and neighborhood thereof of the same display area.
- the width of the monochrome display area which is constituted by the left edge and neighborhood thereof of the left eye image display area, is made to be greater than the width of the monochrome display area constituted by the right edge and neighborhood thereof of the same display area, and the width of the monochrome display area constituted by the right edge and neighborhood thereof of the right eye image display area is made to be greater than the monochrome display area constituted by the left edge and neighborhood thereof of the same display area.
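The width regulation described above can be sketched as follows: a horizontal shift of the image is compensated by widening the masking band on one side and narrowing it on the other. The function names, base width, and sign convention are illustrative assumptions only:

```python
import numpy as np

def mask_widths(shift, base=8):
    """Return (left_width, right_width) of the black masking bands for
    one display area, given a horizontal image shift in pixels
    (positive = image shifted right). Shifting the image left widens
    the right-hand band and narrows the left-hand one, and vice versa."""
    return max(base + shift, 0), max(base - shift, 0)

def apply_masks(img, shift, base=8):
    """Black out the left and right bands of a grayscale image."""
    out = img.copy()
    lw, rw = mask_widths(shift, base)
    if lw:
        out[:, :lw] = 0
    if rw:
        out[:, -rw:] = 0
    return out
```

Per the fifth aspect, the left eye display would use a larger base width on its left band and the right eye display a larger base width on its right band; a single `base` is used here for brevity.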
- the monochrome display area generating means generates black display areas as monochrome display areas.
- the outside of the image display areas is usually dark.
- by generating black display areas as the monochrome display areas, the edge portions of the image display areas can be made less recognizable as such, thus making it difficult to interfere with the viewer's sense just like the viewer is actually on the site of the image scene.
- a stereo image display apparatus comprising: display means capable of displaying a left eye and a right eye image with a binocular parallax therebetween on respective predetermined display areas; spatial frequency detecting means for detecting a spatial frequency concerning the left eye or right eye image; correlation calculation area specifying means for specifying a correlation calculation area according to the spatial frequency detected by the spatial frequency detecting means such that the specified correlation calculation area is the smaller the relatively higher the detected spatial frequency, and the greater the relatively lower the detected spatial frequency; correlation calculating means for calculating a correlation of the left eye and right eye images to each other with respect to the correlation calculation area specified by the correlation calculation area specifying means; and binocular parallax control means for controlling the binocular parallax in effect according to the result of the correlation calculation in the correlation calculating means.
- since the area of the window as the subject of the correlation calculation is specified according to the spatial frequencies of the images, optimum correlation calculation can be made adaptively depending on whether the images are fine or coarse. It is thus possible to improve both the efficiency and the accuracy of the correlation detection.
- the stereo image display apparatus further comprises viewed point detecting means for detecting a point viewed by the viewer in the display areas of the display means, the spatial frequency detecting means being operable to detect a spatial frequency of images with respect to the viewed point detected by the viewed point detecting means and the neighborhood thereof.
- the spatial frequency of images can be detected at the point viewed by the viewer and in the proximity of that point, so that it is possible to provide an inexpensive apparatus.
- the correlation calculation area specifying means in the eighth aspect specifies the horizontal size of the correlation calculation area to be the smaller the relatively higher a horizontal spatial frequency detected by the spatial frequency detecting means and the greater the relatively lower the horizontal spatial frequency, and/or specifies the vertical size of the correlation calculation area to be the smaller the relatively higher a vertical spatial frequency detected by the spatial frequency detecting means and the greater the relatively lower the vertical spatial frequency.
- the horizontal and vertical sizes of the specific area (i.e., window) as the subject of the correlation detection are selected according to both the horizontal and vertical spatial frequencies.
- an adequate window shape thus can be selected according to the two-dimensional fineness (or coarseness) of images, thus permitting adaptively optimum correlation calculation. It is thus possible to improve both the efficiency and the accuracy of the correlation detection.
- the stereo image display apparatus further comprises edge extracting means for extracting edge portions of at least either of the left eye and right eye images with a binocular parallax therebetween, the spatial frequency detecting means being operable to detect a spatial frequency concerning an image displayed with coupling of the edge portions detected by the edge extracting means.
- the spatial frequency detection is made by adopting the simple method of edge extraction and edge counting.
- a simple and inexpensive apparatus is thus obtainable compared to the case of the Fourier transform method or the like.
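The edge-extraction-and-counting approach to spatial frequency detection, and the resulting window size selection, can be sketched as follows. The threshold, window sizes, and edge-density cut-off are illustrative assumptions, not values from the patent:

```python
import numpy as np

def edge_count(region, thresh=30):
    """Count horizontal-gradient edges in a grayscale region -- a cheap
    stand-in for a spatial frequency measure, as in the edge-counting
    method described (cheaper than a Fourier transform)."""
    grad = np.abs(np.diff(region.astype(int), axis=1))
    return int((grad > thresh).sum())

def window_size(region, small=8, large=32, dense=0.05):
    """Choose the correlation window size: smaller for fine
    (edge-dense) regions, larger for coarse ones."""
    density = edge_count(region) / region.size
    return small if density >= dense else large
```

A fine, high-contrast region thus gets a small matching window, while a flat region gets a large one, trading accuracy against robustness exactly as the aspect describes.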
- FIG. 1 is a block diagram showing an embodiment of the stereo image display apparatus according to the present invention.
- FIGS. 2 ( a ) to 2 ( c ) are schematic views for describing the line signal extraction and correlation calculation in the parallax reading means 40 in the apparatus shown in FIG. 1;
- FIG. 3 is a perspective view showing the detailed construction of optical systems for displaying image as an essential element in the embodiment shown in FIG. 1;
- FIG. 4 is an optical path diagram showing an example of eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 5 is an optical path diagram showing another example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 6 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 7 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIGS. 8 ( a ) and 8 ( b ) are views showing optical elements of a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIGS. 9 ( a ) and 9 ( b ) are views showing the disposition of an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 10 is a view showing an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- FIG. 11 is a block diagram showing a right eye image system in a different embodiment of the present invention.
- FIGS. 12 ( a ) to 12 ( c ) are views for describing the control operation of the width of masking in the embodiment described before in connection with FIG. 11;
- FIGS. 13 ( a ) to 13 ( c ) are views for describing how masked images are seen
- FIG. 14 is a block diagram showing a further embodiment of the present invention.
- FIG. 15 is a flow chart for describing the operation of the embodiment shown in FIG. 14;
- FIG. 16 is a view for describing the operation of determining the area and position of the window in the embodiment shown in FIG. 14;
- FIGS. 17 ( a ) and 17 ( b ) are views for describing the operation of the window area determination in the embodiment shown in FIG. 14;
- FIG. 18 is a perspective view showing a head-mounted display (HMD) 700 as an example of such stereo image display apparatus;
- FIGS. 19 ( a ) to 19 ( c ) are views for describing how left and right eye images are viewed as stereo image in the stereo image display apparatus;
- FIG. 20 shows the way in which the images shown in FIGS. 19 ( a ) to 19 ( c ) are viewed with the two eyes;
- FIGS. 21 ( a ) to 21 ( c ) are views showing left eye and right eye images displayed in a stereo image display apparatus which was proposed earlier;
- FIG. 22 shows the way in which the images shown in FIG. 21 , displayed on an HMD, are viewed with the two eyes;
- FIG. 23 is a view for explaining the status of merging of stereo image, which is actually displayed on a left and a right display surface;
- FIG. 24 is a view showing how the horizontal positions X 1 and X 2 in FIG. 23 are normalized;
- FIG. 25 is a graph showing the correspondence relation between accommodation and vergence
- FIG. 26 is a schematic view for describing the influence given to a stereo image viewer by the display area frames (display area edges) in a display apparatus having a right eye and a left eye image display area;
- FIG. 27 is a schematic view showing a case, in which the distances of the right and left edges 11 R rr and 11 R rl of the right eye image display area 11 R d of the right eye LCD 11 R and the right and left edges 11 L rr and 11 L rl of the left eye image display area 11 L d of the left eye LCD 11 L from one another are variable.
- FIG. 1 is a block diagram showing an embodiment of the stereo image display apparatus according to the present invention.
- a right eye and a left eye LCD 11 R and 11 L having a right eye and a left eye image display area, respectively, are provided for the right and left eyes 10 R and 10 L, respectively.
- Images on the display areas of the LCDs 11 R and 11 L are perceived as images produced by a right eye and a left eye eyepiece optical system 12 R and 12 L by the viewer through the right and left eyes 10 R and 10 L.
- An image reproducing unit 31 reproduces and outputs a stereo image signal representing a stereo image as shown in FIGS. 19 ( a ) to 19 ( c ).
- a right eye image signal from the image reproducing unit 31 is coupled to an image shifter 32 R for conversion to provide a stereo image as shown in FIGS. 21 ( a ) to 21 ( c ) on the right eye LCD 11 R.
- the signal is coupled to a right eye LCD driver 33 R to display a right eye image on the LCD 11 R.
- a left eye image signal from the image reproducing unit 31 is coupled through an image shifter 32 L and a left LCD driver 33 L to the left eye LCD 11 L for displaying a left eye image thereon.
- an eyesight detector 18 is provided for one of the right and left eyes 10 R and 10 L (the left eye 10 L in this example) to detect the eyesight of that eye.
- the eyesight detector 18 includes a photoelectric converting element 17 , which receives light from a light source 15 as reflected from the eyeball surface.
- the eyesight detector 18 provides an eyesight detection signal as its output signal (which is data representing a portion of the image that is viewed by the viewer) to a line signal extractor 45 .
- To the line signal extractor 45 are also coupled the right eye and left eye image signals from the image reproducing unit 31 . From each of these signals, the line signal extractor 45 extracts a plurality of horizontal lines centered on the image portion viewed by the viewer, and supplies the extracted signal together with the eyesight detection signal coupled to it to a correlation calculator 46 .
- the correlation calculator 46 calculates a parallax value between the right eye and left eye image signals from the correlation between the extracted pluralities of horizontal lines of signals, and outputs a parallax signal representing the calculated parallax value.
- the eyesight detector 18 , line signal extractor 45 and correlation calculator 46 together constitute parallax reading means 40 , which reads out a parallax concerning the images displayed on the display means on the basis of the right eye and left eye image signals.
- the parallax signal outputted from the parallax reading means 40 (the correlation calculator 46 ) is coupled to an address converter 47 .
- the address converter 47 outputs data representing an address number corresponding to the value of the parallax signal coupled to it. According to the address number data, the horizontal shift data corresponding to that address number are retrieved and outputted from a table in a memory 48 , which serves as necessary shift data holding means holding the necessary horizontal shift data for the right eye and left eye images.
- the shift data or shift signals representing necessary shift amounts, which are thus read out, are coupled to the image shifters 32 R and 32 L, respectively.
- the image shifters 32 R and 32 L execute a signal processing of horizontally shifting the images, which are produced on the right eye and left eye LCDs 11 R and 11 L by the right eye and left eye image signals from the image reproducing unit 31 , by the above necessary shift amounts, thus making the binocular parallax between the right eye and left eye images adequate.
- the image shifters 32 R and 32 L, line signal extractor 45 , correlation calculator 46 and address converter 47 may be constructed as a single collective digital circuit or as respective separate digital processors or circuits. It is also possible to include the image reproducing unit 31 and the right eye and left eye LCD drivers 33 R and 33 L either entirely or partly as digital circuit in the data processor or circuit.
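The chain from parallax value to address number to stored shift data can be sketched as follows (a hypothetical Python sketch; the table contents, quantization step and all names are illustrative assumptions, not values from this specification):

```python
# Hypothetical shift table standing in for memory 48: each address holds
# the necessary horizontal shift amounts (right image, left image) in pixels.
SHIFT_TABLE = {0: (0, 0), 1: (-2, 2), 2: (-4, 4), 3: (-6, 6)}

def address_of(parallax, step=8):
    """Quantize a parallax value into a table address (cf. address converter 47)."""
    addr = abs(parallax) // step
    return min(addr, max(SHIFT_TABLE))

def shift_amounts(parallax):
    """Retrieve the necessary right/left horizontal shifts for a parallax value."""
    return SHIFT_TABLE[address_of(parallax)]

print(shift_amounts(0))   # (0, 0)  no correction needed
print(shift_amounts(20))  # (-4, 4) opposite horizontal shifts of the two images
```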
- the horizontal display positions of the viewed image patterns are shifted (together with background) on the right eye and left eye image display areas.
- FIGS. 2 ( a ) to 2 ( c ) are schematic views for describing the line signal extraction and correlation calculation in the parallax reading means 40 in the apparatus shown in FIG. 1.
- As shown in FIG. 2( b ), a plurality of line signals centered on line y′ are extracted from each of the right eye and left eye image signals.
- FIGS. 2 ( a ) to 2 ( c ) show an example, in which a single line y′ of signal is extracted.
- a horizontal line video signal corresponding to the vertical coordinate y′ in the coordinates (x′, y′) is extracted as a left y′ line and a right y′ line image signal.
- the correlation between the left y′ line and right y′ line image signals extracted in the above way, is calculated with respect to the horizontal coordinate x′ in the coordinates (x′, y′).
- a signal in a section Δx centered on x′ is taken to calculate the correlation between it and the right y′ line image signal.
- the time difference of a portion, i.e., a most highly correlated portion, of the right eye image is detected, and the amount of parallax is determined from the calculated time difference.
- a shift amount calculator calculates the necessary shift amount according to the data thus obtained.
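The correlation search over the extracted y′ lines can be illustrated with a minimal Python sketch (a sum-of-absolute-differences match stands in here for the correlation calculation; the names and sizes are illustrative assumptions):

```python
def parallax_at(left_line, right_line, x, half_width):
    """Locate the most highly correlated section of the right y' line.

    A section of width 2*half_width + 1 centred on x is taken from the
    left y' line and slid along the right y' line; the offset of the
    best match (minimum sum of absolute differences) is the parallax.
    """
    section = left_line[x - half_width : x + half_width + 1]
    w = len(section)
    best_offset, best_score = 0, float("inf")
    for start in range(len(right_line) - w + 1):
        score = sum(abs(a - b)
                    for a, b in zip(section, right_line[start:start + w]))
        if score < best_score:
            best_score, best_offset = score, start - (x - half_width)
    return best_offset

left = [0, 0, 1, 9, 5, 1, 0, 0, 0, 0]
right = [0, 0, 0, 0, 1, 9, 5, 1, 0, 0]
print(parallax_at(left, right, 3, 2))  # 2: the pattern sits 2 pixels to the right
```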
- the parallax reading means 40 (correlation calculator 46 ), address converter 47 , memory 48 , image shifters 32 R and 32 L and right eye and left eye LCD drivers 33 R and 33 L together serve for a control operation that varies the right eye and left eye images, which have a binocular parallax therebetween, by horizontally shifting the images on the right eye and left eye LCDs 11 R and 11 L by the necessary shift amount as noted above, such that the binocular parallax is kept substantially fixed in effect.
- FIG. 3 is a perspective view showing the detailed construction of optical systems for displaying image as an essential element in the embodiment shown in FIG. 1.
- the system comprises the right eye eyepiece optical system 12 R, which is a prism having a convex mirror 12 R b as a bottom inner surface and a substantially diagonally provided inner central half mirror 12 R d .
- An image on the right eye LCD 11 R as an image display element is incident on the corresponding top surface of the prism 12 R, then transmitted through the half mirror 12 R d , and then reflected by the convex mirror 12 R b on the bottom inner surface of the prism.
- the reflected light is reflected by the lower surface of the half mirror 12 R d to the left side of the prism in the figure and incident on the right eye 10 R through the pupil 10 R p to form a virtual image on the retina.
- An optical system for the left eye is line-symmetric to the optical system for the right eye described above, and will be understood from the figure by replacing “R” in the reference symbols in the system for the right eye with “L”.
- One feature (or element) of the present invention resides in shading-off means, which shades off in effect edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left eye and right eye images with a binocular parallax therebetween as noted before.
- the image display area provided to the viewer thus has shaded-off or gradated edge portions.
- various elements usable as the shading-off means are described in Japanese Laid-Open Patent Publication Heisei 7-325266 (hereinafter referred to as the proposal publication) by the present applicant.
- FIG. 4 is an optical path diagram showing an example of eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 1 on the basis of FIG. 4.
- a beam splitter prism 12 R (or 12 L) is provided, which has an inclined half mirror 12 R d (or 12 L d ) disposed at the intersection between the optical axis of the LCD 11 R (or 11 L) and the viewer's eyesight axis.
- the beam splitter prism 12 R (or 12 L) has a convex mirror 12 R b (or 12 L b ) at the bottom.
- An image light flux from the LCD 11 R (or 11 L) is incident on the top surface of the beam splitter prism 12 R (or 12 L), then transmitted through the half mirror 12 R d (or 12 L d ) and then reflected by the convex mirror 12 R b ( 12 L b ) as the bottom inner surface of the prism.
- the reflected light flux is reflected by the lower surface of the half mirror 12 R d (or 12 L d ) and incident on the right eye (or left eye) pupil 10 R p (or 10 L p ) to form a virtual image on the retina.
- viewing angles are 58° horizontal and 44.2° vertical (the satisfactory image-focusing angles being about 60% of the viewing angles).
- FIG. 5 is an optical path diagram showing another example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 6 on the basis of FIG. 9.
- a beam splitter prism 12 R (or 12 L) is provided, which has an inclined half mirror 12 R d (or 12 L d ) disposed at the intersection between the optical axis of the LCD 11 R (or 11 L) and the viewer's eyesight axis.
- the beam splitter prism 12 R (or 12 L) has a convex mirror 12 R b (or 12 L b ) at the bottom.
- An image light flux from the LCD 11 R (or 11 L) is incident on the top surface of the beam splitter prism 12 R (or 12 L), then transmitted through the half mirror 12 R d (or 12 L d ) and then reflected by the convex mirror 12 R b (or 12 L b ) as the bottom inner surface of the prism.
- the reflected light flux is reflected by the lower surface of the half mirror 12 R d (or 12 L d ) and incident on the right eye (or left eye) pupil 10 R p ( 10 L p ) to form a virtual image on the retina.
- the end surface (top surface) of the prism 12 R (or 12 L) on the side of the right eye (or left eye) LCD 11 R (or 11 L) as the image display element is a non-spherical surface, such that the convex power is increased as one goes away from the optical axis and is changed to concave power as one goes outward beyond a certain position.
- Astigmatism and coma are generated to deteriorate the resolution in the edge portions of the image display area and blur the boundary between the image and non-image areas. (The image-focusing performance of the eyepiece optical system is deteriorated in the edge portions of the image display area.)
- FIG. 6 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 7 on the basis of FIG. 10.
- a beam splitter prism 12 R (or 12 L) is provided, which has an inclined half mirror 12 R d (or 12 L d ) disposed at the intersection between the optical axis of the LCD 11 R (or 11 L) and the viewer's eyesight axis.
- the beam splitter prism 12 R (or 12 L) has a convex mirror 12 R b (or 12 L b ) at the bottom.
- An image light flux from the LCD 11 R (or 11 L) is incident on the top surface of the beam splitter prism 12 R (or 12 L), then transmitted through the half mirror 12 R d (or 12 L d ) and then reflected by the convex mirror 12 R b (or 12 L b ) as the bottom inner surface of the prism.
- the reflected light flux is reflected by the lower surface of the half mirror 12 R d (or 12 L d ) and incident on the right eye (or left eye) pupil 10 R p (or 10 L p ) to form a virtual image on the retina.
- satisfactory image-focusing angles are 12° horizontal and 9° vertical, and viewing angles are 38° horizontal and 29° vertical, (the satisfactory image-focusing angles being about 32% of the viewing angles).
- a light-blocking frame SF having a similar shape to that of the edges of the LCD 11 R (or 11 L) is used to prevent the main light flux of the light fluxes from the edges of the LCD 11 R (or 11 L) from reaching the viewer's eye.
- When the light-blocking frame SF is located at the position of the LCD 11 R (or 11 L), it serves as a visual field stop, and its shape is projected onto the eye. For this reason, the light-blocking frame SF should be located at a position spaced apart from the LCD 11 R (or 11 L) by more than the depth-of-focus. Preferably, the light-blocking frame SF is located at a position spaced apart from the LCD by more than 20 times the depth-of-focus. By so doing, the effect of shading-off can be further increased. Furthermore, by increasing the size of the light-blocking frame SF, the light fluxes from the edges of the LCD 11 R (or 11 L) can be completely blocked.
- FIG. 7 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 13 on the basis of FIG. 14.
- a beam splitter prism 12 R (or 12 L) is provided, which has an inclined half mirror 12 R d disposed at the intersection between the optical axis of the LCD 11 R (or 11 L) and viewer's eyesight axis.
- the beam splitter prism 12 R (or 12 L) has a convex mirror 12 R b (or 12 L b ) at the bottom.
- An image light flux from the LCD 11 R (or 11 L) is incident on the top surface of the beam splitter prism 12 R (or 12 L), then transmitted through the half mirror 12 R d (or 12 L d ) and then reflected by the convex mirror 12 R b (or 12 L b ) as the bottom inner surface of the prism.
- the reflected light flux is reflected by the lower surface of the half mirror 12 R d (or 12 L d ) and incident on the right eye (or left eye) pupil 10 R p ( 10 L p ) to form a virtual image on the retina.
- the portion of the convex mirror 12 R b (or 12 L b ) coated on the bottom of the beam splitter prism 12 R (or 12 L) is limited to a portion A smaller than the entire bottom surface, and the surrounding portion is transparent or light-absorbing.
- viewing angles are 38° horizontal and 29° vertical (the satisfactory image-focusing angles being about 32% of the viewing angles).
- the portion outside the outermost mirror-coated portion is not mirror-coated, to prevent the main light beam from the outermost portion from being incident on the viewer's eyes.
- the limitation of the mirror-coated portion completely cuts the light beam from the outermost portion.
- the main light beam from the outermost portions of the right eye and left eye LCDs 11 R and 11 L as the image display elements thus is not incident on the viewer's eyes, shading off in effect the edge portions and the neighborhood thereof of the display areas of the left eye and right eye images.
- FIGS. 8 ( a ) and 8 ( b ) are views showing optical elements of a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- the element shown in FIG. 8( a ) is described in the proposal publication as Embodiment 9 on the basis of FIG. 12.
- in the element shown in FIG. 8( a ), the light transmittivity is reduced stepwise toward the edges.
- in the element shown in FIG. 8( b ), the light transmittivity is reduced continuously toward the edges.
- Such a light-blocking member is disposed between the right eye (or left eye) LCD 11 R (or 11 L) as the image display element in the optical systems shown in FIG. 3 and an illuminating system (not shown) behind the LCD for illumination light intensity control to darken and make obscure the image of image display area edges. Edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left eye and right eye images are thus shaded off in effect.
- FIGS. 9 ( a ) and 9 ( b ) are views showing the disposition of an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 10 on the basis of FIG. 19.
- a backlight BKL is disposed in the proximity of an LCD used as image display element.
- in this example, the backlight BKL is disposed such that it is far apart from the LCD. This disposition has the aim of positively generating illumination irregularities with respect to the LCD 11 R (or 11 L) to relatively darken the image of the edge portions of the LCD as the image display element. Edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left and right eye images are thus shaded off in effect.
- FIG. 10 is a view showing an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means.
- this system is described in the proposal publication as Embodiment 13 on the basis of FIG. 16.
- the element shown in FIG. 10 is a diffuser, the diffusing effect of which is increased step-wise as one goes toward the edges.
- This diffuser is disposed between the right eye (or left eye) LCD 11 R (or 11 L) as the image display element and the beam splitter prism 12 R (or 12 L) in the optical systems shown in FIG. 3 to darken and make obscure the image of the image display area edges. Edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left and right eye images are thus shaded off in effect. It is also possible to dispose the diffuser at a suitable position on the optical path in the beam splitter prism 12 R (or 12 L).
- FIG. 11 is a block diagram showing a right eye image system in a different embodiment of the present invention. While FIG. 11 shows only the right eye image system, the embodiment also comprises a left eye image system which is alike in construction.
- the parallax reading circuit 71 forms a parallax signal representing a parallax concerning the left and right eye images.
- the parallax signal is coupled to a shift amount converter 72 .
- the shift amount converter 72 derives an adequate shift amount (i.e., shift amount of the horizontal positions of images or control of left and right masking widths of the LCD display areas) corresponding to the parallax value represented by the parallax signal, and generates a shift amount signal representing an image shift amount.
- the shift amount signal is coupled to a read controller 73 and also to a masking signal generator 74 .
- the right eye image signal (which is an analog signal), is also coupled to an A/D converter 75 .
- the output signal (i.e., right eye digital image signal) from the A/D converter 75 is written in a memory 76 under control of an externally provided write control signal.
- the right eye digital image signal thus written in the memory 76 is read out under control of a read control signal, which is generated in the read controller 73 in correspondence to the shift amount signal.
- the right eye digital image signal read out from the memory 76 represents a right eye image as shown in FIGS. 21 ( a ) to 21 ( c ) which is in a proper horizontal position in the display area. This signal is coupled as one input to a mixer 77 .
- the masking signal prescribes the form (i.e., width) of a left and a right masking portion of the LCD display area as schematically shown on the right side of the block 74 in FIG. 11.
- the mixer 77 mixes together the two inputs, i.e., the right eye digital image signal read out from the memory 76 and the masking signal from the masking signal generator 74 , to form a digital image signal which represents an image on the display area with the left and right sides masked by an adequate width, i.e., an image on the display area with the left and right edges and the neighborhood thereof in monochrome display, such as black display (as will be described later with reference to FIGS. 12 ( a ) to 12 ( c )).
- This signal is coupled to the D/A converter 78 .
- the D/A converter 78 generates an analog image signal, which is coupled to an LCD driver 33 R.
- the LCD driver 33 R drives LCD 11 R for displaying the right eye image according to the right eye analog image signal coupled thereto.
- a left eye image system likewise comprises an A/D converter 75 , a memory 76 , a mixer 77 and a D/A converter 78 .
- the output of the D/A converter in this system is coupled to a left eye LCD driver 33 L to drive the left eye LCD 11 L and display the left eye image (see FIG. 1).
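The action of the mixer on a single display line can be sketched as follows (an illustrative Python sketch; the function and parameter names are our own assumptions):

```python
def mask_mix(line, left_w, right_w, black=0):
    """Mix an image line with the masking signal (cf. mixer 77).

    The left_w leftmost and right_w rightmost pixels are replaced by a
    predetermined monochrome level (black here), masking the edge
    portions of the display area.
    """
    n = len(line)
    return [black if i < left_w or i >= n - right_w else v
            for i, v in enumerate(line)]

print(mask_mix([5, 5, 5, 5, 5, 5], 2, 1))  # [0, 0, 5, 5, 5, 0]
```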
- FIGS. 12 ( a ) to 12 ( c ) are views for describing the control operation of the width of masking (i.e., converted area of the left and right edges and neighborhood thereof of the image display areas into predetermined monochrome display, such as black display) in the embodiment described before in connection with FIG. 11.
- the sphere becomes closer while the triangular pyramid moves away from the viewer in the order of FIGS. 12 ( a ) to 12 ( c ).
- the width of the monochrome display parts is controlled such that it is increased on the right side of the display area and reduced on the left side when the position of the image on the display area is shifted to the left, while it is increased on the left side of the display area and reduced on the right side when the image position is shifted to the right.
- the widths controlled in this way are: the width W LL of the monochrome part of the left eye display area (i.e., the left edge and neighborhood thereof of the display area), the width W LR of the monochrome part of the same display area (i.e., the right edge and neighborhood thereof of the display area), the width W RR of the monochrome part of the right eye image display area (i.e., the right edge and neighborhood thereof of the display area), and the width W RL of the monochrome part of the same display area (i.e., the left edge and neighborhood thereof of the display area).
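The masking-width rule described above can be sketched as a small function (hypothetical Python; the base width and sign convention are illustrative assumptions):

```python
def mask_widths(base, shift):
    """Return (left_width, right_width) of the monochrome masking parts.

    Shifting the displayed image to the left (shift < 0) increases the
    mask width on the right side of the display area and reduces it on
    the left; shifting to the right does the opposite.
    """
    left = max(base + shift, 0)
    right = max(base - shift, 0)
    return left, right

print(mask_widths(10, -4))  # (6, 14): image shifted left
print(mask_widths(10, 4))   # (14, 6): image shifted right
```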
- FIGS. 13 ( a ) to 13 ( c ) are views for describing how masked images (with monochrome parts) are seen.
- FIGS. 13 ( a ) to 13 ( c ) correspond to FIGS. 12 ( a ) to 12 ( c ).
- with the change in position of the masked parts (i.e., monochrome parts) of the images, i.e., the position of the edge portions of the images, the viewer's sense as though he or she is actually on the side of the image scene is enhanced with respect to the approaching motion of the sphere (in the depth direction).
- FIG. 14 is a block diagram showing a further embodiment of the present invention.
- the left eye and right eye image signals from the image reproducing unit shown in FIG. 1 are coupled to a low-pass filter 81 for smoothing and high frequency noise removal.
- the smoothed left eye and right eye image signals outputted from the low-pass filter 81 are coupled to the respective left eye and right eye LCDs noted above and also to a differential circuit 82 .
- the differentiating circuit 82 differentiates the two input image signals and extracts signals corresponding to edge portions of images. Of the output signals of the differentiating circuit 82 , that which is of the left eye image signal system is coupled to a squaring and integrating circuit 83 .
- the squaring and integrating circuit 83 first squares the output of the differentiating circuit 82 , which is a signal comprising both positive and negative components (i.e., differential components corresponding to rising and falling edges).
- the squared signal comprises positive components only.
- the circuit 83 then integrates the squared signal for each predetermined time section.
- the integrated value is greater the higher the frequency of appearance of differential peaks in the predetermined time section. Consequently, the output of the squaring and integrating circuit 83 is greater the finer the images in the areas corresponding to the predetermined time section, i.e., the higher the mean spatial frequency.
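The squaring-and-integrating operation can be sketched as follows (an illustrative Python sketch; the section length and names are assumptions):

```python
def squared_integral(diff_signal, section):
    """Square a differentiated signal and integrate it per time section.

    Squaring folds the positive and negative differential peaks (rising
    and falling edges) into positive values; each per-section sum grows
    with the frequency of edge appearance, i.e. with the mean spatial
    frequency of the corresponding image area.
    """
    return [sum(v * v for v in diff_signal[i:i + section])
            for i in range(0, len(diff_signal), section)]

# A flat area contributes nothing; a finely textured area contributes strongly.
diff = [0, 0, 0, 0, 1, -1, 1, -1]
print(squared_integral(diff, 4))  # [0, 4]
```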
- To the squaring and integrating circuit 83 is coupled an eyesight detection signal representing the position viewed by the viewer.
- the above predetermined time section for the integration is selected such that it is matched to the viewer's viewed position represented by the detection signal.
- the output of the squaring and integrating circuit 83 is coupled to a memory read position controller 84 as the next stage.
- the eyesight detection signal representing the viewer's viewed position which is obtained by the eyesight detecting means such as the eyesight detector 18 described before in connection with FIG. 1 and coupled to the squaring and integrating circuit 83 , is also coupled to the memory read position controller 84 .
- the memory read position controller 84 determines the area of the window noted above according to the output signal of the squaring and integrating circuit 83 , which is indicative of whether the spatial frequency is high or not.
- the window area to be correlation-detected between the left eye and right eye images is selected to be relatively small when the spatial frequency is relatively high.
- the window area is selected to be relatively large.
- the window area is selected according to the eyesight detection signal such as to match the center position of the window to the viewer's viewed position.
- the memory read position controller 84 outputs a signal representing the area and center position of the window.
- the outputs of the differentiating circuit 82 , i.e., the outputs of the left eye and right eye image systems, are stored in respective memories 85 and 86 .
- An appropriate A/D converter is provided in an input section of or as a preceding stage to each of the memories 85 and 86 .
- Data are read out from the memories 85 and 86 under control of the signal from the memory read position controller 84 , representing the area and center position of the window.
- These read-out data, which each correspond to an appropriate window area, are coupled to a correlation calculator 87 for deriving a parallax concerning the left eye and right eye images.
- the output of the correlation calculator 87 , i.e., a parallax signal, is coupled to a shift amount calculator 88 , which obtains a signal for appropriately shifting the images according to the difference between the present and desired parallax values.
- the output of the shift amount calculator 88 is coupled to the image shifters which were described before in connection with FIG. 1 for horizontal image position control concerning the left eye and right eye images.
- FIG. 15 is a flow chart for describing the operation of the embodiment shown in FIG. 14.
- the low-pass filter 81 smoothes and removes the high frequency noise components from the left eye and right eye image signals as its inputs (step S 1 ).
- the two smoothed image signals are then subjected to edge extraction. More specifically, the differentiating circuit 82 differentiates the smoothed image signals to extract signals corresponding to image edges (step S 2 ). From these signals obtained as a result of the edge extraction, the squaring and integrating circuit 83 detects a frequency characteristic (i.e., discriminates whether the spatial frequency of the images is relatively high or not) (step S 3 ).
- the area of the correlation detection subject window is determined (step S 4 ).
- the horizontal and vertical sizes n and m of the window may be determined according to the horizontal and vertical spatial frequencies, respectively.
- the center position (Xt, Yt) of the window is determined (step S 5 ).
- the eyesight detection signal representing the position viewed by the viewer, which is obtained in the eyesight detecting means such as the eyesight detector 18 as described before in connection with FIG. 1, is supplied to the squaring and integrating circuit 83 and the memory read position controller 84 as described before in connection with FIG. 14.
- the correlation calculator 87 derives the parallax concerning the left eye and right eye images with respect to the window area, which has been specified in the processes of the steps S 4 and S 5 (step S 6 ).
- the shift amount calculator 88 obtains a signal for appropriately shifting the images according to the result of the process in the step S 6 (i.e., the parallax signal value) (step S 7 ).
- the result of the process in the step S 7 , i.e., the image shift signal, is coupled to the image shifters.
- FIG. 16 is a view for describing the operation of determining the area and position of the window in the embodiment shown in FIG. 14.
- the horizontal and vertical sizes n and m of the window are determined according to the horizontal and vertical spatial frequencies, respectively.
- the center position (Xt, Yt) of the window is determined according to the eyesight detection signal representing the position viewed by the viewer, obtained in the eyesight detecting means such as the eyesight detector 18 described before in connection with FIG. 1 (step S 5 in FIG. 15).
- FIGS. 17 ( a ) and 17 ( b ) are views for describing the operation of the window area determination in the embodiment shown in FIG. 14.
- the window area is set to be small as shown by the dashed rectangle in FIG. 17( a ).
- the window area is set to be relatively large as shown by the dashed rectangle in FIG. 17( b ).
- spatial frequency detection is made in effect with respect to images concerning the edges extracted in the edge extraction process (corresponding to image frames as shown in FIGS. 17 ( a ) and 17 ( b )).
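The window-size selection of steps S4 and S5 can be sketched as follows (hypothetical Python; the thresholds and window sizes are illustrative assumptions, not values from the specification):

```python
def window_size(freq_measure, low=50, high=200,
                small=(16, 12), large=(64, 48)):
    """Choose the correlation window (n, m) from a spatial-frequency measure.

    Fine imagery (large measure) permits a small window; coarse imagery
    needs a relatively large window so that the window still contains
    enough detail for a reliable correlation.
    """
    if freq_measure >= high:
        return small
    if freq_measure <= low:
        return large
    # Interpolate linearly between the two sizes in between.
    t = (freq_measure - low) / (high - low)
    n = round(large[0] + t * (small[0] - large[0]))
    m = round(large[1] + t * (small[1] - large[1]))
    return n, m

print(window_size(250))  # (16, 12): fine detail, small window
print(window_size(10))   # (64, 48): coarse detail, large window
```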
Description
- The present invention relates to stereo image display apparatuses capable of displaying a stereo image to the viewer with left eye and right eye images having a binocular parallax and, more particularly, to improvements in the stereo image display apparatus for alleviating the departure from the natural sense of viewing and the fatigue of the viewer viewing stereo images.
- As visual display apparatuses or systems, various stereo image display apparatuses for displaying images viewed as stereo image have been proposed.
- FIG. 18 is a perspective view showing a head-mounted display (HMD) 700 as an example of such a stereo image display apparatus. The illustrated HMD 700 is a binocular stereo display. The HMD 700 has a frame 702, which is mounted on the viewer's head and supports a left and a right display element and also a left and a right enlarging optical system 701 in front of the viewer's left and right eyes. Thus, a left eye image is displayed to the left eye, while a right eye image is displayed to the right eye, whereby the viewer can view the image as a stereo image. The frame 702 has a sensor support 703 supporting a head motion sensor 704, which is located on the head and detects motion of the head. Thus, the viewer can view the image in correspondence to the motion of his or her head. A data processor 720 is connected via a cable 722 to a connector 706, which is supported on a connector support 705 provided on the frame 702. A loudspeaker 709 for outputting sound is provided in each ear. The data processor 720 has operating buttons 720a which are operable by the user for various operations. With the usual stereo image display apparatus such as the HMD, the viewing distance and the verging distance fail to coincide with each other, as will be described later in detail, thus resulting in a departure from the natural sense of viewing.
- FIGS. 19(a) to 19(c) are views for describing how left and right eye images are viewed as a stereo image in the stereo image display apparatus. These figures show an example of a stereo image viewed by the left and right eyes. The image includes two objects, i.e., a sphere and a triangular pyramid, the sphere becoming closer to the viewer. In this case, the left eye and right eye images are changed from those shown in FIG. 19(a) to those shown in FIG. 19(b) and then to those shown in FIG. 19(c). As shown, the sphere is moved toward the center while being gradually increased in size. This means that the binocular parallax is being gradually increased.
- FIG. 20 shows the way in which the images shown in FIGS. 19(a) to 19(c) are viewed with the two eyes. Increasing binocular parallax leads to verging for merging (i.e., reaching or going to reach a viewer's state of perceiving one image on the basis of a plurality of images), so that the viewer's eyeballs are turned inward. This rotation of the eyes is called vergence, and the angle of the rotation is called the vergence angle in the illustrated definition. Also, the distance between the point of intersection of the optical axes of the eyeballs in vergence and each eye is called the verging distance. In the HMD, the verging distance is equal to the distance between the point of intersection of the main beams of the left and right images and the main plane of the eyepiece optical system. Thus the vergence of the eyeballs immediately causes accommodation. With an increasing vergence angle, the accommodation tends to be closer. Conversely, with a reducing vergence angle, the accommodation tends to be further apart. In the stereo image display, the plane in which the image can be viewed with the best contrast is fixed. In the HMD, the distance from this plane to each eyeball is the viewing distance. In this connection, inconsistency has heretofore taken place. Specifically, the above phenomenon occurs not only in the HMD but also in various stereo television sets, such as those of the shutter switching system, lenticular system, etc. In these systems, the viewing distance of stereo television is the distance from the display surface of the display, such as a CRT, to each eyeball of the viewer.
- Viewing images with great verging distance changes as stereo images, in a state in which the viewing distance and the verging distance do not coincide, leads to unnatural viewing. This problem may be avoided by producing images with smaller fly-out changes. Doing so, however, weakens the impact of the images as stereo images.
- To solve this problem, Japanese Patent Publication Heisei 6-85590 proposes an HMD, in which the viewing distance is varied according to the image motion or the like through mechanical driving of the eyepiece lens. Japanese Laid-Open Patent Publication Heisei 3-292093 discloses a method of varying the viewing distance by detecting a point viewed by the viewer and moving the lenses according to depth information at the viewed point. These systems permit obtaining coincidence of the viewing distance and the verging distance with each other.
- Japanese Laid-Open Patent Publication Heisei 7-167633 shows a method of controlling the optimum viewing point, which permits the viewer to perceive the depth of an object in the broadcast range, by calculating the point from the binocular parallax of the image, such that the point is reproduced on the surface of a stereo image display unit or at a specified distance from the surface. As a specific means, a parallax map is calculated from the left and right images by using correlation matching, and then the mean value of the parallax of the entire image or the weighted mean parallax of a central part of the image is calculated. Using this mean parallax, a parallax controller controls the horizontal read timing of the left and right images to cause parallel movement of the images in the horizontal direction. This method does not require any mechanical drive system, and it is thus possible to prevent size increase.
- FIGS. 21(a) to 21(c) are views showing left eye and right eye images displayed in a stereo image display apparatus, which was proposed earlier by the inventor (Japanese Patent Application Heisei 8-28856). Like the case of FIGS. 19(a) to 19(c), two objects, i.e., a sphere and a triangular pyramid, are displayed, the sphere becoming closer to the viewer. In this case, the left eye and right eye images are changed from those shown in FIG. 21(a) to those shown in FIG. 21(b) and then to those shown in FIG. 21(c). In this apparatus, the parallax of the left eye and right eye images is substantially fixed irrespective of the motion of the sphere toward and away from the viewer.
- FIG. 22 shows the way of viewing, with the two eyes, of the images shown in FIG. 21 displayed on an HMD. In this case, the verging distance L with respect to the sphere is unchanged although the image of the sphere is increased as the sphere becomes closer. The triangular pyramid, on the other hand, is moved apart from the viewer although its size is unchanged. In other words, the distance difference between the triangular pyramid and the sphere is increased as in the prior art case. Nevertheless, the verging distance L with respect to the sphere is substantially fixed.
- This is owing to the fact that the person's eyes are not so sensitive to changes in the absolute distance although they are sensitive to changes in the relative distance. Experiments conducted by the inventor prove that the viewer viewing a stereo image of only a single object with changing binocular parallax (the background being black) cannot perceive distance changes. However, the sense of stereo arises when objects in different motions are displayed simultaneously. This means that it is difficult to recognize a change in the distance of a single object, although changes in the distance between two objects can be recognized. According to the proposal noted above, with the distance difference between the sphere and the triangular pyramid changing as usual and also the sphere changing in size while the triangular pyramid is not, the viewer perceives as though the sphere is becoming closer to him or her while the triangular pyramid is not changing its position. It is thus possible to provide images with a stereo sense while holding a substantially constant verging distance with respect to the sphere. Preferably, the verging distance L of the sphere in FIG. 22 is made coincident with the viewing distance. More preferably, an eye detector judges whether the viewer is viewing the sphere or the triangular pyramid, and the verging distance of the image being viewed is made substantially constant.
- FIG. 23 is a view for explaining the status of merging of a stereo image, which is actually displayed on a left and a right display surface. The relation between the binocular parallax and the verging distance L when viewing a stereo image is now considered. Referring to the figure, when merging is attained, the horizontal positions X1 and X2 of the sphere on the left and right display surfaces, when the sphere is viewed to be at a verging distance L and at a horizontal position −H, are respectively derived as equations (1) and (2).
- X1={d+(−H)}/(L·tanθ) (1)
- X2={−d+(−H)}/(L·tanθ) (2)
- In these equations, d is the distance from the mid point between a left and a right lens to each lens (the distance being positive for the right eye and negative for the left eye), and θ is the half field angle. The horizontal positions X1 and X2 are prescribed as follows.
- FIG. 24 is a view showing how the horizontal positions X1 and X2 in FIG. 23 are normalized. As shown in FIG. 24, the normalization is made by setting the horizontal center value of the display region to “0” and the horizontal length of the display region to “2”. Equation (1) can be derived from the fact that the triangle with points A to C in FIG. 23 as the apices and the triangle with the origin O and points X1 and C on the left display surface as the apices are similar to each other. Likewise, equation (2) can be derived from the similarity of the triangle with points D, B and E as the apices and the triangle with the origin O and points X2 and E on the right display surface to each other.
- Equations (1) and (2) can be rearranged into equation (3).
- |X1−X2|=2d/(L·tanθ) (3)
- In equation (3), the left side |X1−X2| represents the parallax. Equation (3) shows that the verging distance L when the merging is attained is determined independently of the horizontal position H if the parallax is determined.
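Equation (3) can be inverted to recover the verging distance from a measured parallax. The following sketch assumes the normalized coordinates of FIG. 24 (display half-width equal to 1); the function name and the example values of d and θ are hypothetical.

```python
import math

def verging_distance(parallax, d, half_angle_deg):
    """Invert equation (3): since |X1 - X2| = 2d / (L * tan(theta)),
    the verging distance is L = 2d / (|X1 - X2| * tan(theta)).
    `parallax` is the normalized |X1 - X2|; `d` is the half distance
    between the left and right lenses; the units of `d` carry through to L."""
    return 2.0 * d / (parallax * math.tan(math.radians(half_angle_deg)))
```

For instance, with d = 32 mm and a half field angle θ of 20 degrees, a normalized parallax of about 0.18 corresponds to a verging distance of roughly one metre.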
- The permissible change in the verging distance L, i.e., the permissible change in the parallax, will now be considered. FIG. 25 is a graph showing the correspondence relation between accommodation (i.e., the state of focus of the eyes) and vergence. The figure shows the permissible range of the vergence accommodation and the parallax (“O Plus E”, Seiri Kogaku, Dec. 15, 1985, pp. 103). In the graph, the ordinate is taken for the accommodation (parallax) (D: diopter), and the abscissa is taken for the vergence (vergence angle MW). It will be seen from the graph that the vergence is obtainable in a short period of time so long as its changes are within 4 diopters.
- In various display apparatuses, it is usual that the frame or an edge part of the display area enters the visual field of the viewer. However, in the system disclosed in the Japanese Laid-Open Patent Publication Heisei 7-167633 and the other prior art techniques described above, as well as some of the apparatuses which were proposed earlier by the inventor, no particular consideration is given to the influence exerted on the viewer viewing a stereo image by the frame of the display area of the display means, i.e., the boundary between the display and non-display areas of the display means.
- FIG. 26 is a schematic view for describing the influence given to a stereo image viewer by the display area frames (i.e., display area edges) in a display apparatus having a right eye and a left eye image display area.
- Referring to FIG. 26, a right eye and a left eye LCD 11R and 11L with a right eye and a left eye image display area 11Rd and 11Ld, respectively, are provided for the right and left eyes 10R and 10L. Images on the display areas of the LCDs 11R and 11L are perceived as images through a right eye and a left eye eyepiece optical system, respectively.
- On the right eye image display area 11Rd of the right eye LCD 11R, right side and left side image edges (i.e., boundaries between display and non-display areas) are formed as a right and a left edge 11Rrr and 11Rrl, respectively. Likewise, on the left eye image display area 11Ld of the left eye LCD 11L, right side and left side image edges (i.e., boundaries between display and non-display areas) are formed as a right and a left edge 11Lrr and 11Lrl.
- In the case of FIG. 26, like the case described before in connection with FIG. 22, an image is assumed which contains two objects, i.e., a sphere and a triangular pyramid, the sphere becoming closer to the viewer. The verging distance L with respect to the sphere is unchanged although the image thereof is increased as the sphere becomes closer. The triangular pyramid, on the other hand, moves away from the viewer although its size is unchanged. That is, the distance difference between the triangular pyramid and the sphere is being increased while the verging distance L with respect to the sphere is substantially fixed. In such a state, the positions of the edge images 11ir and 11il, which are merged or verged in a binocular visual field formed by the right and left edges 11Rrr and 11Rrl of the right eye LCD 11R and the right and left edges 11Lrr and 11Lrl of the left eye LCD 11L (i.e., the distance between the image of the sphere and the image of the edge), are fixed as shown.
- As described before in connection with FIG. 22, the stereo image display system shown in FIG. 26 utilizes the fact that the person's eyes are not so sensitive in detecting the absolute distance of an object although they are sensitive to relative distance changes. The system thus permits providing an image with a stereo sense, as though the viewer sees the sphere becoming closer to him or her while the position of the triangular pyramid is unchanged, while holding the verging distance L with respect to the sphere substantially constant. However, as described before in connection with FIG. 26, in the system of this type the distance between the image of the sphere and the image of the edge is fixed. Therefore, when the viewer sees both the images of the sphere and the edge in his or her visual field, the inconsistency between the relative positions of the image of the sphere, which must be becoming closer to the viewer, and the image of the edge, which is fixed in position, becomes apparent, thus spoiling the stereo sense of the image, i.e., the sense as though the viewer is actually on the site of the image.
- In order to evade the problem described before in connection with FIG. 26, it may be thought to move the positions of the edge images 11ir and 11il merged (or verged) together in the binocular visual field (i.e., to make the distance between the images of the sphere and the edge variable) instead of fixing these positions.
- FIG. 27 is a schematic view showing a case, in which the distances of the right and left edges 11Rrr and 11Rrl of the right eye image display area 11Rd of the right eye LCD 11R and the right and left edges 11Lrr and 11Lrl of the left eye image display area 11Ld of the left eye LCD 11L from one another are variable. Increasing the edge-to-edge distance between the two eyes (i.e., between the edges 11Rrl and 11Lrr) as shown in FIG. 27 gives rise to what is commonly called visual field struggling when images produced on the right and left LCDs 11R and 11L are to be verged to produce a stereo image.
- In the stereo image display apparatus of this type, usually the parallax concerning a left and a right image is detected from the correlation between the full frames of the left and right images. However, it is not always efficient data processing to compare the full frames of the left and right images uniformly, that is, irrespective of the character of the images (whether the images are fine or coarse), for obtaining the correlation therebetween. On the other hand, imposing a restriction on the images for obtaining the correlation therebetween may result in an erroneous judgment.
- In view of the problems inherent in the prior art as described above, the present invention has an object of providing a stereo image display apparatus of the pertaining type, which gives due consideration to the influence exerted on the viewer viewing a stereo image by the edges of the display areas of the display means, i.e., the boundaries (or edges) of the display and non-display areas of the display means, and can display stereo images which do not spoil the sense as though the viewer is actually on the site of the image.
- Another object of the present invention is to provide a stereo image display apparatus of the pertaining type, which permits adequate detection of the correlation between a left and a right image in connection with the detection of a parallax concerning these images.
- According to a first aspect of the present invention, there is provided a stereo image display apparatus comprising: binocular parallax control means for executing a control operation to vary a right eye and a left eye images with a binocular parallax therebetween such that the binocular parallax is substantially fixed in effect; display means capable of displaying the left eye and right eye images on respective predetermined display areas; and shading-off means for shading off in effect edge portions of the display areas of the left eye and right eye images.
- In the prior apparatuses in which the binocular parallax is controlled to be substantially fixed, edge portions of the display areas (i.e., boundary portions between display and non-display areas) are clearly recognized in the visual field. Therefore, the display is unnatural as a stereo image display, spoiling the viewer's sense just like the viewer is actually on the site of the image scene.
- According to the first aspect of the present invention, edge portions of the display areas are shaded off in effect and made difficult to be clearly recognized. The display thus has a natural sense as stereo image display, providing enhanced sense of the viewer just like the viewer is actually on the site of the image scene.
- In the stereo image display apparatus according to a second aspect of the present invention, the shading-off means in the first aspect includes luminance restricting means for restricting the luminance of edge portions of the left eye and right eye display areas such that the luminance is reduced as one goes toward the edges of the display areas.
- According to the second aspect of the present invention, the luminance of the display areas is reduced toward the edges thereof. The edge portions of the display areas are thus shaded off and made difficult to be verged, making the sense about the distance of the display area edge portions unclear. The viewer's sense just like the viewer is actually on the site of the image scene is thus made difficult to be interfered with.
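The luminance restriction of the second aspect can be sketched as a simple horizontal vignette. The linear fall-off profile and the fixed border width are illustrative assumptions (the aspect only requires luminance to decrease toward the edges), and a grayscale image array is assumed.

```python
import numpy as np

def shade_edges(img, border):
    """Attenuate luminance toward the left and right edges of a grayscale
    display image so that the edge portions are shaded off and hard to verge.
    The luminance falls linearly from full value to zero over `border` pixels."""
    _, w = img.shape
    ramp = np.ones(w)
    ramp[:border] = np.linspace(0.0, 1.0, border)       # left edge fade-in
    ramp[w - border:] = np.linspace(1.0, 0.0, border)   # right edge fade-out
    return img * ramp  # broadcast the ramp across every row
```

The same ramp could equally be realized optically, as in the third aspect, rather than by processing the image signal.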
- In the stereo image display apparatus according to a third aspect of the present invention, the shading-off means in the first aspect includes resolution restricting optical means for restricting the resolution of edge portions of the left eye and right eye image display areas such that the resolution becomes coarser as one goes toward the edges of the display areas.
- According to the third aspect of the present invention, by employing an optical element to make the resolution of the display areas become gradually coarser toward the edges thereof, the edge portions of the display areas are made difficult to be resolved. The edge portions are thus shaded off and made difficult to be verged, making the distance sense of the display area edge portions unclear. The viewer's sense just like the viewer is actually on the site of the image scene is thus made difficult to be interfered with.
- According to a fourth aspect of the present invention, there is provided a stereo image display apparatus comprising: display means capable of displaying a left eye and a right eye images with a binocular parallax therebetween on respective predetermined display areas; horizontal display position control means for controlling the horizontal display positions of the left eye and right eye images on the left eye and right eye image display areas in opposite directions, respectively; monochrome display area generating means for making the left and right edges and neighborhood thereof of the left eye and right eye image display areas to be predetermined monochrome display areas; and monochrome display area width control means for controlling the width of the monochrome display areas, constituted by the left and right edges and neighborhood thereof of the left eye and right eye image display areas, such as to be increased on the right edge side of the display and reduced on the left edge side thereof when the position of the images on the display areas is shifted to the left, and increased on the left edge side of the display areas and reduced on the right edge side thereof when the position of the images on the display areas is shifted to the right.
- With the prior art apparatuses, control of the horizontal display position of the left eye and right eye images in opposite directions results in clear recognition of edge portions of the display areas (i.e., boundaries between image and non-image areas) in the visual field, making the display unnatural as stereo image display. The viewer's sense just like the viewer is actually on the site of the image scene is therefore spoiled.
- According to the fourth aspect, in which the widths of the left and right edge portions of the left eye and right eye image display areas can be varied by the monochrome display area, the verging distance of the edges of (or boundaries between) of the monochrome display areas and the rest of the image display areas is varied in effect. It is thus possible to vary the relative distances of the monochrome display areas and the rest of the image display areas, thus making the display natural as stereo image display and enhancing the viewer's sense just like the viewer is actually on the site of the image scene.
- In the stereo image display apparatus according to a fifth aspect of the present invention, the monochrome display area generating means in the fourth aspect regulates the width of the monochrome display area constituted by the left edge and neighborhood thereof of the left eye image display area to be greater than the width of the monochrome display area constituted by the right edge and neighborhood thereof of the same display area, and also regulates the width of the monochrome display area constituted by the right edge and neighborhood thereof of the right eye image display area to be greater than the width of the monochrome display area constituted by the left edge and neighborhood thereof of the same display area.
- According to the fifth aspect, the width of the monochrome display area, which is constituted by the left edge and neighborhood thereof of the left eye image display area, is made to be greater than the width of the monochrome display area constituted by the right edge and neighborhood thereof of the same display area, and the width of the monochrome display area constituted by the right edge and neighborhood thereof of the right eye image display area is made to be greater than the width of the monochrome display area constituted by the left edge and neighborhood thereof of the same display area. With this arrangement, verging of the intrinsic image display areas and edge portions thereof can always be obtained to evade the visual field struggling and obtain satisfactory stereo image display.
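The masking width control of the fourth aspect can be sketched as follows. The base width, the gain, and the sign convention for the shift are illustrative assumptions, not values taken from the embodiment.

```python
def mask_widths(shift, base=16, gain=1.0):
    """Return (left_width, right_width) of the monochrome masking areas,
    in pixels, for one image display area. A shift of the image to the
    left (negative `shift`) widens the right-side mask and narrows the
    left-side one; a shift to the right does the opposite."""
    delta = int(round(gain * shift))
    left_width = max(0, base + delta)
    right_width = max(0, base - delta)
    return left_width, right_width
```

For the fifth aspect, the left eye display area would in addition start from a larger base width on its left side, and the right eye display area on its right side, so that the masked edges of the two displays can always be merged.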
- In the stereo image display apparatus according to a sixth aspect of the present invention, in the fourth or fifth aspect, the monochrome display area generating means generates black display areas as monochrome display areas.
- According to the sixth aspect, when watching stereo pictures in theaters or viewing stereo images on HMDs, the outside of the image display areas is usually dark. By providing black display areas as the monochrome display areas, the edge portions of the image display areas can be made less recognizable as such, thus making it difficult to interfere with the viewer's sense just like the viewer is actually on the site of the image scene.
- According to a seventh aspect of the present invention, there is provided a stereo image display apparatus comprising: display means capable of displaying a left eye and a right eye image with a binocular parallax therebetween on respective predetermined display areas; spatial frequency detecting means for detecting a spatial frequency concerning the left eye or right eye image; correlation calculation area specifying means for specifying a correlation calculation area according to the spatial frequency detected by the spatial frequency detecting means such that the specified correlation calculation area is the smaller the relatively higher the detected spatial frequency and the greater the relatively lower the detected spatial frequency; correlation calculating means for calculating a correlation of the left eye and right eye images to each other with respect to the correlation calculation area specified by the correlation calculation area specifying means; and binocular parallax control means for controlling the binocular parallax in effect according to the result of the correlation calculation in the correlation calculating means.
- According to the seventh aspect, since the area of the window as the subject of the correlation calculation is specified according to the spatial frequencies of the images, optimum correlation calculation can be made adaptively in dependence on whether the images are fine or coarse. It is thus possible to improve both the efficiency and accuracy of the correlation detection.
- The stereo image display apparatus according to an eighth aspect of the present invention further comprises viewed point detecting means for detecting a point viewed by the viewer in the display areas of the display means, the spatial frequency detecting means being operable to detect a spatial frequency of images with respect to the viewed point detected by the viewed point detecting means and the neighborhood thereof.
- According to the eighth aspect, the spatial frequency of images can be detected with respect to the point viewed by the viewer and the proximity of that point, so that it is possible to provide an inexpensive apparatus.
- In the stereo image display apparatus according to a ninth aspect of the present invention, the correlation calculation area specifying means in the eighth aspect specifies the horizontal size of the correlation calculation area to be the smaller the relatively higher a horizontal spatial frequency detected by the spatial frequency detecting means and the greater the relatively lower the horizontal spatial frequency, and/or specifies the vertical size of the correlation calculation area to be the smaller the relatively higher a vertical spatial frequency detected by the spatial frequency detecting means and the greater the relatively lower the vertical spatial frequency.
- According to the ninth aspect, the horizontal and vertical sizes of the specific area (i.e., window) as the subject of the correlation detection are selected according to both the horizontal and vertical spatial frequencies. An adequate window shape thus can be selected according to the two-dimensional fineness (or coarseness) of the images, thus permitting adaptively optimum correlation calculation. It is thus possible to improve both the efficiency and accuracy of the correlation detection.
- The stereo image display apparatus according to a tenth aspect of the present invention further comprises edge extracting means for extracting edge portions of at least either of the left eye and right eye images with a binocular parallax therebetween, the spatial frequency detecting means being operable to detect a spatial frequency concerning an image displayed with coupling of edge portions detected by the edge extracting means.
- According to the tenth aspect, the spatial frequency detection is made by adopting the simple method of edge extraction and edge counting. A simple and inexpensive apparatus is thus obtainable compared to the case of the Fourier transform method or the like.
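The edge-extraction-and-counting approach of the tenth aspect can be sketched as follows. The differencing threshold and the use of simple neighbour differences as the edge extractor are illustrative assumptions; a grayscale image array is assumed.

```python
import numpy as np

def edge_count_frequency(img, thresh=0.1):
    """Estimate horizontal and vertical spatial frequencies of a grayscale
    image by simple edge extraction and edge counting, in place of a
    Fourier transform. Returns (fx, fy): the mean number of luminance
    edges crossed per row and per column, respectively."""
    g = img.astype(float)
    ex = np.abs(np.diff(g, axis=1)) > thresh  # edges between adjacent columns
    ey = np.abs(np.diff(g, axis=0)) > thresh  # edges between adjacent rows
    fx = float(ex.sum(axis=1).mean())  # average edge count per row
    fy = float(ey.sum(axis=0).mean())  # average edge count per column
    return fx, fy
```

The resulting (fx, fy) pair is exactly the kind of input the window sizing of the ninth aspect requires: a fine vertical-stripe pattern yields a high fx and a low fy.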
- Other objects and features will be clarified from the following description with reference to attached drawings.
- FIG. 1 is a block diagram showing an embodiment of the stereo image display apparatus according to the present invention;
- FIGS.2(a) to 2(c) are schematic views for describing the line signal extraction and correlation calculation in the parallax reading means 40 in the apparatus shown in FIG. 1;
- FIG. 3 is a perspective view showing the detailed construction of optical systems for displaying image as an essential element in the embodiment shown in FIG. 1;
- FIG. 4 is an optical path diagram showing an example of eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 5 is an optical path diagram showing another example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 6 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 7 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIGS.8(a) and 8(b) are views showing optical elements of a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIGS.9(a) and 9(b) are views showing the disposition of an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 10 is a view showing an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means;
- FIG. 11 is a block diagram showing a right eye image system in a different embodiment of the present invention;
- FIGS.12(a) to 12(c) are views for describing the control operation of the width of masking in the embodiment described before in connection with FIG. 11;
- FIGS.13(a) to 13(c) are views for describing how masked images are seen;
- FIG. 14 is a block diagram showing a further embodiment of the present invention;
- FIG. 15 is a flow chart for describing the operation of the embodiment shown in FIG. 14;
- FIG. 16 is a view for describing the operation of determining the area and position of the window in the embodiment shown in FIG. 14;
- FIGS.17(a) and 17(b) are views for describing the operation of the window area determination in the embodiment shown in FIG. 14;
- FIG. 18 is a perspective view showing a head-mounted display (HMD)700 as an example of such stereo image display apparatus;
- FIGS.19(a) to 19(c) are views for describing how a left and a right eye images are viewed as stereo image in the stereo image display apparatus;
- FIG. 20 shows the way in which the images shown in FIGS.19(a) to 19(c) are viewed with the two eyes;
- FIGS. 21(a) to 21(c) are views showing left eye and right eye images displayed in a stereo image display apparatus which was proposed earlier;
- FIG. 22 shows the way in which the images shown in FIG. 21 are viewed with the two eyes when displayed on an HMD;
- FIG. 23 is a view for explaining the status of merging of a stereo image, which is actually displayed on a left and a right display surface;
- FIG. 24 is a view showing how the horizontal positions X1 and X2 in FIG. 23 are normalized;
- FIG. 25 is a graph showing the correspondence relation between accommodation and vergence;
- FIG. 26 is a schematic view for describing the influence given to a stereo image viewer by the display area frames (display area edges) in a display apparatus having a right eye and a left eye image display area; and
- FIG. 27 is a schematic view showing a case in which the distances of the right and left edges 11Rrr and 11Rrl of the right eye image display area 11Rd of the right eye LCD 11R and the right and left edges 11Lrr and 11Lrl of the left eye image display area 11Ld of the left eye LCD 11L from one another are variable.
- FIG. 1 is a block diagram showing an embodiment of the stereo image display apparatus according to the present invention. A right eye and a left eye LCD 11R and 11L having a right eye and a left eye image display area, respectively, are provided for a right and a left eye 10R and 10L, respectively. Images on the display areas of the LCDs 11R and 11L are perceived as images produced by a right eye and a left eye eyepiece optical system, respectively. A right eye image signal from an image reproducing unit 31 is coupled to an image shifter 32R for conversion to provide a stereo image as shown in FIGS. 21(a) to 21(c) on the right eye LCD 11R. Specifically, the signal is coupled to a right eye LCD driver 33R to display a right eye image on the LCD 11R. Likewise, a left eye image signal from the image reproducing unit 31, for providing the stereo image as shown in FIGS. 21(a) to 21(c), is coupled through an image shifter 32L and a left eye LCD driver 33L to a left eye LCD 11L for displaying a left eye image thereon. - In the system shown in FIG. 1, an
eyesight detector 18 is provided for either of the right and left eyes 10R and 10L (i.e., the left eye 10L in this example) to detect the eyesight of that eye. The eyesight detector 18 includes a photoelectric converting element 17, which receives light from a light source 15 and also light reflected from the eyeball surface, which reflects the light from the light source 15. - The
eyesight detector 18 provides an eyesight detection signal as its output signal (which is data representing the portion of the image that is viewed by the viewer) to a line signal extractor 45. To the line signal extractor 45 are also coupled the right eye and left eye image signals from the image reproducing unit 31. From each of these signals, the line signal extractor 45 extracts a plurality of horizontal lines centered on the image portion viewed by the viewer, and supplies the extracted signal, together with the eyesight detection signal coupled to it, to a correlation calculator 46. The correlation calculator 46 calculates a parallax value between the right eye and left eye image signals from the correlation between the extracted pluralities of horizontal lines of signals, and outputs a parallax signal representing the calculated parallax value. - The
eyesight detector 18, line signal extractor 45 and correlation calculator 46 together constitute parallax reading means 40, which reads out a parallax concerning the images displayed on the display means on the basis of the right eye and left eye image signals. - The parallax signal outputted from the parallax reading means 40 (the correlation calculator 46) is coupled to an
address converter 47. The address converter 47 outputs data representing an address number corresponding to the value of the parallax signal coupled to it. According to the address number data, those entries of a table provided in a memory 48, serving as necessary shift data holding means which holds the necessary horizontal shift data corresponding to the right eye and left eye images, which correspond to the above address number are retrieved and outputted from the memory 48. The shift data, or shift signals representing the necessary shift amounts, which are thus read out, are coupled to the image shifters 32R and 32L. - The
image shifters 32R and 32L shift the horizontal display positions of the right eye and left eye images according to the shift signals coupled to them. - The
image shifters 32R and 32L, line signal extractor 45, correlation calculator 46 and address converter 47 may be constructed as a single collective digital circuit or as respective separate digital processors or circuits. It is also possible to include the image reproducing unit 31 and the right eye and left eye LCD drivers 33R and 33L in such a circuit. - In the above case, the horizontal display positions of the viewed image patterns are shifted (together with the background) on the right eye and left eye image display areas. Alternatively, it is possible to shift the right eye and left eye image display areas (i.e., the right eye and left eye LCDs 11R and 11L as display devices) as a whole, with the result that the horizontal display positions of the viewed image patterns displayed on these display areas are shifted (together with the background).
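The table lookup performed by the address converter 47 and the shift-data memory 48 can be sketched as follows; the quantization step, table size and shift values here are illustrative assumptions, not figures from the patent.

```python
# Hypothetical sketch of the address converter 47 and shift-data memory 48:
# a parallax value is quantized to an address number, which indexes a table
# of horizontal shift amounts (in pixels) for the right eye and left eye
# images. All table values below are illustrative only.

SHIFT_TABLE = [
    (0, 0),    # address 0: no parallax, no shift
    (-2, 2),   # address 1: small parallax
    (-4, 4),   # address 2: medium parallax
    (-8, 8),   # address 3: large parallax
]

def address_from_parallax(parallax, step=4, max_address=3):
    """Quantize a parallax value (pixels) to a table address (converter 47)."""
    return min(abs(parallax) // step, max_address)

def shift_amounts(parallax):
    """Return (right_shift, left_shift) read out of the table (memory 48)."""
    right, left = SHIFT_TABLE[address_from_parallax(parallax)]
    # The direction of the shift follows the sign of the parallax.
    if parallax < 0:
        right, left = -right, -left
    return right, left
```

The shift signals produced this way would then drive the image shifters 32R and 32L.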
- FIGS. 2(a) to 2(c) are schematic views for describing the line signal extraction and correlation calculation in the parallax reading means 40 in the apparatus shown in FIG. 1.
- As shown in FIG. 2(a), a right eye and a left eye image are displayed, and the position viewed by the viewer is detected by the eyesight detector 18 to determine the coordinates (x′, y′) of that position. - Then, as shown in FIG. 2(b), a plurality of line signals centered on line y′ are extracted from each of the right eye and left eye image signals. For brevity of description, FIGS. 2(a) to 2(c) show an example in which the sole line y′ of signal is extracted. Thus, a horizontal line video signal corresponding to the vertical coordinate y′ in the coordinates (x′, y′) is extracted as a left y′ line and a right y′ line image signal. The correlation between the left y′ line and right y′ line image signals extracted in the above way is calculated with respect to the horizontal coordinate x′ in the coordinates (x′, y′). As an example, a section ±Δx of the left y′ line video signal centered on x′ is taken to calculate the correlation between it and the right y′ line image signal. Specifically, with respect to a left eye image portion at the horizontal coordinate x′, the time difference of the most highly correlated portion of the right eye image is detected, and the amount of parallax is determined from the calculated time difference. A shift amount calculator calculates the necessary shift amount according to the data thus obtained.
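The line-correlation step described above can be sketched as follows, using a sum-of-squared-differences match over a ±Δx section as one simple correlation measure; the function and its parameters are illustrative, not from the patent.

```python
# A minimal sketch of the correlation step: take a section of the left y'
# scan line centred on the viewed coordinate x' and slide it along the right
# y' scan line; the offset of the best match is the parallax. An SSD score
# stands in here for the correlation described in the text.

def estimate_parallax(left_line, right_line, x, half_width):
    """Return the horizontal offset at which the left-eye section around x
    best matches the right-eye scan line (smallest sum of squared differences)."""
    section = left_line[x - half_width : x + half_width + 1]
    best_offset, best_score = 0, float("inf")
    # Only offsets that keep the compared section inside the right line.
    for offset in range(-(x - half_width),
                        len(right_line) - (x + half_width + 1) + 1):
        candidate = right_line[x - half_width + offset :
                               x + half_width + 1 + offset]
        score = sum((a - b) ** 2 for a, b in zip(section, candidate))
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset
```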
- In the embodiment of the apparatus according to the present invention described with reference to FIGS. 1 and 2(a) to 2(c), the parallax reading means 40 (correlation calculator 46), address converter 47, memory 48, image shifters 32R and 32L, and the right eye and left eye LCD drivers 33R and 33L together serve to control the horizontal display positions of the right eye and left eye images. - FIG. 3 is a perspective view showing the detailed construction of optical systems for displaying images as an essential element in the embodiment shown in FIG. 1. An optical system for the right eye will first be described. The system comprises the right eye eyepiece
optical system 12R, which is a prism having a convex mirror 12Rb as a bottom inner surface and a substantially diagonally provided inner central half mirror 12Rd. An image on the right eye LCD 11R as an image display element is incident on the corresponding top surface of the prism 12R, then transmitted through the half mirror 12Rd, and then reflected by the convex mirror 12Rb on the bottom inner surface of the prism. The reflected light is reflected by the lower surface of the half mirror 12Rd to the left side of the prism in the figure and is incident on the right eye 10R through the pupil 10Rp to form a virtual image on the retina. - An optical system for the left eye is line symmetric to the optical system for the right eye described above, and will be understood from the figure by replacing "R" in the reference symbols of the optical system for the right eye with "L".
- One feature (or element) of the present invention resides in shading-off means, which shades off in effect the edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left eye and right eye images with a binocular parallax therebetween as noted before. For example, as shown in FIG. 8(b), the image display area provided to the viewer has shaded-off or gradated edge portions. For this constitution, various elements described in Japanese Laid-Open Patent Publication Heisei 7-325266 (hereinafter referred to as the proposal publication) by the present applicant may be employed.
- Some of the elements described in the proposal publication which are recommendable for application to the present invention will be briefly described hereinunder, by providing like reference symbols to parts like those shown in FIG. 3.
- FIG. 4 is an optical path diagram showing an example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as
Embodiment 1 on the basis of FIG. 4. - Referring to FIG. 4, in order to lead an image light flux from the right eye (or left eye) LCD 11R (or 11L) as image display element to the viewer's right eye (or left eye) pupil 10Rp (or 10Lp), a
beam splitter prism 12R (or 12L) is provided, which has an inclined half mirror 12Rd (or 12Ld) disposed at the intersection between the optical axis of the LCD 11R (or 11L) and the viewer's eyesight axis. The beam splitter prism 12R (or 12L) has a convex mirror 12Rb (or 12Lb) at the bottom. An image light flux from the LCD 11R (or 11L) is incident on the top surface of the beam splitter prism 12R (or 12L), then transmitted through the half mirror 12Rd (or 12Ld), and then reflected by the convex mirror 12Rb (or 12Lb) as the bottom inner surface of the prism. The reflected light flux is reflected by the lower surface of the half mirror 12Rd (or 12Ld) and is incident on the right eye (or left eye) pupil 10Rp (or 10Lp) to form a virtual image on the retina. - Specifications of this optical system are that
- a 1.3-inch LCD is used,
- satisfactory image-focusing angles are 35° horizontal and 26.6° vertical, and
- viewing angles are 58° horizontal and 44.2° vertical (the satisfactory image-focusing angles being about 60% of the viewing angles).
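The parenthetical ratio can be verified from the stated angles; this is a plain arithmetic check, not additional specification data.

```python
# Quick check of the stated ratio: the satisfactory image-focusing angles
# are about 60% of the viewing angles for this optical system.
horizontal_ratio = 35 / 58      # satisfactory / viewing, horizontal
vertical_ratio = 26.6 / 44.2    # satisfactory / viewing, vertical
print(round(horizontal_ratio, 2), round(vertical_ratio, 2))  # → 0.6 0.6
```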
- With a prism size of 29 × 24 × 27 mm, the main optical axis of the light fluxes along the edges of the right eye (or left eye) LCD 11R (or 11L) as the image display element is not projected onto the viewer's eye. That is, the edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left eye and right eye images are shaded off in effect.
- FIG. 5 is an optical path diagram showing another example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as Embodiment 6 on the basis of FIG. 9.
- Referring to FIG. 5, like the example shown in FIG. 4, in order to lead an image light flux from the right eye (or left eye) LCD 11R (or 11L) as image display element to the viewer's right eye (or left eye) pupil 10Rp (or 10Lp), a
beam splitter prism 12R (or 12L) is provided, which has an inclined half mirror 12Rd (or 12Ld) disposed at the intersection between the optical axis of the LCD 11R (or 11L) and the viewer's eyesight axis. The beam splitter prism 12R (or 12L) has a convex mirror 12Rb (or 12Lb) at the bottom. An image light flux from the LCD 11R (or 11L) is incident on the top surface of the beam splitter prism 12R (or 12L), then transmitted through the half mirror 12Rd (or 12Ld), and then reflected by the convex mirror 12Rb (or 12Lb) as the bottom inner surface of the prism. The reflected light flux is reflected by the lower surface of the half mirror 12Rd (or 12Ld) and is incident on the right eye (or left eye) pupil 10Rp (or 10Lp) to form a virtual image on the retina. - Specifications of this optical system are that
- a 1.3-inch LCD is used,
- satisfactory image-focusing angles are 30° horizontal and 23° vertical, and
- viewing angles are 60° horizontal and 47° vertical (the satisfactory image-focusing angles being about 50% of the viewing angles).
- In this example, the end surface (or top surface) of the prism 12R (or 12L) on the side of the right eye (or left eye) LCD 11R (or 11L) as the image display element is a non-spherical surface such that the convex power is increased as one goes away from the optical axis and is changed to concave power as one goes outward from a certain position. - By the provision of the above non-spherical surface, the following effects are obtainable.
- (1) Within the satisfactory angle, a negative distortion generated in the convex mirror 12Rb (or 12Lb) is corrected to −5% or below, and outside the satisfactory angle a great negative distortion is generated, thus permitting a broad angle to be readily secured. (In forward ray tracing the distortion is positive, although it is negative in backward ray tracing.)
- (2) The angle between the main optical axis of the edges of the right eye (or left eye) LCD 11R (or 11L) as the image display element and the image display element is increased, thus darkening the image of the image display element edges (i.e., controlling the inclination of the main optical axis of the eyepiece optical system). The inclination angles of the main optical axis in the long sides, short sides and diagonal of the LCD are as follows.
- Inclination angle of the main optical axis in the long sides of LCD: 14°
- Inclination angle of the main optical axis in the short sides of LCD: 7°
- Inclination angle of the main optical axis in the diagonal of LCD: 35°
- (3) Astigmatism and coma (internal coma) are generated to deteriorate the resolution in the edge portions of the image display area and blur the boundary between the image and non-image areas. (The image-focusing performance of the eyepiece optical system is deteriorated in the edge portions of the image display area.)
- By applying the optical system shown in FIG. 5, the frames of the display areas of the left and right images are shaded off by the above effects.
- FIG. 6 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as Embodiment 7 on the basis of FIG. 10.
- Referring to FIG. 6, like the example shown in FIG. 4, in order to lead an image light flux from the right eye (or left eye) LCD 11R (or 11L) as image display element to the viewer's right eye (or left eye) pupil 10Rp (or 10Lp), a beam splitter prism 12R (or 12L) is provided, which has an inclined half mirror 12Rd (or 12Ld) disposed at the intersection between the optical axis of the LCD 11R (or 11L) and the viewer's eyesight axis. The beam splitter prism 12R (or 12L) has a convex mirror 12Rb (or 12Lb) at the bottom. An image light flux from the LCD 11R (or 11L) is incident on the top surface of the beam splitter prism 12R (or 12L), then transmitted through the half mirror 12Rd (or 12Ld), and then reflected by the convex mirror 12Rb (or 12Lb) as the bottom inner surface of the prism. The reflected light flux is reflected by the lower surface of the half mirror 12Rd (or 12Ld) and is incident on the right eye (or left eye) pupil 10Rp (or 10Lp) to form a virtual image on the retina. - Specifications of this optical system are that
- a 1.3-inch LCD is used,
- satisfactory image-focusing angles are 12° horizontal and 9° vertical, and viewing angles are 38° horizontal and 29° vertical (the satisfactory image-focusing angles being about 32% of the viewing angles).
- In this example, a light-blocking frame SF having a similar shape to that of the edges of the LCD 11R (or 11L) is used to prevent the main light flux of the light fluxes from the edges of the LCD 11R (or 11L) from reaching the viewer's eye.
- When the light-blocking frame SF is located at the position of the LCD 11R (or 11L), it serves as a visual field stop, and its shape is projected onto the eye. For this reason, the light-blocking frame SF should be located at a position spaced apart from the LCD 11R (or 11L) by more than the depth-of-focus. Preferably, the light-blocking frame SF is located at a position spaced apart from the LCD by more than 20 times the depth-of-focus. By so doing, the effect of shading-off can be further increased. Furthermore, by increasing the size of the light-blocking frame SF, the light fluxes from the edges of the LCD 11R (or 11L) can be perfectly blocked.
- FIG. 7 is an optical path diagram showing a further example of the eyepiece optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as Embodiment 13 on the basis of FIG. 14.
- Referring to FIG. 7, like the example shown in FIG. 4, in order to lead an image light flux from the right eye (or left eye) LCD 11R (or 11L) as image display element to the viewer's right eye (or left eye) pupil 10Rp (or 10Lp), a beam splitter prism 12R (or 12L) is provided, which has an inclined half mirror 12Rd (or 12Ld) disposed at the intersection between the optical axis of the LCD 11R (or 11L) and the viewer's eyesight axis. The beam splitter prism 12R (or 12L) has a convex mirror 12Rb (or 12Lb) at the bottom. An image light flux from the LCD 11R (or 11L) is incident on the top surface of the beam splitter prism 12R (or 12L), then transmitted through the half mirror 12Rd (or 12Ld), and then reflected by the convex mirror 12Rb (or 12Lb) as the bottom inner surface of the prism. The reflected light flux is reflected by the lower surface of the half mirror 12Rd (or 12Ld) and is incident on the right eye (or left eye) pupil 10Rp (or 10Lp) to form a virtual image on the retina. - In this example of FIG. 7, the portion of the convex mirror 12Rb (or 12Lb) which is coated on the bottom portion of the
beam splitter prism 12R (or 12L) is limited to a portion A smaller than the entire bottom surface, the other portion around it being transparent or light-absorbing. - Specifications of this optical system are that
- a 1.3-inch LCD is used,
- satisfactory image-focusing angles are 12° horizontal and 9° vertical, and
- viewing angles are 38° horizontal and 29° vertical (the satisfactory image-focusing angles being about 32% of the viewing angles).
- In the example of FIG. 7, the portion outside the mirror-coated portion A is not mirror-coated, so as to prevent the main light beam from the outermost portion of the image display element from being incident on the viewer's eyes. This limitation of the mirror-coated portion completely cuts off the light beam from the outermost portion. Thus, the main light beam from the outermost portions of the right eye and left eye LCDs 11R and 11L as the image display elements is not incident on the viewer's eyes, shading off in effect the edge portions and the neighborhood thereof of the display areas corresponding to the left eye and right eye images.
- FIGS. 8(a) and 8(b) are views showing optical elements of a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means. The element shown in FIG. 8(a) is described in the proposal publication as Embodiment 9 on the basis of FIG. 12.
- In the element shown in FIG. 8(a), the light transmittance is reduced stepwise toward the edges. In the element shown in FIG. 8(b), the light transmittance is reduced continuously toward the edges. Such a light-blocking member is disposed between the right eye (or left eye) LCD 11R (or 11L) as the image display element in the optical systems shown in FIG. 3 and an illuminating system (not shown) behind the LCD, for illumination light intensity control to darken and obscure the image of the image display area edges. The edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left eye and right eye images are thus shaded off in effect.
- FIGS. 9(a) and 9(b) are views showing the disposition of an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as
Embodiment 10 on the basis of FIG. 19. - Usually, as shown in FIG. 9(b), a backlight BKL is disposed in the proximity of an LCD used as image display element. In this example, as shown in FIG. 9(a), the backlight BKL is disposed such that it is far apart from the LCD. This disposition aims at positively generating illumination irregularities with respect to the LCD 11R (or 11L) to relatively darken the image of the edge portions of the LCD as the image display element. The edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left and right eye images are thus shaded off in effect.
- FIG. 10 is a view showing an optical element in a further example of the optical system, in which the optical systems shown in FIG. 3 function as shading-off means. In the proposal publication, this system is described as Embodiment 13 on the basis of FIG. 16.
- The element shown in FIG. 10 is a diffuser, the diffusing effect of which is increased stepwise as one goes toward the edges. As in the case of the light-blocking member shown in FIG. 8(b), it is possible to arrange such that the diffusing effect is increased continuously as one goes toward the edges. This diffuser is disposed between the right eye (or left eye) LCD 11R (or 11L) as the image display element and the beam splitter prism 12R (or 12L) in the optical systems shown in FIG. 3, to darken and obscure the image of the image display area edges. The edge portions (i.e., frames and the neighborhood thereof) of the display areas of the left and right eye images are thus shaded off in effect. It is also possible to dispose the diffuser at a suitable position on the optical path in the beam splitter prism 12R (or 12L). - FIG. 11 is a block diagram showing a right eye image system in a different embodiment of the present invention. While FIG. 11 shows only the right eye image system, the embodiment also comprises a left eye image system which is alike in construction.
- A right eye and a left eye image signal, as described before in connection with FIGS. 19(a) to 19(c), are coupled to a
parallax reading circuit 71. According to the two image signals, the parallax reading circuit 71 forms a parallax signal representing a parallax concerning the left and right eye images. The parallax signal is coupled to a shift amount converter 72. The shift amount converter 72 derives an adequate shift amount (i.e., a shift amount of the horizontal positions of the images, or a control of the left and right masking widths of the LCD display areas) corresponding to the parallax value represented by the parallax signal, and generates a shift amount signal representing an image shift amount. The shift amount signal is coupled to a read controller 73 and also to a masking signal generator 74. - The right eye image signal (which is an analog signal) is also coupled to an A/
D converter 75. The output signal (i.e., right eye digital image signal) from the A/D converter 75 is written in a memory 76 under control of an externally provided write control signal. The right eye digital image signal thus written in the memory 76 is read out under control of a read control signal, which is generated in the read controller 73 in correspondence to the shift amount signal. The right eye digital image signal read out from the memory 76 represents a right eye image as shown in FIGS. 21(a) to 21(c) which is in a proper horizontal position in the display area. This signal is coupled as one input to a mixer 77. As the other input to the mixer 77 is coupled a masking signal, which is generated in the masking signal generator 74 in correspondence to the shift amount signal. The masking signal prescribes the form (i.e., width) of a left and a right masking portion of the LCD display area as schematically shown on the right side of the block 74 in FIG. 11. - The
mixer 77 mixes together the two inputs, i.e., the right eye digital image signal read out from the memory 76 and the masking signal from the masking signal generator 74, to form a digital image signal which represents an image on the display area with the left and right sides masked by an adequate width, i.e., an image on the display area with the left and right edges and the neighborhood thereof in monochrome display, such as black display (as will be described later with reference to FIGS. 12(a) to 12(c)). This signal is coupled to a D/A converter 78. The D/A converter 78 generates an analog image signal, which is coupled to an LCD driver 33R. The LCD driver 33R drives the LCD 11R to display the right eye image according to the right eye analog image signal coupled thereto. - A left eye image system likewise comprises an A/
D converter 75, a memory 76, a mixer 77 and a D/A converter 78. The output of the D/A converter in this system is coupled to a left eye LCD driver 33L to drive the left eye LCD 11L and display the left eye image (see FIG. 1). - FIGS. 12(a) to 12(c) are views for describing the control operation of the width of masking (i.e., conversion of the left and right edges and the neighborhood thereof of the image display areas into a predetermined monochrome display, such as black display) in the embodiment described before in connection with FIG. 11.
- The images shown in FIGS. 12(a) to 12(c), like those shown in FIGS. 21(a) to 21(c) and 22, include a sphere and a triangular pyramid. The sphere comes closer to the viewer while the triangular pyramid moves away from the viewer in the order of FIGS. 12(a) to 12(c). As is seen from the figures, the width of the monochrome display parts is controlled such that it is increased on the right side of the display area and reduced on the left side when the position of the image on the display area is shifted to the left, while it is increased on the left side of the display area and reduced on the right side when the image position is shifted to the right.
- The width WLL of the monochrome part of the left eye image display area, i.e., the left edge and neighborhood thereof of the display area, is regulated to be greater than the width WLR of the monochrome part of the same display area, i.e., the right edge and neighborhood thereof. Likewise, the width WRR of the monochrome part of the right eye image display area, i.e., the right edge and neighborhood thereof, is regulated to be greater than the width WRL of the monochrome part of the same display area, i.e., the left edge and neighborhood thereof.
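The masking-width rule described above can be sketched as follows; the base width and the bias that keeps the outer-edge mask wider are illustrative values, not figures from the patent.

```python
# A sketch of the masking-width control: when the image is shifted left, the
# monochrome mask widens on the right of the display area and narrows on the
# left, and vice versa. BASE and BIAS are illustrative values only.

BASE = 20   # nominal mask width in pixels (illustrative)
BIAS = 6    # extra width on the outer edge of each eye's area (illustrative)

def mask_widths(shift, eye):
    """Return (left_width, right_width) of the monochrome parts for one eye.

    shift > 0 means the image is shifted right; shift < 0 shifted left.
    eye is "L" or "R": the left eye keeps a wider left-edge mask, the right
    eye a wider right-edge mask, as described in the text.
    """
    left = BASE + shift    # image moves right -> left mask widens
    right = BASE - shift   # ... and the right mask narrows
    if eye == "L":
        left += BIAS
    else:
        right += BIAS
    return max(left, 0), max(right, 0)
```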
- FIGS. 13(a) to 13(c) are views for describing how masked images (with monochrome parts) are seen.
- FIGS. 13(a) to 13(c) correspond to FIGS. 12(a) to 12(c). As shown, the position of the masked parts (i.e., monochrome parts) of the images (i.e., the position of the edge portions of the images) is changed from a close position to a distant position with the changes in the masking width. Thus, even with the presence of the masked parts in the viewer's visual field, the viewer's sense of actually being present in the image scene is enhanced with respect to the approaching motion of the sphere (in the depth direction).
- FIG. 14 is a block diagram showing a further embodiment of the present invention.
- The left eye and right eye image signals from the image reproducing unit 31 shown in FIG. 1 are coupled to a low-
pass filter 81 for smoothing and high frequency noise removal. The smoothed left eye and right eye image signals outputted from the low-pass filter 81 are coupled to the respective left eye and right eye LCDs noted above and also to a differentiating circuit 82. The differentiating circuit 82 differentiates the two input image signals and extracts signals corresponding to the edge portions of the images. Of the output signals of the differentiating circuit 82, that of the left eye image signal system is coupled to a squaring and integrating circuit 83. The squaring and integrating circuit 83 first squares the output of the differentiating circuit 82, which is a signal comprising both positive and negative components (i.e., differential components corresponding to rising and falling edges). The squared signal comprises positive components only. The circuit 83 then integrates the squared signal for each predetermined time section. The integrated value is the greater the higher the frequency of appearance of differential peaks in the predetermined time section. Consequently, the output of the squaring and integrating circuit 83 has a greater value the finer the images in the areas corresponding to the predetermined time section, i.e., the relatively higher the mean spatial frequency. To the squaring and integrating circuit 83 is coupled an eyesight detection signal representing the position viewed by the viewer. The above predetermined time section for the integration is selected such that it is matched to the viewer's viewed position represented by the detection signal. - The output of the squaring and integrating
circuit 83 is coupled to a memory read position controller 84 as the next stage. The eyesight detection signal representing the viewer's viewed position, which is obtained by the eyesight detecting means such as the eyesight detector 18 described before in connection with FIG. 1 and is coupled to the squaring and integrating circuit 83, is also coupled to the memory read position controller 84. The memory read position controller 84 determines the area of the window noted above according to the output signal of the squaring and integrating circuit 83, which is indicative of whether the spatial frequency is high or not. The window area to be correlation-detected between the left eye and right eye images is selected to be relatively small when the spatial frequency is relatively high. When the spatial frequency is relatively low, the window area is selected to be relatively large. In this embodiment, the window position is selected according to the eyesight detection signal so as to match the center position of the window to the viewer's viewed position. In other words, the memory read position controller 84 outputs a signal representing the area and center position of the window. - The output of the differentiating
circuit 82, i.e., the outputs of the left eye and right eye image systems, are stored in respective memories. The data stored in the memories are read out under control of the output signal of the memory read position controller 84, representing the area and center position of the window. These read-out data, which each correspond to an appropriate window area, are coupled to a correlation calculator 87 for deriving a parallax concerning the left eye and right eye images. The output of the correlation calculator 87, i.e., a parallax signal, is coupled to a shift amount calculator 88, which obtains a signal for appropriately shifting the images according to the difference between the present and desired parallax values. The output of the shift amount calculator 88 is coupled to the image shifters, which were described before in connection with FIG. 1, for horizontal image position control concerning the left eye and right eye images. - FIG. 15 is a flow chart for describing the operation of the embodiment shown in FIG. 14. The low-pass filter 81 smoothes and removes the high frequency noise components from the left eye and right eye image signals as its inputs (step S1). The two smoothed image signals are then subjected to edge extraction. More specifically, the differentiating
circuit 82 differentiates the smoothed image signals to extract signals corresponding to image edges (step S2). Of these signals obtained as a result of the edge extraction, the squaring and integrating circuit 83 detects a frequency characteristic (i.e., discriminates whether the spatial frequency of the images is relatively high or not) (step S3). According to the result of the process in step S3, the area of the correlation detection subject window is determined (step S4). In step S4, the horizontal and vertical sizes n and m of the window may be determined according to the horizontal and vertical spatial frequencies, respectively. - After the window area determination in step S4, the center position (Xt, Yt) of the window is determined (step S5). For steps S3 and S5, the eyesight detection signal representing the position viewed by the viewer, which is obtained in the eyesight detecting means such as the
eyesight detector 18 as described before in connection with FIG. 1, is supplied to the squaring and integrating circuit 83 and the memory read position controller 84 as described before in connection with FIG. 14. The correlation calculator 87 derives the parallax concerning the left eye and right eye images with respect to the window area, which has been specified in the processes of the steps S4 and S5 (step S6). The shift amount calculator 88 obtains a signal for appropriately shifting the images according to the result of the process in the step S6 (i.e., the parallax signal value) (step S7). The result of the process in the step S7 is coupled to the image shifters. - FIG. 16 is a view for describing the operation of determining the area and position of the window in the embodiment shown in FIG. 14. As described before in connection with the process in the step S4 in FIG. 15, the horizontal and vertical sizes n and m of the window are determined according to the horizontal and vertical spatial frequencies, respectively. Then, the center position (Xt, Yt) of the window is determined according to the eyesight detection signal representing the position viewed by the viewer, obtained in the eyesight detecting means such as the
eyesight detector 18 described before in connection with FIG. 1 (step S5 in FIG. 15). - FIGS. 17(a) and 17(b) are views for describing the operation of the window area determination in the embodiment shown in FIG. 14. With a relatively complicated image (i.e., when the spatial frequency is relatively high), the window area is set to be small, as shown by the dashed rectangle in FIG. 17(a). With a relatively simple image (i.e., when the spatial frequency is relatively low), the window area is set to be relatively large, as shown by the dashed rectangle in FIG. 17(b). In the window area determination process described above in connection with the embodiment shown in FIG. 14, spatial frequency detection is made in effect with respect to the images concerning the edges extracted in the edge extraction process (corresponding to the image frames as shown in FIGS. 17(a) and 17(b)).
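The smoothing and edge-extraction front end (steps S1 and S2 of FIG. 15) might be sketched as follows. This is a minimal illustration assuming a simple moving-average low-pass filter and a first-difference differentiator; the function and parameter names are this sketch's own, not taken from the patent.

```python
import numpy as np

def smooth_and_extract_edges(image, kernel_size=5):
    """Steps S1-S2 sketch: low-pass filter each image row, then
    differentiate horizontally to extract edge signals."""
    # Step S1: a moving-average kernel removes high-frequency noise.
    kernel = np.ones(kernel_size) / kernel_size
    smoothed = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, image)
    # Step S2: the first horizontal difference approximates the
    # differentiating circuit; large magnitudes mark image edges.
    edges = np.diff(smoothed, axis=1)
    return smoothed, edges
```

In a real two-channel system this would be applied to both the left eye and right eye image signals before the window determination.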
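The window determination of steps S3 through S5 can be sketched as below: squaring and integrating the edge signals yields an energy measure that stands in for the spatial frequency, the window sizes n and m are chosen per axis (small for a complicated image, large for a simple one, as in FIGS. 17(a) and 17(b)), and the window is centered on the viewed position (Xt, Yt) from the eyesight detector. The concrete window sizes, the threshold, and all names here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def choose_window(edge_h, edge_v, gaze, frame_shape,
                  small=(32, 24), large=(128, 96), threshold=10.0):
    """Steps S3-S5 sketch: pick window sizes (n, m) from edge-signal
    energy and center the window at the gaze point, clamped to the frame."""
    # Step S3: squaring and integrating the edge signals gives an
    # energy measure that grows with spatial frequency.
    energy_h = np.mean(edge_h ** 2)
    energy_v = np.mean(edge_v ** 2)
    # Step S4: high energy (complicated image) -> small window;
    # low energy (simple image) -> large window, per axis.
    n = small[0] if energy_h > threshold else large[0]
    m = small[1] if energy_v > threshold else large[1]
    # Step S5: center the window at the viewed position, clamped so
    # the whole window stays inside the frame.
    h, w = frame_shape
    xt = int(np.clip(gaze[0], n // 2, w - n // 2))
    yt = int(np.clip(gaze[1], m // 2, h - m // 2))
    return (n, m), (xt, yt)
```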
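Finally, the correlation and shift stages (steps S6 and S7) might look like the following sketch. The patent specifies only that a correlation between the left eye and right eye window contents is calculated and that the shift follows from the difference between the present and desired parallax values; the dot-product correlation criterion, the wrap-around shift, and the half-and-half split of the correction between the two images are assumptions of this illustration.

```python
import numpy as np

def parallax_and_shift(left_win, right_win, desired_parallax=0, max_disp=16):
    """Steps S6-S7 sketch: find the horizontal offset of maximum
    correlation (the parallax), then derive a per-image shift amount."""
    best_d, best_score = 0, -np.inf
    # Step S6: exhaustive search over candidate disparities.
    for d in range(-max_disp, max_disp + 1):
        shifted = np.roll(right_win, d, axis=1)
        score = np.sum(left_win * shifted)  # correlation at offset d
        if score > best_score:
            best_d, best_score = d, score
    parallax = best_d
    # Step S7: the correction is the difference between the desired and
    # present parallax; here each image absorbs half of it.
    correction = desired_parallax - parallax
    return parallax, correction / 2.0
```

A production implementation would typically use a normalized correlation or sum-of-absolute-differences measure and avoid the wrap-around of np.roll at the window borders.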
- As has been described in the foregoing, according to the present invention due consideration is given to the influence, on the viewer viewing a stereo image, of the edge portions of the display areas of the display means, i.e., the boundaries between the image and non-image areas of the display means, thus permitting stereo image display which does not spoil the viewer's sense of actually being present at the scene of the image. In addition, it is possible to obtain adequate detection of the correlation of the left eye and right eye images to each other.
- Changes in construction will occur to those skilled in the art and various apparently different modifications and embodiments may be made without departing from the scope of the present invention. The matter set forth in the foregoing description and accompanying drawings is offered by way of illustration only. It is therefore intended that the foregoing description be regarded as illustrative rather than limiting.
Claims (10)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP8-157577 | 1996-05-29 | ||
JP8157577A JPH09322199A (en) | 1996-05-29 | 1996-05-29 | Stereoscopic video display device |
JP157577/1996 | 1996-05-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20010030715A1 true US20010030715A1 (en) | 2001-10-18 |
US6324001B2 US6324001B2 (en) | 2001-11-27 |
Family
ID=15652742
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/865,187 Expired - Lifetime US6324001B2 (en) | 1996-05-29 | 1997-05-29 | Stereo image display apparatus |
Country Status (2)
Country | Link |
---|---|
US (1) | US6324001B2 (en) |
JP (1) | JPH09322199A (en) |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1343334A2 (en) * | 2002-03-08 | 2003-09-10 | Topcon Corporation | Device and method for displaying stereo image |
US20050285997A1 (en) * | 2002-09-26 | 2005-12-29 | Yoshihide Koyama | 2d/3d switch liquid crystal display panel and 2d/3d selection liquid crystal display |
US20070035618A1 (en) * | 2004-04-26 | 2007-02-15 | Olympus Corporation | Image processing apparatus |
FR2900475A1 (en) * | 2006-04-26 | 2007-11-02 | Essilor Int | DISPLAY COMPRISING A PAIR OF BINOCULAR GLASSES AND WITH A DEVICE FOR ADJUSTING THE IMAGE |
US20080002859A1 (en) * | 2006-06-29 | 2008-01-03 | Himax Display, Inc. | Image inspecting device and method for a head-mounted display |
US20100289882A1 (en) * | 2009-05-13 | 2010-11-18 | Keizo Ohta | Storage medium storing display control program for controlling display capable of providing three-dimensional display and information processing device having display capable of providing three-dimensional display |
US20100309295A1 (en) * | 2009-06-04 | 2010-12-09 | Chow Kenny W Y | 3d video processor integrated with head mounted display |
US20110032252A1 (en) * | 2009-07-31 | 2011-02-10 | Nintendo Co., Ltd. | Storage medium storing display control program for controlling display capable of providing three-dimensional display and information processing system |
NL1029968C2 (en) * | 2005-03-10 | 2011-02-17 | Minoru Inaba | DIGITAL STEREO CAMERA / DIGITAL STEREO VIDEO CAMERA, 3-DIMENSIONAL DISPLAY, 3-DIMENSIONAL PROJECTOR, AND PRINTER AND STEREOVIEWER. |
US20110102425A1 (en) * | 2009-11-04 | 2011-05-05 | Nintendo Co., Ltd. | Storage medium storing display control program, information processing system, and storage medium storing program utilized for controlling stereoscopic display |
US20110254926A1 (en) * | 2010-04-16 | 2011-10-20 | Ushiki Suguru | Data Structure, Image Processing Apparatus, Image Processing Method, and Program |
EP2389003A1 (en) * | 2009-01-19 | 2011-11-23 | Minoru Inaba | Three-dimensional video image pick-up and display system |
EP2536159A1 (en) * | 2010-02-12 | 2012-12-19 | Sony Corporation | Image processing device, image processing method, and program |
US20130010093A1 (en) * | 2010-04-01 | 2013-01-10 | Thomson Licensing Llc | Method and system of using floating window in three-dimensional (3d) presentation |
US20130141425A1 (en) * | 2011-12-06 | 2013-06-06 | Novatek Microelectronics Corp. | Three-dimension image processing method |
CN103238341A (en) * | 2010-12-09 | 2013-08-07 | 索尼公司 | Image processing device, image processing method, and program |
US20130271569A1 (en) * | 2010-12-10 | 2013-10-17 | Fujitsu Limited | Stereoscopic moving picture generating apparatus and stereoscopic moving picture generating method |
US8791989B2 (en) | 2009-01-21 | 2014-07-29 | Nikon Corporation | Image processing apparatus, image processing method, recording method, and recording medium |
US9019261B2 (en) | 2009-10-20 | 2015-04-28 | Nintendo Co., Ltd. | Storage medium storing display control program, storage medium storing library program, information processing system, and display control method |
US20150198808A1 (en) * | 2012-07-24 | 2015-07-16 | Sony Corporation | Image display apparatus and method for displaying image |
US9128293B2 (en) | 2010-01-14 | 2015-09-08 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
US20160320623A1 (en) * | 2015-05-01 | 2016-11-03 | Seiko Epson Corporation | Transmission-type display |
US20160321967A1 (en) * | 2014-08-11 | 2016-11-03 | Sung Jae Cho | Three-Dimensional Label Having Moving Patterns Using Fine Patterns And Microlens |
TWI573435B (en) * | 2009-01-20 | 2017-03-01 | Inaba Minoru | Dimensional image camera display system |
US9693039B2 (en) | 2010-05-27 | 2017-06-27 | Nintendo Co., Ltd. | Hand-held electronic device |
US20170212587A1 (en) * | 2014-09-29 | 2017-07-27 | Kyocera Corporation | Electronic device |
US9841598B2 (en) * | 2013-12-31 | 2017-12-12 | 3M Innovative Properties Company | Lens with embedded multilayer optical film for near-eye display systems |
GB2553032A (en) * | 2016-07-01 | 2018-02-21 | Google Llc | Head mounted display device having display panels with asymetric panel borders for improved nasal FOV |
CN107945649A (en) * | 2014-08-11 | 2018-04-20 | 赵成载 | The three-dimensional label for moving pattern using fine pattern and lenticule |
US20180284885A1 (en) * | 2017-03-31 | 2018-10-04 | Sony Interactive Entertainment LLC | Depth-Keying of Web Content |
US20190265481A1 (en) * | 2018-02-26 | 2019-08-29 | Shanghai Xiaoyi Technology Co., Ltd. | Method, device, and storage medium for virtual reality display |
US10642044B2 (en) | 2014-04-09 | 2020-05-05 | 3M Innovative Properties Company | Near-eye display system having a pellicle as a combiner |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11327460A (en) * | 1998-05-08 | 1999-11-26 | Shimadzu Corp | Head mounted type display device |
JP4149037B2 (en) * | 1998-06-04 | 2008-09-10 | オリンパス株式会社 | Video system |
JP2000013818A (en) | 1998-06-23 | 2000-01-14 | Nec Corp | Stereoscopic display device and stereoscopic display method |
US7092003B1 (en) * | 1999-01-21 | 2006-08-15 | Mel Siegel | 3-D imaging arrangements |
JP2001133724A (en) * | 1999-08-25 | 2001-05-18 | Olympus Optical Co Ltd | Head-mounted type video display device |
JP2001101415A (en) * | 1999-09-29 | 2001-04-13 | Fujitsu Ten Ltd | Image recognizing device and image processor |
JP4573393B2 (en) * | 2000-01-06 | 2010-11-04 | オリンパス株式会社 | Image display device |
KR100430528B1 (en) * | 2002-07-09 | 2004-05-10 | 한국전자통신연구원 | Vergence control apparatus using oversampling and variable addressing technique and method thereof, and parallel-axis stereo camera system using the same apparatus |
JP4179946B2 (en) * | 2003-08-08 | 2008-11-12 | オリンパス株式会社 | Stereoscopic endoscope device |
RU2006129293A (en) * | 2004-01-14 | 2008-02-20 | Присижн Оптикс Корпорейшн (Us) | OPTICAL DATA DEVICE FOR STEREOSCOPIC IMAGE FORMING SYSTEMS |
US7483059B2 (en) * | 2004-04-30 | 2009-01-27 | Hewlett-Packard Development Company, L.P. | Systems and methods for sampling an image sensor |
JP4707368B2 (en) | 2004-06-25 | 2011-06-22 | 雅貴 ▲吉▼良 | Stereoscopic image creation method and apparatus |
JP4594673B2 (en) * | 2004-08-18 | 2010-12-08 | オリンパス株式会社 | Display control device for stereoscopic endoscope |
US8179423B2 (en) * | 2005-08-22 | 2012-05-15 | Ricoh Company, Ltd. | Image display system, an image display method, a coding method, and a printed matter for stereoscopic viewing |
KR100947366B1 (en) * | 2007-05-23 | 2010-04-01 | 광운대학교 산학협력단 | 3D image display method and system thereof |
KR20100002032A (en) * | 2008-06-24 | 2010-01-06 | 삼성전자주식회사 | Image generating method, image processing method, and apparatus thereof |
JP2010098567A (en) * | 2008-10-17 | 2010-04-30 | Seiko Epson Corp | Head mount full-face type image display device |
TWI527429B (en) * | 2008-10-28 | 2016-03-21 | 皇家飛利浦電子股份有限公司 | A three dimensional display system |
US8565516B2 (en) * | 2010-02-05 | 2013-10-22 | Sony Corporation | Image processing apparatus, image processing method, and program |
JP5545140B2 (en) * | 2010-09-07 | 2014-07-09 | ソニー株式会社 | Display control apparatus, display control method, and program |
WO2013125138A1 (en) * | 2012-02-22 | 2013-08-29 | ソニー株式会社 | Display apparatus, image processing apparatus, image processing method, and computer program |
US20130300634A1 (en) * | 2012-05-09 | 2013-11-14 | Nokia Corporation | Method and apparatus for determining representations of displayed information based on focus distance |
JP6596972B2 (en) * | 2015-06-24 | 2019-10-30 | 凸版印刷株式会社 | Display device, parallax image display program, and parallax image providing method |
JP2020058051A (en) * | 2019-12-05 | 2020-04-09 | マクセル株式会社 | Broadcast receiver and application control method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0888017A2 (en) * | 1993-08-26 | 1998-12-30 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic image display apparatus and related system |
JPH08322004A (en) * | 1995-05-24 | 1996-12-03 | Olympus Optical Co Ltd | Stereoscopic display device |
- 1996-05-29 JP JP8157577A patent/JPH09322199A/en active Pending
- 1997-05-29 US US08/865,187 patent/US6324001B2/en not_active Expired - Lifetime
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030174204A1 (en) * | 2002-03-08 | 2003-09-18 | Topcon Corporation | Device and method for displaying stereo image |
EP1343334A3 (en) * | 2002-03-08 | 2004-01-07 | Topcon Corporation | Device and method for displaying stereo image |
EP1534021A1 (en) * | 2002-03-08 | 2005-05-25 | Topcon Corporation | Device and method for displaying stereo image |
US7193626B2 (en) | 2002-03-08 | 2007-03-20 | Topcon Corporation | Device and method for displaying stereo image |
EP1343334A2 (en) * | 2002-03-08 | 2003-09-10 | Topcon Corporation | Device and method for displaying stereo image |
US20050285997A1 (en) * | 2002-09-26 | 2005-12-29 | Yoshihide Koyama | 2d/3d switch liquid crystal display panel and 2d/3d selection liquid crystal display |
US7199845B2 (en) * | 2002-09-26 | 2007-04-03 | Sharp Kabushiki Kaisha | 2D/3D switch liquid crystal display panel and 2D/3D selection liquid crystal display |
US20070035618A1 (en) * | 2004-04-26 | 2007-02-15 | Olympus Corporation | Image processing apparatus |
NL1029968C2 (en) * | 2005-03-10 | 2011-02-17 | Minoru Inaba | DIGITAL STEREO CAMERA / DIGITAL STEREO VIDEO CAMERA, 3-DIMENSIONAL DISPLAY, 3-DIMENSIONAL PROJECTOR, AND PRINTER AND STEREOVIEWER. |
WO2007125257A1 (en) * | 2006-04-26 | 2007-11-08 | Essilor International (Compagnie Générale d'Optique) | Driver for display comprising a pair of binocular-type spectacles |
FR2900475A1 (en) * | 2006-04-26 | 2007-11-02 | Essilor Int | DISPLAY COMPRISING A PAIR OF BINOCULAR GLASSES AND WITH A DEVICE FOR ADJUSTING THE IMAGE |
US20100289880A1 (en) * | 2006-04-26 | 2010-11-18 | Renaud Moliton | Driver for Display Comprising a Pair of Binocular-Type Spectacles |
US8170325B2 (en) * | 2006-06-29 | 2012-05-01 | Himax Display, Inc. | Image inspecting device and method for a head-mounted display |
US20080002859A1 (en) * | 2006-06-29 | 2008-01-03 | Himax Display, Inc. | Image inspecting device and method for a head-mounted display |
EP2389003A1 (en) * | 2009-01-19 | 2011-11-23 | Minoru Inaba | Three-dimensional video image pick-up and display system |
EP2389003A4 (en) * | 2009-01-19 | 2012-05-09 | Minoru Inaba | Three-dimensional video image pick-up and display system |
EP2608552A1 (en) * | 2009-01-19 | 2013-06-26 | Minoru Inaba | Stereoscopic video imaging display system |
TWI573435B (en) * | 2009-01-20 | 2017-03-01 | Inaba Minoru | Dimensional image camera display system |
US8791989B2 (en) | 2009-01-21 | 2014-07-29 | Nikon Corporation | Image processing apparatus, image processing method, recording method, and recording medium |
EP2252070A3 (en) * | 2009-05-13 | 2013-03-27 | Ltd. Nintendo Co. | Display control program and method for controlling display capable of providing three-dimensional display |
US20100289882A1 (en) * | 2009-05-13 | 2010-11-18 | Keizo Ohta | Storage medium storing display control program for controlling display capable of providing three-dimensional display and information processing device having display capable of providing three-dimensional display |
CN102484730A (en) * | 2009-06-04 | 2012-05-30 | 寇平公司 | 3d video processor integrated with head mounted display |
US20100309295A1 (en) * | 2009-06-04 | 2010-12-09 | Chow Kenny W Y | 3d video processor integrated with head mounted display |
US20110032252A1 (en) * | 2009-07-31 | 2011-02-10 | Nintendo Co., Ltd. | Storage medium storing display control program for controlling display capable of providing three-dimensional display and information processing system |
US9019261B2 (en) | 2009-10-20 | 2015-04-28 | Nintendo Co., Ltd. | Storage medium storing display control program, storage medium storing library program, information processing system, and display control method |
US11089290B2 (en) | 2009-11-04 | 2021-08-10 | Nintendo Co., Ltd. | Storage medium storing display control program, information processing system, and storage medium storing program utilized for controlling stereoscopic display |
US20110102425A1 (en) * | 2009-11-04 | 2011-05-05 | Nintendo Co., Ltd. | Storage medium storing display control program, information processing system, and storage medium storing program utilized for controlling stereoscopic display |
US9128293B2 (en) | 2010-01-14 | 2015-09-08 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
EP2536159A1 (en) * | 2010-02-12 | 2012-12-19 | Sony Corporation | Image processing device, image processing method, and program |
EP2536159A4 (en) * | 2010-02-12 | 2014-06-11 | Sony Corp | Image processing device, image processing method, and program |
US9088774B2 (en) | 2010-02-12 | 2015-07-21 | Sony Corporation | Image processing apparatus, image processing method and program |
EP3079360A1 (en) * | 2010-02-12 | 2016-10-12 | Sony Corporation | Image processing apparatus, image processing method, and program |
US20130010093A1 (en) * | 2010-04-01 | 2013-01-10 | Thomson Licensing Llc | Method and system of using floating window in three-dimensional (3d) presentation |
US9681113B2 (en) * | 2010-04-01 | 2017-06-13 | Thomson Licensing | Method and system of using floating window in three-dimensional (3D) presentation |
US9118895B2 (en) * | 2010-04-16 | 2015-08-25 | Sony Corporation | Data structure, image processing apparatus, image processing method, and program |
US20110254926A1 (en) * | 2010-04-16 | 2011-10-20 | Ushiki Suguru | Data Structure, Image Processing Apparatus, Image Processing Method, and Program |
US9693039B2 (en) | 2010-05-27 | 2017-06-27 | Nintendo Co., Ltd. | Hand-held electronic device |
CN103238341A (en) * | 2010-12-09 | 2013-08-07 | 索尼公司 | Image processing device, image processing method, and program |
US20130271569A1 (en) * | 2010-12-10 | 2013-10-17 | Fujitsu Limited | Stereoscopic moving picture generating apparatus and stereoscopic moving picture generating method |
US20130141425A1 (en) * | 2011-12-06 | 2013-06-06 | Novatek Microelectronics Corp. | Three-dimension image processing method |
US20150198808A1 (en) * | 2012-07-24 | 2015-07-16 | Sony Corporation | Image display apparatus and method for displaying image |
US9835864B2 (en) * | 2012-07-24 | 2017-12-05 | Sony Corporation | Image display apparatus and method for displaying image |
US9841598B2 (en) * | 2013-12-31 | 2017-12-12 | 3M Innovative Properties Company | Lens with embedded multilayer optical film for near-eye display systems |
US10578872B2 (en) | 2013-12-31 | 2020-03-03 | 3M Innovative Properties Company | Lens with embedded multilayer optical film for near-eye display systems |
US10642044B2 (en) | 2014-04-09 | 2020-05-05 | 3M Innovative Properties Company | Near-eye display system having a pellicle as a combiner |
US10223948B2 (en) * | 2014-08-11 | 2019-03-05 | Sung Jae Cho | Three-dimensional label having moving patterns using fine patterns and microlens |
US20160321967A1 (en) * | 2014-08-11 | 2016-11-03 | Sung Jae Cho | Three-Dimensional Label Having Moving Patterns Using Fine Patterns And Microlens |
CN107945649A (en) * | 2014-08-11 | 2018-04-20 | 赵成载 | The three-dimensional label for moving pattern using fine pattern and lenticule |
US20170212587A1 (en) * | 2014-09-29 | 2017-07-27 | Kyocera Corporation | Electronic device |
US10397560B2 (en) * | 2015-05-01 | 2019-08-27 | Seiko Epson Corporation | Transmission-type display |
US20160320623A1 (en) * | 2015-05-01 | 2016-11-03 | Seiko Epson Corporation | Transmission-type display |
GB2553032A (en) * | 2016-07-01 | 2018-02-21 | Google Llc | Head mounted display device having display panels with asymetric panel borders for improved nasal FOV |
DE102017114803B4 (en) | 2016-07-01 | 2023-01-26 | Google LLC (n.d.Ges.d. Staates Delaware) | Head mountable display device with display panels with asymmetrical panel borders for improved nasal field of view |
US20180284885A1 (en) * | 2017-03-31 | 2018-10-04 | Sony Interactive Entertainment LLC | Depth-Keying of Web Content |
US11086396B2 (en) * | 2017-03-31 | 2021-08-10 | Sony Interactive Entertainment LLC | Depth-keying of web content |
US20190265481A1 (en) * | 2018-02-26 | 2019-08-29 | Shanghai Xiaoyi Technology Co., Ltd. | Method, device, and storage medium for virtual reality display |
Also Published As
Publication number | Publication date |
---|---|
JPH09322199A (en) | 1997-12-12 |
US6324001B2 (en) | 2001-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6324001B2 (en) | Stereo image display apparatus | |
US11669160B2 (en) | Predictive eye tracking systems and methods for foveated rendering for electronic displays | |
US20240280827A1 (en) | Dynamic field of view variable focus display system | |
US11656468B2 (en) | Steerable high-resolution display having a foveal display and a field display with intermediate optics | |
US6111597A (en) | Stereo image forming apparatus | |
US10129520B2 (en) | Apparatus and method for a dynamic “region of interest” in a display system | |
US10871825B1 (en) | Predictive eye tracking systems and methods for variable focus electronic displays | |
US5742264A (en) | Head-mounted display | |
US6445365B1 (en) | Image display apparatus and image photographing apparatus therefor | |
US7562985B2 (en) | Mirror assembly with integrated display device | |
WO2018100239A1 (en) | Imaging system and method of producing images for display apparatus | |
JPH08292394A (en) | Head-mounted image display device | |
EP3929650A1 (en) | Gaze tracking apparatus and systems | |
CN110221439A (en) | Augmented reality equipment and augmented reality adjusting method | |
KR19980063710A (en) | Display device and display method | |
US20200218067A1 (en) | Mirror Device | |
JP4102410B2 (en) | 3D image display device | |
EP4261768A1 (en) | Image processing system and method | |
JPS61261719A (en) | Monaural picture appreciating method and instrument using change of optical angle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: OLYMPUS OPTICAL CO., LTD, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABATA, SEIICHIRO;REEL/FRAME:008588/0065 Effective date: 19970508 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 12 |