WO2014024649A1 - Image display device and image display method - Google Patents
Image display device and image display method
- Publication number
- WO2014024649A1 (PCT/JP2013/069387)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- observer
- displayed
- eye
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/22—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
- G02B30/24—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/144—Processing image signals for flicker reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/167—Synchronising or controlling image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/011—Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0143—Head-up displays characterised by optical features the two eyes not being equipped with identical nor symmetrical optical devices
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/337—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
Definitions
- the present technology relates to an image display device and an image display method, and in particular to an image display device such as an optically transmissive head-mounted display configured to superimpose an image displayed on a display element on an image of the outside world and guide it to an observer's eye.
- a head-mounted display to be mounted on a user's head
- this head-mounted display has a configuration in which an image displayed on a small display element is magnified by a magnifying optical system and guided to an observer's eye. That is, the head-mounted display is configured to optically enlarge an image displayed on the display element and allow the user to observe it as an enlarged virtual image.
- there is also known an optically transmissive head-mounted display with which an observer can observe not only the above-described virtual image but also an image of the outside world.
- This optically transmissive head-mounted display has a configuration in which an image displayed on a display element is superimposed on an image of the outside world by an optical system and guided to an observer's eye.
- the visibility of the virtual image reproduced by this optically transmissive head-mounted display depends on the environment in which the virtual image is displayed. For example, the observation state of the virtual image may contradict the situation in the real world, impairing comfortable observation, or visibility may deteriorate depending on the display position of the virtual image.
- Patent Document 1 describes that the entire depth is adjusted in consideration of the subject that the viewer pays attention to.
- Patent Document 2 describes that the parallax of a virtual image is adjusted according to eye convergence using eye gaze detection.
- the purpose of this technology is to satisfactorily superimpose a display image on an external image.
- An optical system that superimposes the display image displayed on the display element on the image of the outside world and guides it to the observer's eyes;
- An image display apparatus comprising: a display control unit that controls a display size and a display position of the display image on the display element so that the display image is displayed in an image superimposition region detected from the image of the outside world.
- an image displayed on the display element is superimposed on an image of the outside world by an optical system and guided to an observer's eye.
- the optical system is a magnifying optical system
- the image displayed on the display element is optically magnified and observed by the observer as a magnified virtual image.
- the display control unit controls the display size and the display position of the display image superimposed on the external image.
- the display size and display position of the display image on the display element are controlled so that the display image is displayed in the image superimposing region detected from the image of the outside world.
- the image overlapping area is detected based on captured image data obtained by capturing an image of the outside world.
- a flat area included in the image of the outside world is detected as the image overlapping area.
- the detection of the image superimposition region may be performed in the cloud, for example.
- the display control unit may control the display size and the display position of the display image by processing (geometrically transforming) the image data for displaying the display image on the display element, based on information on the image superimposition region. In this case, the display size and display position of the display image are controlled electronically, which makes the control easy.
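As a minimal sketch of such electronic control (the function name, the `(top, left, height, width)` region convention, and the nearest-neighbour resampling are illustrative assumptions, not the patent's implementation), the display image can be rescaled and repositioned on the display element purely by image processing:

```python
import numpy as np

def place_display_image(src, frame_shape, region):
    """Scale src to the detected image-superimposition region and
    position it inside an otherwise-black display frame.

    region: (top, left, height, width) of the superimposition area
    in display-element coordinates (a hypothetical convention)."""
    top, left, h, w = region
    # Nearest-neighbour resample of src to the region size
    # (a stand-in for the geometric transformation process).
    ys = np.arange(h) * src.shape[0] // h
    xs = np.arange(w) * src.shape[1] // w
    scaled = src[ys][:, xs]
    frame = np.zeros(frame_shape, dtype=src.dtype)
    frame[top:top + h, left:left + w] = scaled
    return frame
```

Because the transform is applied to the image data rather than to any optical component, the same display element can serve any region size and position.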
- display control is performed so that a display image is displayed in an image superposition region detected from an image of the outside world, for example, a flat region. Therefore, it becomes easy for the observer to visually recognize the display image superimposed and displayed on the outside image.
- the display control unit may change the display state of the display image in accordance with the state of the image superimposing region of the external image.
- the display control unit may correct the image data for displaying the display image so that the component of the image of the outside world is removed from the display image observed by the observer, according to the state of the image superimposition region. In this case, the visibility of the display image can be improved regardless of the state of the image of the outside world.
- the display control unit may change how the display image is displayed when no image superimposition region is detected in the image of the outside world. For example, the display may be stopped; the image may be superimposed at a position selected by the user; it may be superimposed at a preset superimposition position; it may be superimposed at the previously used superimposition position; or the display position or the on/off state of the display may be changed according to how long no region has been detected.
- the display control unit may obtain the display size and the display position used for control by smoothing, in the time direction, the display size and the display position determined from the periodically detected image superimposition region. In this case, the display image can be superimposed on the image of the outside world stably even when, for example, the position and size of the image superimposition region detected in each frame fluctuate greatly.
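One simple realization of such time-direction smoothing is an exponential moving average over the per-frame detections; the class below is an illustrative sketch (the smoothing constant `alpha` and the flat tuple layout for size/position are assumptions):

```python
class SmoothedPlacement:
    """Exponentially smooths per-frame display size/position values so
    the superimposed image does not jitter when the detected region
    fluctuates. alpha in (0, 1]: smaller = smoother but slower."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None  # last smoothed (w, h, x, y, ...) values

    def update(self, size_pos):
        if self.state is None:
            # First detection: adopt it directly.
            self.state = list(size_pos)
        else:
            # Blend the new detection into the running average.
            self.state = [
                (1 - self.alpha) * s + self.alpha * v
                for s, v in zip(self.state, size_pos)
            ]
        return tuple(self.state)
```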
- the display control unit may change how the display image is displayed when a change in the image of the outside world is detected. For example, the display may be stopped; the image may be superimposed at a position selected by the user; it may be superimposed at a preset superimposition position; it may be superimposed at the previously used superimposition position; or the display position or the on/off state of the display may be changed according to the duration of the detected change.
- the optical system may include a first optical system that superimposes the left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes the right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye. The display control unit may control the parallax between the left-eye image and the right-eye image so that the depth position of the stereoscopic image perceived by the observer is on the near side of the depth position of the region of the external image on which the stereoscopic image is superimposed.
- the display image (stereoscopic image) can be superimposed and displayed on the image of the outside world so as not to cause a contradiction in depth.
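Under the standard pinhole stereo model, the horizontal parallax needed to place the stereoscopic image at a given depth follows from the interocular baseline and the focal length in pixels; the sketch below illustrates this relation (the baseline, focal length, and safety margin are illustrative assumptions, not values from the patent):

```python
def parallax_for_depth(depth_m, baseline_m=0.064, focal_px=1000.0):
    """Horizontal parallax (pixels) that makes a stereoscopic image
    appear at depth_m metres: parallax = baseline * focal / depth."""
    return baseline_m * focal_px / depth_m

def parallax_in_front_of(region_depth_m, margin_m=0.1, **kw):
    """Parallax that places the display image slightly in front of the
    superimposition region, avoiding a depth contradiction. A floor of
    0.1 m keeps the target depth positive."""
    return parallax_for_depth(max(region_depth_m - margin_m, 0.1), **kw)
```

A larger parallax corresponds to a nearer perceived depth, so choosing a parallax slightly above that of the region guarantees the display image is never perceived behind the real surface it overlaps.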
- An optical system that superimposes a display image displayed on the display element on an image of the outside world and guides it to an observer's eyes;
- a display control unit having a first control mode for controlling the display image to be displayed in the region of the external image on which the observer's line of sight is concentrated, and a second control mode for controlling the display image to be displayed in a region other than the region on which the observer's line of sight is concentrated.
- an image displayed on the display element is superimposed on an image of the outside world by an optical system and guided to an observer's eye.
- the optical system is a magnifying optical system
- the image displayed on the display element is optically magnified and observed by the observer as a magnified virtual image.
- the display control unit controls display of the display image superimposed on the image of the outside world in the first control mode or the second control mode.
- in the first control mode, control is performed so that the display image is displayed in the region of the external image on which the observer's line of sight is concentrated.
- in the second control mode, the display image is controlled to be displayed in a region other than the region of the external image on which the observer's line of sight is concentrated.
- the display of the display image superimposed on the image of the outside world can be controlled in the first control mode or the second control mode. That is, the display image can be displayed in an area of the external image where the observer's line of sight is concentrated, and the display image can be displayed in an area other than the area of the external image where the observer's line of sight is concentrated. Thus, it is possible to display a display image so as not to disturb the work.
- the display control unit may operate in the first control mode when the observer is not moving, and in the second control mode when the observer is moving.
- in this case, when the observer stops moving, the mode is automatically switched to the first control mode and the display image is displayed in the region of the external image on which the observer's line of sight is concentrated. The observer therefore does not need to switch modes in order to concentrate on and observe the display image, which improves usability.
- the display control unit may change the display state of the image according to the state of the region of the external image on which the display image is superimposed. For example, the display control unit may correct the image data for displaying the display image so that the component of the image of the outside world is removed from the display image observed by the observer, according to the state of the region. In this case, the visibility of the display image can be improved regardless of the state of the image of the outside world.
- the optical system may include a first optical system that superimposes the left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes the right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye. The display control unit controls (adjusts) the parallax between the left-eye image and the right-eye image so that the depth position of the stereoscopic image perceived by the observer is on the near side of the depth position of the region of the external image on which the stereoscopic image is superimposed.
- the display image (stereoscopic image) can be superimposed and displayed on the image of the outside world so as not to cause a contradiction in depth.
- An optical system that superimposes the display image displayed on the display element on the image of the outside world and guides it to the observer's eyes;
- An image display apparatus comprising: a display control unit that changes a display state of the display image in accordance with a region state of the external image on which the display image is superimposed.
- an image displayed on the display element is superimposed on an image of the outside world by an optical system and guided to an observer's eye.
- the optical system is a magnifying optical system
- the image displayed on the display element is optically magnified and observed by the observer as a magnified virtual image.
- the display control unit changes the display state of the display image in accordance with the state of the external image region on which the display image is superimposed.
- the state of the region of the external image is acquired based on captured image data obtained by imaging the external image.
- the acquisition of the state of the region of the external image may be performed in the cloud, for example.
- the display control unit may correct the image data for displaying the display image so that the component of the image of the outside world is removed from the display image observed by the observer, according to the state of the region.
- the display state of the display image is changed according to the state of the area of the external image on which the display image is superimposed. Therefore, it is possible to remove an external image component from the display image observed by the observer, and it is possible to improve the visibility of the display image regardless of the state of the external image.
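Because an optical see-through display adds its light to the external light, the external-image component cannot be physically removed, only compensated for; a common approximation (a sketch under that assumption, not the patent's exact correction) subtracts the estimated background from the desired appearance and clips at zero:

```python
import numpy as np

def compensate_background(desired, background):
    """Correct display image data for an additive see-through display.

    desired:    uint8 image as it should appear to the observer.
    background: uint8 estimate of the external light in the same region
                (e.g. from the outward-facing cameras).
    Since observed = background + displayed, emitting
    displayed = desired - background (clipped to [0, 255])
    approximately removes the external-image component."""
    out = desired.astype(np.int16) - background.astype(np.int16)
    return np.clip(out, 0, 255).astype(np.uint8)
```

The clipping step is where the approximation breaks down: where the background is brighter than the desired pixel, no emitted value can darken it, which is one reason flat, dim regions make good superimposition targets.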
- a first optical system that superimposes the left-eye image displayed on the first display element on the image of the outside world and guides it to the left eye of the observer
- a second optical system that superimposes the right eye image displayed on the second display element on the image of the outside world and guides it to the right eye of the observer
- An image display device comprising: a display control unit that controls the parallax between the left-eye image and the right-eye image so that the depth position of the stereoscopic image perceived by the observer from the left-eye image and the right-eye image is on the near side of the depth position of the region of the external image on which the stereoscopic image is superimposed.
- the left eye image displayed on the first display element is superimposed on the image of the outside world by the first optical system and guided to the left eye of the observer.
- the right eye image displayed on the second display element is superimposed on the image of the outside world by the second optical system and guided to the observer's right eye.
- the optical system is a magnifying optical system
- the left eye image and the right eye image are optically enlarged and observed by the observer as an enlarged virtual image.
- the parallax between the left eye image and the right eye image is controlled by the display control unit.
- the parallax between the left-eye image and the right-eye image is controlled (adjusted) so that the depth position of the stereoscopic image perceived by the observer from the left-eye image and the right-eye image is on the near side of the depth position of the region of the external image on which the stereoscopic image is superimposed.
- for example, based on the depth position of the region of the external image on which the display image (stereoscopic image) is superimposed, the depth position of the display image is positioned in front of the depth position of that region.
- the parallax between the left eye image and the right eye image is controlled. Therefore, it is possible to superimpose and display the display image (stereoscopic image) on the image of the outside world so that no contradiction in depth occurs.
- FIG. 1 shows a schematic configuration example of an optically transmissive head-mounted display (binocular) as an embodiment. Further figures illustrate the parallax map showing the depth position of each pixel of the image of the outside world and the parallax map showing the depth position of each pixel of the display image (stereoscopic image); the time-direction smoothing of the display size and display position used in display control of the display image; and a case where a contradiction in the sense of depth arises at the depth position of the display image (stereoscopic image).
- Further figures show an example in which a display image (stereoscopic image) is superimposed and displayed on the blue-sky part of the image of the outside world, and illustrate the case where a display image is displayed in the image superimposition region.
- FIG. 1 shows a schematic configuration example of an optically transmissive head mounted display (HMD) 100 as an embodiment.
- This configuration example is a binocular HMD.
- the HMD 100 includes a left eyeglass lens portion 101L and a right eyeglass lens portion 101R.
- the eyeglass lens portion 101L and the eyeglass lens portion 101R are integrally connected by a connecting member 102.
- the eyeglass lens portions 101L and 101R are each formed by integrating an eyeglass lens and a HOE (Holographic Optical Element) sheet.
- This HOE sheet has both the function of a half mirror that combines external light and display light, and the function of a concave or free-form mirror surface that enlarges the display image.
- Infrared sensors 103L and 103R are attached to the eyeglass lens portions 101L and 101R, respectively.
- the infrared sensor 103L is attached to the horizontal center position of the eyeglass lens unit 101L (the horizontal center position of the left-eye optical system).
- the infrared sensor 103R is attached to, for example, the horizontal center position of the eyeglass lens unit 101R (the horizontal center position of the right-eye optical system).
- the sensor outputs of the infrared sensors 103L and 103R are used for eye position estimation by the scleral reflection method.
- the scleral reflection method is a technique that exploits the difference in reflectance between the cornea (the dark "black" region of the eye) and the sclera (the white of the eye).
- each infrared sensor horizontally scans a weak infrared beam directed at the observer's eye and detects the reflected light. Since the intensity of the light reflected from the cornea (black of the eye) and from the sclera (white of the eye) differs greatly, the observer's eyeball position can be estimated from the sensor output.
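An illustrative way to turn such a horizontal reflectance scan into an eyeball position (the thresholding convention is an assumption, not the patent's method) is to take the centroid of the low-reflectance samples, since the cornea region reflects less infrared than the sclera:

```python
def estimate_pupil_center(scan, threshold=None):
    """Estimate the horizontal eyeball position from a horizontal IR
    reflectance scan (one intensity value per scan position).

    The dark cornea region reflects less infrared than the white
    sclera, so the centroid of the below-threshold samples
    approximates the eye position. The default threshold, the
    midpoint of the scan's range, is a hypothetical choice."""
    if threshold is None:
        threshold = (min(scan) + max(scan)) / 2
    low = [i for i, v in enumerate(scan) if v < threshold]
    if not low:
        return None  # no dark region found in this scan
    return sum(low) / len(low)
```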
- a gyro sensor 104 is attached to the eyeglass lens portion 101L.
- the sensor output of the gyro sensor 104 is used to determine whether the image of the outside world is changing and whether the observer (user) is moving.
- the sensor output of the gyro sensor 104 is used to detect whether or not there is a change in the image of the outside world that the observer observes through the spectacle lens units 101L and 101R.
- a camera 105L is attached to the eyeglass lens unit 101L at a horizontal center position (a horizontal center position of the left eye optical system).
- the camera 105L captures an image of the outside world (left eye image) observed by the left eye of the observer through the eyeglass lens unit 101L, and outputs captured image data.
- a camera 105R is attached to the spectacle lens unit 101R at the horizontal center position (the horizontal center position of the right-eye optical system).
- the camera 105R captures an image of the outside world (right eye image) that the observer's right eye observes through the eyeglass lens unit 101R, and outputs captured image data.
- the outputs of the cameras 105L and 105R are used to obtain information on the depth position of an external image area on which a stereoscopic image is superimposed and displayed.
- the outputs of the cameras 105L and 105R can also be used to detect whether or not there is a change in the external image observed by the observer through the eyeglass lens portions 101L and 101R.
- the outputs of the cameras 105L and 105R are used to obtain information (luminance information, color information, etc.) indicating the state of the external image area on which the stereoscopic image is superimposed and displayed. Further, the outputs of the cameras 105L and 105R are used for detecting an image superimposition area from the image of the outside world, in this embodiment, a flat area.
- the HMD 100 includes display drivers 111L and 111R and displays 112L and 112R.
- Each of the displays 112L and 112R is configured by, for example, an LCD (Liquid Crystal Display).
- the display 112L is driven by the display driver 111L based on the left-eye image data, and displays a left-eye image for allowing the observer to perceive a stereoscopic image.
- the display 112R is driven by the display driver 111R based on the right-eye image data, and displays a right-eye image for allowing the observer to perceive a stereoscopic image.
- the HMD 100 also includes an eyeball position estimation unit 121, a line-of-sight estimation unit 122, a depth / structure estimation unit 123, a display control unit 124, and a display image generation unit 125.
- the eyeball position estimation unit 121 estimates the eyeball positions of the left eye and right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R.
- the line-of-sight estimation unit 122 estimates the observer's line of sight based on the estimation results of the left and right eyeball positions in the eyeball position estimation unit 121.
- the depth / structure estimation unit 123 calculates a parallax map indicating the depth position of each pixel of the image of the outside world based on the captured image data from the cameras 105L and 105R.
- FIG. 2A shows an example of a left eye image and a right eye image of an external image
- FIG. 2B shows an example of a parallax map corresponding to the left eye image and the right eye image.
- the parallax of each pixel is displayed as image brightness, with brighter portions indicating depth positions nearer to the viewer.
- FIG. 2C shows an example of a parallax histogram of the entire screen.
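- a parallax map like the one in FIG. 2B can be sketched, for illustration only, with simple SAD block matching between the left-eye and right-eye images. The patent does not specify the matching method, and the function and parameter names below are hypothetical:

```python
import numpy as np

def parallax_map(left, right, block=8, max_disp=16):
    """Estimate per-block horizontal disparity between left/right images
    by sum-of-absolute-differences (SAD) block matching. Larger values
    correspond to nearer depth positions (brighter in FIG. 2B)."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(float)
            best, best_d = np.inf, 0
            # search candidate shifts d of the right image to the left
            for d in range(0, min(max_disp, x) + 1):
                cand = right[y:y + block, x - d:x - d + block].astype(float)
                sad = np.abs(ref - cand).sum()
                if sad < best:
                    best, best_d = sad, d
            disp[by, bx] = best_d
    return disp
```

The parallax histogram of FIG. 2C would then simply be a histogram over the entries of `disp`.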
- the depth / structure estimation unit 123 calculates a parallax map indicating the depth position of each pixel of the display image (stereoscopic image) based on the left-eye and right-eye image data that is display image data.
- FIG. 2D shows an example of a left eye image and a right eye image
- FIG. 2E shows an example of a parallax map corresponding to the left eye image and the right eye image.
- the parallax of each pixel is displayed as image brightness, with brighter portions indicating depth positions nearer to the viewer.
- FIG. 2F shows an example of a parallax histogram of the entire screen.
- the depth / structure estimation unit 123 detects an image superposition region from an image of the outside world based on the captured image data from the cameras 105L and 105R. For example, the depth / structure estimation unit 123 detects a region (flat region) having only a low frequency component in the horizontal direction and the vertical direction as an image superimposition region.
- this image superimposition area is an area on which the display image is superimposed and displayed, and has a sufficient size in the horizontal and vertical directions. In this case, a plurality of image superimposition areas, not just one, may be detected from the image of the outside world.
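- the flatness test can be illustrated as follows. The description only requires that the region contain just low-frequency components in the horizontal and vertical directions, so approximating this with the mean absolute gradient (and a hypothetical threshold) is an assumption, not the patent's exact method:

```python
import numpy as np

def is_flat(block, thresh=4.0):
    """Approximate 'only low frequency components' by requiring small
    mean absolute gradients both horizontally and vertically.
    `thresh` is an illustrative value, not taken from the patent."""
    b = block.astype(float)
    gx = np.abs(np.diff(b, axis=1)).mean()  # horizontal gradient energy
    gy = np.abs(np.diff(b, axis=0)).mean()  # vertical gradient energy
    return bool(gx < thresh and gy < thresh)
```

A flat wall region passes this test while a textured region fails it; a full detector would scan such blocks and merge adjacent flat blocks into candidate superimposition areas.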
- the depth / structure estimation unit 123 determines the display size and display position of the display image based on the detected image superposition region.
- since the depth / structure estimation unit 123 periodically detects the above-described image superimposition region, for example, for each frame, the display size and display position of the display image are also determined for each frame.
- the depth / structure estimation unit 123 then smoothes the display size and display position determined for each frame in the time direction, and determines the display size and display position to be used for display control.
- FIG. 3 shows an example of the smoothing process in the time direction for the display position. The coordinates are filtered (average, IIR, majority vote, etc.) to stabilize them temporally. For example, in the case of average filtering, the coordinates (x, y) of the display area actually used are obtained from the following formula (1).
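- as one option for the average filtering of formula (1), the per-frame coordinates can be smoothed with a moving average over the last K frames. This is a minimal sketch; the window size and class name are illustrative:

```python
from collections import deque

class PositionSmoother:
    """Moving-average filtering of the per-frame display position
    (one of the options: average / IIR / majority vote)."""
    def __init__(self, k=5):
        self.xs = deque(maxlen=k)
        self.ys = deque(maxlen=k)

    def update(self, x, y):
        # Push the latest per-frame coordinates and return the average
        # over the retained window -- the (x, y) actually used for display.
        self.xs.append(x)
        self.ys.append(y)
        return sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys)
```

An IIR variant would instead keep a single state `p = a * p + (1 - a) * new`, trading memory for a longer effective window.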
- the depth / structure estimation unit 123 determines the depth position of the display image (stereoscopic image) based on the display size and display position related to the display control determined as described above. In this case, the depth / structure estimation unit 123 determines the depth position of the display image so that it is in front of the depth position of the area of the external image on which the display image is superimposed.
- FIG. 4 schematically shows a case where a mismatch occurs.
- when the depth position of the object A in the external image is on the near side and the depth position of the object B in the display image is on the far side, superimposing the images as they are makes the object A appear cut off by the object B, and the sense of depth becomes unnatural.
- when such an inconsistency in the sense of depth occurs, problems arise, such as making the observer feel tired.
- FIG. 5 schematically shows that positioning the depth position of the display image (stereoscopic image) in front of the depth position of the area of the external image on which it is superimposed avoids inconsistency in the sense of depth.
- even when the depth position of the object A in the external image is near and the depth position of the object B in the display image is originally far, setting the depth position of the object B in front of that of the object A yields a natural sense of depth even if the object A is cut off by the object B. By matching the sense of depth in this way, the observer can observe naturally without feeling tired.
- the depth / structure estimation unit 123 determines the parallax to be given to the left eye image and the right eye image.
- the depth / structure estimation unit 123 sets Ha as the average parallax of the area of the external image on which the display image, with the display size and display position determined as described above, is superimposed. Further, the depth / structure estimation unit 123 sets Hb as the average parallax of the entire display image with the determined display size. This Hb is obtained by multiplying the above-mentioned average parallax of the entire display image before size adjustment (see FIG. 2F) by n (n being the magnification of the display size).
- as shown in FIG. 6A, consider a case where a display image is superimposed on an external image.
- FIG. 6B shows, in the parallax map of the left-eye and right-eye images of the external image, the region (inside the rectangular frame) on which the left-eye and right-eye display images are superimposed.
- the average of the parallax in this region is Ha, as shown in FIG. 6C.
- the rectangular frame is drawn twice: the inner frame matches the display size and display position determined as described above, and the outer frame adds a margin around it.
- the outer rectangular frame is used as the region for obtaining Ha. This is because, as described later, the relative position of the left-eye and right-eye images may be changed to adjust the parallax so that the depth position of the display image (stereoscopic image) comes in front of the depth position of the external image.
- FIG. 6D shows a histogram of the parallax of the entire display image before size adjustment (same as FIG. 2F).
- FIG. 6E shows the parallax histogram of the entire display image when the display size is reduced by a factor of n; the parallax average Hb is n times the parallax average of the entire display image before size adjustment.
- the depth / structure estimation unit 123 compares the parallax average Ha of the external image obtained as described above with the parallax average Hb of the display image. The depth / structure estimation unit 123 then determines whether the depth position of the display image (stereoscopic image) is at least a certain distance in front of the depth position of the corresponding region of the external image, that is, whether the difference of the parallax average Hb from the parallax average Ha is equal to or greater than the value H0 that satisfies this condition.
- when the condition is not satisfied, the depth / structure estimation unit 123 adjusts the horizontal display position of the left-eye image, the right-eye image, or both, so that a parallax average Hb whose difference from Ha is equal to or greater than H0 is obtained.
- FIG. 7A shows an example of the parallax averages Ha and Hb before the display position adjustment, in which the parallax average difference is less than H0. In this case, adjusting the horizontal display position of either or both of the left-eye and right-eye images makes the parallax average difference equal to or greater than H0, for example, as shown in FIG. 7B.
- in the example described above, the horizontal display position of either or both of the left-eye and right-eye images is adjusted so that the difference between the parallax averages Hb and Ha is equal to or greater than H0.
- alternatively, it may be determined whether the difference between the parallax at the 90% point of the parallax histogram of the external image and the parallax at the 10% point of the parallax histogram of the display image is equal to or greater than a predetermined threshold value.
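- the adjustment above can be sketched as a small function. The sign convention (larger parallax = nearer to the observer) and the return value being a shift in disparity units are assumptions; the patent states only the condition that Hb − Ha be at least H0:

```python
def adjust_parallax(ha, hb, h0):
    """Return the extra horizontal shift (in disparity units) to apply
    between the left- and right-eye display images so that the average
    parallax difference (hb - ha) becomes at least h0.
    ha: parallax average of the external-image region (Ha)
    hb: parallax average of the display image (Hb)
    h0: minimum difference for 'a certain distance in front' (H0)"""
    diff = hb - ha
    if diff >= h0:
        return 0.0  # already far enough in front; no adjustment needed
    return float(h0 - diff)
```

Applying the returned shift to the relative horizontal positions of the two display images brings the perceived depth of the display image the required distance in front of the external image.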
- the display control unit 124 controls the display of the display image based on the line-of-sight estimation result of the line-of-sight estimation unit 122, the sensor output of the gyro sensor 104, and the display size and display position information determined by the depth / structure estimation unit 123. Although not shown, a user operation signal is also supplied to the display control unit 124.
- when display of the display image is instructed by a user operation, the display control unit 124 basically controls the display so that the display image is displayed with the display size and display position determined by the depth / structure estimation unit 123.
- the display control unit 124 changes the display condition when the depth / structure estimation unit 123 does not detect a flat region serving as an image superimposition region, that is, when the display size and display position information is not supplied from the depth / structure estimation unit 123.
- for example, the display control unit 124 performs control so that display of the display image is stopped. Alternatively, the display control unit 124 lets the user select a superposition position and displays the display image at that position, displays the display image at a preset superposition position, or displays the display image at the previously used superposition position.
- alternatively, the display control unit 124 controls the display position of the display image, or turns the display on or off, according to the duration of the non-detection. In this case, for example, the display image is kept at the previous display position until the duration reaches a first time, displayed at the preset position until the duration reaches a second time after the first time, and the display is stopped when the duration exceeds the second time.
- the display control unit 124 also detects whether there is a change in the external image based on the gyro sensor 104, and changes the display condition when a change is detected. For example, the display control unit 124 performs control so that display of the display image is stopped. Alternatively, the display control unit 124 lets the user select a superposition position and displays the display image at that position, displays the display image at a preset superposition position, or displays the display image at the previously used superposition position.
- alternatively, the display control unit 124 controls the display position of the display image, or turns the display on or off, according to the duration of the change. In this case, for example, the display image is kept at the previous display position until the duration reaches a first time, displayed at the preset position until the duration reaches a second time after the first time, and the display is stopped when the duration exceeds the second time.
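- the duration-based fallback described for both cases (non-detection of a flat region and a detected change in the external image) can be sketched as follows; the threshold names th1 and th2 and the position labels are illustrative:

```python
def display_decision(duration, th1, th2, previous_pos, preset_pos):
    """Decide display on/off and position from the duration of the
    abnormal condition: keep the previous position until th1, fall back
    to the preset position until th2, then stop the display."""
    if duration < th1:
        return ("on", previous_pos)   # short glitch: do not move the image
    if duration < th2:
        return ("on", preset_pos)     # longer: fall back to preset position
    return ("off", None)              # persistent: stop displaying
```

The same function can serve both control paths by feeding it either the non-detection duration or the change duration.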
- the display control unit 124 changes the display condition based on the line-of-sight estimation result from the line-of-sight estimation unit 122.
- the display control unit 124 performs control as follows according to the mode setting of the user (observer). The user can set the “automatic control mode”, the “first control mode”, or the “second control mode”.
- when the “first control mode” is set, the display control unit 124 controls the display image to be displayed in the area where the line of sight is concentrated, that is, an area matching the line of sight. When the “second control mode” is set, the display control unit 124 controls the display image to be displayed in an area other than the area where the line of sight is concentrated, that is, an area outside the line of sight.
- when the “automatic control mode” is set, the display control unit 124 performs the following control depending on whether the user (observer) is moving. When the user is not moving, the display control unit 124 controls the display image to be displayed in the area where the line of sight is concentrated, as in the “first control mode”. When the user is moving, the display control unit 124 controls the display image to be displayed in an area other than the area where the line of sight is concentrated, as in the “second control mode”. Note that the display control unit 124 determines whether the observer is moving based on the sensor output of the gyro sensor 104.
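- the three modes can be summarized in a small dispatch function; the mode strings and region labels are illustrative, and the `moving` flag stands in for the judgment made from the gyro sensor 104 output:

```python
def choose_target(mode, moving):
    """Select the display-target region from the mode setting:
    'first'     -> the region matching the line of sight,
    'second'    -> a region outside the line of sight,
    'automatic' -> behaves as 'second' while the observer is moving
                   and as 'first' otherwise."""
    if mode == "automatic":
        mode = "second" if moving else "first"
    return "gaze_region" if mode == "first" else "outside_gaze"
```

In automatic mode the observer thus never has to switch modes by hand: stopping to look at the image is enough to bring it into the gaze region.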
- under the control of the display control unit 124, the display image generation unit 125 generates left-eye and right-eye image data for display so that the display image is displayed with the display size and display position determined by the depth / structure estimation unit 123.
- the left eye and right eye image data supplied from outside is subjected to reduction processing and movement processing (geometric transformation processing) to obtain display left eye and right eye image data.
- further, under the control of the display control unit 124, the display image generation unit 125 corrects the left-eye and right-eye image data so that the display state of the display image changes according to the state of the area of the external image on which the display image is superimposed. In this case, the correction removes components of the external image from the display image observed by the observer.
- Ireal and Idisp are each divided into blocks of N ⁇ N pixels.
- the pixel at the coordinates (i, j) of the image Ireal of the outside world is set as Ireal (i, j), and the pixel at the coordinates (i, j) of the display image Idisp is set as Idisp (i, j).
- the display image generation unit 125 corrects the data of each pixel of the block of coordinates (s, t) as shown in the following formula (2).
- here, α is a correction coefficient, and clip(x) is a function that saturates x to a certain range (for example, 0 to 255). Although a detailed description is omitted, this pixel data correction is performed separately for the red, green, and blue color data.
- for the second term in the parentheses, a value obtained by smoothing the correction values of the surrounding blocks may be used, as shown in the following formula (3).
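- formula (2) appears only as an image in the source, so the sketch below is one plausible reading, not the patent's exact formula: per block and per color channel, subtract α times the co-located external-image pixels so that their contribution is cancelled in the light reaching the observer, then saturate with clip:

```python
import numpy as np

def clip(x, lo=0, hi=255):
    """Saturation operation clip(x) over a certain range (here 0..255)."""
    return np.minimum(np.maximum(x, lo), hi)

def correct_block(idisp_block, ireal_block, alpha=1.0):
    """Correct one N x N block of the display image Idisp using the
    co-located block of the external image Ireal, for a single color
    channel. The subtraction form is an assumption based on the goal of
    removing the external-image component from the observed image."""
    return clip(idisp_block.astype(float) - alpha * ireal_block.astype(float))
```

The smoothing of formula (3) would replace `ireal_block` here by an average of the correction values over the neighboring blocks, suppressing block-boundary artifacts.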
- the operation of the HMD 100 shown in FIG. 1 will be described.
- the left eye image data generated by the display image generation unit 125 is supplied to the display driver 111L.
- the display 112L is driven by the display driver 111L, and a left eye image is displayed on the display 112L.
- the right eye image data generated by the display image generation unit 125 is supplied to the display driver 111R.
- the display 112R is driven by the display driver 111R, and a right eye image is displayed on the display 112R.
- the light from the left eye image displayed on the display 112L is superimposed on the image of the outside world by the spectacle lens unit 101L and reaches the left eye of the observer. Thereby, the left eye image superimposed on the external image (left eye image) is observed with the left eye of the observer.
- the light from the right eye image displayed on the display 112R reaches the observer's right eye by being superimposed on the image of the outside world by the eyeglass lens unit 101R. Thereby, the right eye image superimposed and displayed on the external image (right eye image) is observed by the right eye of the observer.
- since the left-eye image and the right-eye image superimposed on the external image are observed by the observer's left eye and right eye, respectively, the observer perceives the display image superimposed on the external image as a 3D stereoscopic image.
- the sensor output of the infrared sensor 103L attached to the center position in the horizontal direction of the eyeglass lens section 101L (the center position in the horizontal direction of the left eye optical system) is supplied to the eyeball position estimation section 121. Further, the sensor output of the infrared sensor 103R attached to the center position in the horizontal direction of the eyeglass lens 101R (the center position in the horizontal direction of the right eye optical system) is supplied to the eyeball position estimation unit 121.
- the eyeball position estimation unit 121 estimates the eyeball positions of the left and right eyes of the observer based on the sensor outputs from the infrared sensors 103L and 103R. Then, the gaze estimation unit 122 estimates the gaze of the observer based on the estimation results of the eyeball positions of the left eye and the right eye in the eyeball position estimation unit 121. This line-of-sight estimation result is supplied to the display control unit 124.
- the sensor output of the gyro sensor 104 attached to the eyeglass lens unit 101L is supplied to the display control unit 124.
- the output (left-eye captured image data) of the camera 105L attached to the center position in the horizontal direction of the eyeglass lens unit 101L (the center position in the horizontal direction of the left-eye optical system) is supplied to the depth / structure estimation unit 123.
- similarly, the output (right-eye captured image data) of the camera 105R attached to the center position in the horizontal direction of the eyeglass lens unit 101R (the center position in the horizontal direction of the right-eye optical system) is supplied to the depth / structure estimation unit 123.
- the depth / structure estimation unit 123 is further supplied with left-eye and right-eye image data as display image data.
- the depth / structure estimation unit 123 detects a flat region as an image superimposition region from an image of the outside world based on the captured image data from the cameras 105L and 105R.
- the depth / structure estimation unit 123 determines the display size and display position of the display image based on the detected flat area. In this case, for example, the display size and display position determined for each frame are smoothed and stabilized in the time direction.
- the depth / structure estimation unit 123 calculates a parallax map indicating the depth position of each pixel of the external image based on the captured image data from the cameras 105L and 105R, and also calculates a parallax map indicating the depth position of each pixel of the display image (stereoscopic image) based on the left-eye and right-eye image data serving as the display image data.
- the depth / structure estimation unit 123 then determines the depth position of the display image (stereoscopic image) based on the display size and display position related to the display control determined as described above and the parallax maps calculated as described above.
- the depth position of the display image is determined so as to be in front of the depth position of the area in which the display image of the external image is superimposed and displayed.
- the depth / structure estimation unit 123 determines whether the condition that the depth position of the display image (stereoscopic image) is at least a certain distance in front of the depth position of the corresponding area of the external image is satisfied. If it is not satisfied, the depth / structure estimation unit 123 adjusts the horizontal display position of one or both of the left-eye and right-eye images to adjust the parallax so that the condition is satisfied. Information on the display size and display position relating to the display control determined by the depth / structure estimation unit 123 is supplied to the display control unit 124.
- the display control unit 124 controls display of the display image based on the line-of-sight estimation result of the line-of-sight estimation unit 122, the sensor output of the gyro sensor 104, and the display size and display position information determined by the depth / structure estimation unit 123.
- in this case, the display control unit 124 basically controls the display image to be displayed with the display size and display position determined by the depth / structure estimation unit 123.
- the display control unit 124 changes the display method when the depth / structure estimation unit 123 does not detect a flat region that is an image superimposition region.
- the display control unit 124 changes the display method when it is detected based on the gyro sensor 104 that there is a change in the image of the outside world.
- the display control unit 124 changes the display method based on the line-of-sight estimation result from the line-of-sight estimation unit 122. This control is performed according to the mode setting of the user (observer). For example, the user can set “automatic control mode”, “first control mode”, or “second control mode”.
- when the “first control mode” is set, the display image is controlled to be displayed in the area where the line of sight is concentrated. When the “second control mode” is set, the display image is controlled to be displayed in an area other than the area where the line of sight is concentrated.
- when the “automatic control mode” is set, the display image is controlled to be displayed outside the area where the line of sight is concentrated while the observer is moving, and in the area where the line of sight is concentrated while the observer is not moving.
- FIG. 8 shows a display example in this case.
- the display image is displayed on the wall part away from the stove part.
- the display image generation unit 125 is supplied with display image data of the left eye and right eye.
- the display image generation unit 125 is supplied with captured image data from the cameras 105L and 105R.
- under the control of the display control unit 124, left-eye and right-eye image data for display are generated so that the display image is displayed with the determined display size and display position.
- the left eye and right eye image data supplied from outside is subjected to reduction processing and movement processing to obtain display left eye and right eye image data.
- the display size and the display position are changed electronically. Note that when the display of the display image is stopped, the generation of the left-eye and right-eye image data is stopped.
- the display image generation unit 125 corrects the image data of the left eye and the right eye so that the display state of the display image is changed according to the state of the area of the external image on which the display image is displayed ( Formula (2)). By correcting the image data in this way, it becomes possible to improve the visibility of the display image regardless of the state of the image of the outside world.
- FIG. 9 shows an example in which a display image (stereoscopic image) is superimposed and displayed on a blue sky portion of an image of the outside world.
- if the above-described image data correction is not performed, the display image (stereoscopic image) is observed in a bluish state by the observer due to the influence of the blue sky.
- when the correction is performed, the influence of the blue sky is reduced, and the observer can view the display image in a good state.
- the left eye image data for display generated by the display image generation unit 125 is supplied to the display driver 111L, and the left eye image based on the left eye image data is displayed on the display 112L.
- the right eye image data for display generated by the display image generating unit 125 is supplied to the display driver 111R, and the right eye image based on the right eye image data is displayed on the display 112R.
- the left-eye image and the right-eye image superimposed on the external image are observed by the observer's left eye and right eye, respectively, and the observer perceives the display image (stereoscopic image), superimposed at an appropriate position and with an appropriate size on the external image, at a depth position in front of the external image.
- since the display image is displayed in the image superimposition area detected from the external image, for example, a flat area, it becomes easy for the observer to visually recognize the display image superimposed on the external image. For example, consider the case where the external image is as shown in FIG. 10A and the display image is as shown in FIG. 10B.
- the flat area of the wall portion on the upper center side is detected as the image superimposition area.
- the display size and display position are adjusted so that the display image shown in FIG. 10B is displayed in this flat region.
- the display image is superimposed on the image of the outside world.
- in step ST1, the display control unit 124 starts operation, for example, in response to a user power-on operation.
- in step ST2, the display control unit 124 initializes the display position of the display image, for example, to a preset value. Note that the display size is uniquely determined by the display position initialized in this way.
- in step ST3, the display control unit 124 determines whether an image is to be displayed. For example, when the user performs an image display setting operation, the display control unit 124 determines that image display is in progress. In the case of image display, the display control unit 124 checks the user's mode setting value in step ST4.
- when the “second control mode” is set, the display control unit 124 sets a region outside the line of sight in the external image as the display target in step ST5.
- when the “first control mode” is set, the display control unit 124 sets an area matching the line of sight in the external image as the display target in step ST6.
- when the “automatic control mode” is set, the display control unit 124 determines in step ST7 whether the user is moving. When the user is moving, the display control unit 124 sets a region outside the line of sight in the external image as the display target in step ST5. When the user is not moving, the display control unit 124 sets an area matching the line of sight in the external image as the display target in step ST6.
- the display control unit 124 proceeds to the process of step ST8 after the operations of step ST5 and step ST6.
- in step ST8, the display control unit 124 sets a flat region within the display target as the display position. When there are a plurality of flat regions in the display target, for example, the region with the largest area is set as the display position. The display size and display position of the display image are thereby determined.
- in step ST9, the display control unit 124 determines whether the external image has changed. If it has changed, the duration of the change is checked in step ST10. When the duration is less than “th1”, the display control unit 124 does not change the display position (step ST11). When the duration is “th1” or more and less than “th2”, the display control unit 124 sets the display position to the preset position (step ST12). When the duration is “th2” or more, the display control unit 124 stops displaying the display image (step ST13).
- the display control unit 124 returns to the process of step ST2 after the process of step ST13, and performs the same process as described above.
- the display control unit 124 proceeds to the process of step ST14 after the processes of step ST11 and step ST12.
- in step ST14, the display control unit 124 smoothes the display position (display size) in the time direction. Thereby, even if the display position (display size) changes rapidly, it is changed smoothly.
- in step ST15, the display control unit 124 adjusts the depth position of the display image (stereoscopic image) according to the depth position of the external image.
- that is, the display left-eye and right-eye image data are generated according to the display size and display position from the depth / structure estimation unit 123, and the display positions of the left-eye and right-eye display images are moved in the horizontal direction according to the depth position of the external image at the display position, whereby the parallax is adjusted.
- in step ST16, the display control unit 124 corrects the left-eye and right-eye image data for display according to the state of the display area of the external image. Then, in step ST17, the display control unit 124 performs image display. That is, the display image generation unit 125 supplies the display left-eye and right-eye image data to the display drivers 111L and 111R, respectively, and the left-eye and right-eye images are displayed on the displays 112L and 112R.
- in step ST18, the display control unit 124 determines whether image display is to be ended. For example, when the user performs an image display cancel operation, the display control unit 124 determines that image display is to be ended. In that case, the display control unit 124 ends the image display in step ST19 and ends the display control process. Otherwise, the display control unit 124 returns to step ST3 and repeats the same processing as described above.
- the display control unit 124 controls the display position of the display image or the on / off of the display in accordance with the non-detection duration of the flat area, for example.
- in addition to a configuration in which the HMD main body includes all the components of the HMD 100 illustrated in FIG. 1, a configuration in which some of the components are arranged in a control box connected by wire or wirelessly, and further a configuration in which some of the components are arranged on a cloud connected via a network, are also conceivable.
- FIG. 12 schematically shows an arrangement example of each component of the HMD 100.
- FIG. 12A shows that all the components of the HMD 100 are arranged in the HMD main body.
- FIG. 12B shows that some of the components of the HMD 100 are arranged in the control box.
- the eyeball position estimation unit 121, the gaze estimation unit 122, the depth / structure estimation unit 123, the display control unit 124, and the display image generation unit 125 are arranged in the control box, and the rest are arranged in the HMD main body.
- FIG. 12C shows that some of the components of the HMD 100 are arranged on the cloud.
- the eyeball position estimation unit 121, the line-of-sight estimation unit 122, the depth / structure estimation unit 123, the display control unit 124, and the display image generation unit 125 are arranged on the cloud, and the rest are arranged on the HMD main body.
- FIG. 12D shows that some of the components of the HMD 100 are arranged in the control box and on the cloud.
- In this example, the eyeball position estimation unit 121, the line-of-sight estimation unit 122, and the display image generation unit 125 are arranged in the control box; the depth/structure estimation unit 123 and the display control unit 124 are arranged on the cloud; and the rest are arranged in the HMD main body.
- As described above, display control is performed so that the display image is displayed in an image superimposition area, for example a flat area, detected from the image of the outside world. This makes it easy to visually recognize the display image superimposed on the image of the outside world.
- Also, the display of the display image superimposed on the image of the outside world can be controlled in the “first control mode” or the “second control mode”. That is, the display image can be displayed in the area of the outside-world image on which the observer's line of sight is concentrated so that the observer can view it attentively, or in an area other than that area so that the display image does not disturb the observer's work.
- Further, control is performed in the “first control mode” when the observer (user) is not moving, and in the “second control mode” when the observer (user) is moving. Therefore, in a situation where the observer is not moving, the mode is automatically switched to the “first control mode”, and the display image is displayed in the area of the outside-world image on which the observer's line of sight is concentrated. In this case, the observer does not need to perform mode switching in order to observe the display image attentively, which improves usability.
- the display state of the display image is changed according to the state of the area of the external image on which the display image is superimposed. Therefore, it is possible to remove an external image component from a display image observed by an observer (user), and it is possible to improve the visibility of the display image regardless of the state of the external image.
- Also, the parallax between the left-eye image and the right-eye image is controlled so that the depth position of the display image (stereoscopic image) is in front of the depth position of the area of the outside-world image on which it is superimposed. Therefore, the display image (stereoscopic image) can be superimposed and displayed on the image of the outside world without causing a depth contradiction.
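As a rough illustration of this depth/parallax relationship, one might write the following (the pinhole-style disparity model, the 0.1 m margin, and all parameter values are assumptions for illustration, not taken from the patent):

```python
def parallax_for_depth(depth_m, screen_m=2.0, iod_m=0.065):
    """Screen disparity (m) that makes a point appear at depth_m.
    Simple pinhole model: crossed disparity is positive when the
    point lies in front of the virtual screen plane at screen_m."""
    return iod_m * (screen_m - depth_m) / depth_m

def safe_display_depth(region_depth_m, margin_m=0.1):
    """Place the stereoscopic image just in front of the region it
    overlays, so that no depth contradiction occurs."""
    return max(region_depth_m - margin_m, 0.3)  # keep a sane minimum

d = safe_display_depth(1.5)   # region at 1.5 m -> image slightly closer
p = parallax_for_depth(d)     # disparity that renders that depth
```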
- the depth / structure estimation unit 123 calculates a parallax map indicating the depth position of each pixel of the image of the outside world based on the captured image data from the cameras 105L and 105R.
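The parallax-map computation from the two camera images can be sketched, for instance, with naive SAD block matching (the patent does not specify the matching algorithm, so this is only an illustrative stand-in):

```python
import numpy as np

def disparity_map(left, right, max_disp=8, block=5):
    """Tiny sum-of-absolute-differences block matcher.
    left/right: 2-D grayscale arrays; returns per-pixel disparity,
    i.e. how far a left-image patch must shift to match the right image."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))  # best-matching shift
    return disp
```

A real system would use a calibrated, optimized stereo matcher, but the principle of turning the camera pair into a per-pixel depth proxy is the same.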
- However, as shown in FIG. 13, an HMD 100A is also conceivable in which a distance measuring sensor 106 is provided on the connection member 102 and the depth/structure estimation unit 123 calculates the parallax map indicating the depth position of each pixel of the outside-world image based on its sensor output.
- Also, in the embodiment described above, the display left-eye and right-eye image data are separate from the captured image data of the outside-world left-eye and right-eye images obtained by the cameras 105L and 105R. However, an HMD 100B configured to use the captured left-eye and right-eye image data as the display left-eye and right-eye image data is also conceivable.
- FIG. 15 shows a configuration example of a monocular HMD 100C.
- portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and detailed description thereof is omitted.
- Since the HMD 100C is a monocular HMD, it has one spectacle lens unit 101, whereas the HMD 100 shown in FIG. 1 has the two spectacle lens units 101L and 101R.
- An infrared sensor 103 that functions in the same manner as the infrared sensors 103L and 103R in the HMD 100 shown in FIG. 1 is attached to the center position in the horizontal direction of the eyeglass lens unit 101 (the center position in the horizontal direction of the optical system).
- the sensor output of the infrared sensor 103 is sent to the eyeball position estimation unit 114.
- a gyro sensor 104 is attached to the eyeglass lens unit 101.
- the sensor output of the gyro sensor 104 is used to determine whether the image of the outside world is changing and whether the observer (user) is moving.
- the sensor output of the gyro sensor 104 is sent to the display control unit 124.
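A crude sketch of deciding from a gyro sample whether the observer is moving might be (the vector-magnitude test and the threshold value are assumptions, not from the patent):

```python
def is_moving(gyro_xyz, threshold=0.5):
    """Return True if the angular-rate magnitude (rad/s) of one
    gyro sample exceeds an assumed movement threshold."""
    x, y, z = gyro_xyz
    return (x * x + y * y + z * z) ** 0.5 > threshold
```

In practice the decision would be smoothed over many samples rather than taken per sample.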
- A camera 105 is attached to the spectacle lens unit 101 at the center position in the horizontal direction (the center position in the horizontal direction of the optical system).
- The camera 105 functions in the same manner as the cameras 105L and 105R in the HMD 100 shown in FIG. 1; it captures an image of the outside world observed by the observer's left eye or right eye through the spectacle lens unit 101 and outputs captured image data.
- This captured image data is sent to the structure estimation unit 123C.
- the eyeball position estimation unit 121 estimates the eyeball position of the observer's eye (left eye or right eye) based on the sensor output from the infrared sensor 103. Then, the line-of-sight estimation unit 122 estimates the line of sight of the observer based on the estimation result of the eyeball position in the eyeball position estimation unit 121. This line-of-sight estimation result is supplied to the display control unit 124.
- In the HMD 100C, the structure estimation unit 123C detects a flat area as an image superimposition area from the image of the outside world based on the captured image data from the camera 105. Then, the structure estimation unit 123C determines the display size and display position of the display image based on the detected flat area. In this case, for example, the display size and display position determined for each frame are smoothed in the time direction for stabilization. Information on the display size and display position determined by the structure estimation unit 123C is supplied to the display control unit 124.
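The per-frame smoothing mentioned above can be sketched as a simple exponential filter over the layout rectangle (the filter form and the `alpha` constant are assumptions; the patent only states that the values are smoothed in the time direction):

```python
class SmoothedLayout:
    """Exponential smoothing of the per-frame (x, y, w, h) display
    size and position, to stabilize the superimposed image."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # weight of the newest measurement
        self.state = None    # last smoothed (x, y, w, h)

    def update(self, measured):
        if self.state is None:
            self.state = measured          # first frame: adopt as-is
        else:
            a = self.alpha
            self.state = tuple(a * m + (1 - a) * s
                               for m, s in zip(measured, self.state))
        return self.state
```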
- the display control unit 124 controls display of the display image based on the line-of-sight estimation result of the line-of-sight estimation unit 122, the sensor output of the gyro sensor 104, and the display size and display position information determined by the structure estimation unit 123C.
- That is, the display control unit 124 basically performs control so that the display image is displayed at the display size and display position determined by the structure estimation unit 123C.
- the display control unit 124 changes the display method when the structure estimation unit 123C does not detect a flat region that is an image superimposition region.
- the display control unit 124 changes the display method when it is detected based on the gyro sensor 104 that there is a change in the image of the outside world.
- the display control unit 124 changes the display method based on the line-of-sight estimation result from the line-of-sight estimation unit 122. This control is performed according to the mode setting of the user (observer). For example, the user can set “automatic control mode”, “first control mode”, or “second control mode”.
- When the “first control mode” is set, the display image is controlled to be displayed in the area where the line of sight is concentrated. When the “second control mode” is set, the display image is controlled to be displayed in an area other than the area where the line of sight is concentrated.
- When the “automatic control mode” is set, the display image is controlled to be displayed outside the area where the line of sight is concentrated while the observer is moving, and in the area where the line of sight is concentrated while the observer is not moving.
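The mode logic above can be summarized in a small selector (the function and region names are illustrative only):

```python
def choose_region(mode, moving, gaze_region, other_region):
    """Pick where to place the display image, mirroring the
    first / second / automatic control modes described above."""
    if mode == "first":
        return gaze_region       # always follow the gaze area
    if mode == "second":
        return other_region      # always avoid the gaze area
    # "automatic": follow the gaze only while the observer is still
    return other_region if moving else gaze_region
```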
- Image data of the display image is supplied to the display image generation unit 125.
- captured image data from the camera 105 is supplied to the display image generation unit 125.
- Under the control of the display control unit 124, the display image generation unit 125 generates display image data so that the display image is displayed at the determined display size and display position. For example, the display image data is generated by performing reduction processing and movement processing on the image data supplied from the outside. When the display of the display image is stopped, generation of the image data is stopped. In addition, the display image generation unit 125 corrects the display image data so that the display state of the display image is changed according to the state of the area of the outside-world image on which the display image is displayed (see Equation (2)).
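Since Equation (2) itself is not reproduced on this page, the correction can only be hinted at; a minimal additive sketch, assuming the see-through optics simply add the outside-world light to the displayed light, might look like:

```python
import numpy as np

def correct_for_background(target, background, alpha=1.0):
    """Pre-compensate the display image so that, after the see-through
    optics add the outside-world light, the observer sees roughly
    `target`. The additive model and `alpha` are assumptions, not
    the patent's Equation (2)."""
    out = target.astype(np.float32) - alpha * background.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)  # stay in display range
```

Where the background is brighter than the target, the correction saturates at zero, which is why visibility can only be improved, not fully guaranteed, by this kind of subtraction.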
- the other parts of the HMD 100C shown in FIG. 15 are not described in detail, but they are configured in the same manner as the HMD 100 shown in FIG.
- the configuration for estimating the line of sight of the observer (user) is not limited to this configuration.
- the present technology is not limited to application to an optically transmissive head mounted display, and can be similarly applied to other transmissive display devices. In this case, it is not essential to display a virtual image.
- In addition, the present technology may also be configured as follows.
- (1) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that controls the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
- (2) The image display device according to (1), in which the image superimposition area is detected based on captured image data obtained by imaging the image of the outside world.
- (3) The image display device according to (1) or (2), in which the display control unit controls the display size and display position of the display image by processing, based on information on the image superimposition area, the image data for displaying the display image on the display element.
- (4) The image display device according to any one of (1) to (3), in which the display control unit changes the display state of the display image according to the state of the image superimposition area of the outside-world image.
- (5) The image display device according to (4), in which the display control unit corrects the image data for displaying the display image, according to the state of the image superimposition area, so that elements of the outside-world image are removed from the display image observed by the observer.
- (6) The image display device according to any one of (1) to (5), in which the display control unit changes the display method of the display image when the image superimposition area is not detected in the outside-world image.
- (7) The image display device according to any one of (1) to (6), in which the display control unit smooths, in the time direction, the display size and display position on the display element determined by the periodically detected image superimposition area, and acquires the display size and display position used for the control.
- (8) The image display device according to any one of (1) to (7), in which the display control unit changes the display method of the display image when a change in the outside-world image is detected.
- (9) The image display device according to any one of (1) to (8), in which the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- (10) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of controlling the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
- (11) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer; and a display control unit having a first control mode for controlling the display image to be displayed in an area of the outside-world image on which the observer's line of sight is concentrated, and a second control mode for controlling the display image to be displayed in an area other than that area.
- (12) The image display device according to (11), in which the display control unit performs control in the first control mode when the observer is not moving, and performs control in the second control mode when the observer is moving.
- (13) The image display device according to (11) or (12), in which the display control unit changes the display state of the image according to the state of the area of the outside-world image on which the display image is superimposed.
- (14) The image display device according to any one of (11) to (13), in which the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- (15) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of selectively performing control to display the display image in an area of the outside-world image on which the observer's line of sight is concentrated and control to display the display image in an area other than that area.
- (16) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that changes the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
- (17) The image display device according to (16), in which the display control unit acquires the state of the area of the outside-world image based on captured image data obtained by imaging the outside-world image.
- (18) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of changing the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
- (19) An image display device including: a first optical system that superimposes a left-eye image displayed on a first display element on an image of the outside world and guides it to an observer's left eye; a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye; and a display control unit that controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- (20) An image display method including: a step of superimposing, by a first optical system, a left-eye image displayed on a first display element on an image of the outside world and guiding it to an observer's left eye; a step of superimposing, by a second optical system, a right-eye image displayed on a second display element on the image of the outside world and guiding it to the observer's right eye; and a step of controlling the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- 100, 100A to 100C ... Head-mounted display; 101, 101L, 101R ... Spectacle lens unit; 102 ... Connection member; 103, 103L, 103R ... Infrared sensor; 104 ... Gyro sensor; 105, 105L, 105R ... Camera; 106 ... Distance measuring sensor; 111, 111L, 111R ... Display driver; 112, 112L, 112R ... Display; 121 ... Eyeball position estimation unit; 122 ... Line-of-sight estimation unit; 123 ... Depth/structure estimation unit; 123C ... Structure estimation unit; 124 ... Display control unit; 125 ... Display image generation unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Controls And Circuits For Display Device (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
A concept of the present technology resides in an image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that controls the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
Another concept of the present technology resides in an image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer; and a display control unit having a first control mode for controlling the display image to be displayed in an area of the outside-world image on which the observer's line of sight is concentrated, and a second control mode for controlling the display image to be displayed in an area other than that area.
Another concept of the present technology resides in an image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that changes the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
A further concept of the present technology resides in an image display device including: a first optical system that superimposes a left-eye image displayed on a first display element on an image of the outside world and guides it to an observer's left eye; a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye; and a display control unit that controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
1. Embodiment
2. Modifications
[Configuration example of an optically transmissive head-mounted display]
FIG. 1 shows a schematic configuration example of an optically transmissive head-mounted display (HMD) 100 as an embodiment. This configuration example is a binocular HMD. The HMD 100 has a left spectacle lens unit 101L and a right spectacle lens unit 101R. The spectacle lens units 101L and 101R are integrally connected by a connection member 102.
Display left-eye and right-eye image data are supplied to the display drivers 111L and 111R, respectively, and the left-eye image and the right-eye image are displayed on the displays 112L and 112R.
In the embodiment described above, the depth/structure estimation unit 123 calculates a parallax map indicating the depth position of each pixel of the outside-world image based on the captured image data from the cameras 105L and 105R. However, as shown in FIG. 13, an HMD 100A is also conceivable in which, for example, a distance measuring sensor 106 is provided on the connection member 102 and the depth/structure estimation unit 123 calculates the parallax map indicating the depth position of each pixel of the outside-world image based on its sensor output.
(1) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that controls the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
(2) The image display device according to (1), in which the image superimposition area is detected based on captured image data obtained by imaging the image of the outside world.
(3) The image display device according to (1) or (2), in which the display control unit controls the display size and display position of the display image by processing, based on information on the image superimposition area, the image data for displaying the display image on the display element.
(4) The image display device according to any one of (1) to (3), in which the display control unit changes the display state of the display image according to the state of the image superimposition area of the outside-world image.
(5) The image display device according to (4), in which the display control unit corrects the image data for displaying the display image, according to the state of the image superimposition area, so that elements of the outside-world image are removed from the display image observed by the observer.
(6) The image display device according to any one of (1) to (5), in which the display control unit changes the display method of the display image when the image superimposition area is not detected in the outside-world image.
(7) The image display device according to any one of (1) to (6), in which the display control unit smooths, in the time direction, the display size and display position on the display element determined by the periodically detected image superimposition area, and acquires the display size and display position used for the control.
(8) The image display device according to any one of (1) to (7), in which the display control unit changes the display method of the display image when a change in the outside-world image is detected.
(9) The image display device according to any one of (1) to (8), in which the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
(10) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of controlling the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
(11) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer; and a display control unit having a first control mode for controlling the display image to be displayed in an area of the outside-world image on which the observer's line of sight is concentrated, and a second control mode for controlling the display image to be displayed in an area other than that area.
(12) The image display device according to (11), in which the display control unit performs control in the first control mode when the observer is not moving, and performs control in the second control mode when the observer is moving.
(13) The image display device according to (11) or (12), in which the display control unit changes the display state of the image according to the state of the area of the outside-world image on which the display image is superimposed.
(14) The image display device according to any one of (11) to (13), in which the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
(15) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of selectively performing control to display the display image in an area of the outside-world image on which the observer's line of sight is concentrated and control to display the display image in an area other than that area.
(16) An image display device including: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that changes the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
(17) The image display device according to (16), in which the display control unit acquires the state of the area of the outside-world image based on captured image data obtained by imaging the outside-world image.
(18) An image display method including: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of changing the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
(19) An image display device including: a first optical system that superimposes a left-eye image displayed on a first display element on an image of the outside world and guides it to an observer's left eye; a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye; and a display control unit that controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
(20) An image display method including: a step of superimposing, by a first optical system, a left-eye image displayed on a first display element on an image of the outside world and guiding it to an observer's left eye; a step of superimposing, by a second optical system, a right-eye image displayed on a second display element on the image of the outside world and guiding it to the observer's right eye; and a step of controlling the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
101, 101L, 101R ... Spectacle lens unit
102 ... Connection member
103, 103L, 103R ... Infrared sensor
104 ... Gyro sensor
105, 105L, 105R ... Camera
106 ... Distance measuring sensor
111, 111L, 111R ... Display driver
112, 112L, 112R ... Display
121 ... Eyeball position estimation unit
122 ... Line-of-sight estimation unit
123 ... Depth/structure estimation unit
123C ... Structure estimation unit
124 ... Display control unit
125 ... Display image generation unit
Claims (20)
- An image display device comprising: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that controls the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
- The image display device according to claim 1, wherein the image superimposition area is detected based on captured image data obtained by imaging the image of the outside world.
- The image display device according to claim 1, wherein the display control unit controls the display size and display position of the display image by processing, based on information on the image superimposition area, the image data for displaying the display image on the display element.
- The image display device according to claim 1, wherein the display control unit changes the display state of the display image according to the state of the image superimposition area of the outside-world image.
- The image display device according to claim 4, wherein the display control unit corrects the image data for displaying the display image, according to the state of the image superimposition area, so that elements of the outside-world image are removed from the display image observed by the observer.
- The image display device according to claim 1, wherein the display control unit changes the display method of the display image when the image superimposition area is not detected in the outside-world image.
- The image display device according to claim 1, wherein the display control unit smooths, in the time direction, the display size and display position on the display element determined by the periodically detected image superimposition area, and acquires the display size and display position used for the control.
- The image display device according to claim 1, wherein the display control unit changes the display method of the display image when a change in the outside-world image is detected.
- The image display device according to claim 1, wherein the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- An image display method comprising: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of controlling the display size and display position of the display image on the display element so that the display image is displayed in an image superimposition area detected from the image of the outside world.
- An image display device comprising: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer; and a display control unit having a first control mode for controlling the display image to be displayed in an area of the outside-world image on which the observer's line of sight is concentrated, and a second control mode for controlling the display image to be displayed in an area other than that area.
- The image display device according to claim 11, wherein the display control unit performs control in the first control mode when the observer is not moving, and performs control in the second control mode when the observer is moving.
- The image display device according to claim 11, wherein the display control unit changes the display state of the image according to the state of the area of the outside-world image on which the display image is superimposed.
- The image display device according to claim 11, wherein the optical system includes a first optical system that superimposes a left-eye image displayed on a first display element on the image of the outside world and guides it to the observer's left eye, and a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye, and the display control unit controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- An image display method comprising: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of selectively performing control to display the display image in an area of the outside-world image on which the observer's line of sight is concentrated and control to display the display image in an area other than that area.
- An image display device comprising: an optical system that superimposes a display image displayed on a display element on an image of the outside world and guides it to an observer's eye; and a display control unit that changes the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
- The image display device according to claim 16, wherein the display control unit acquires the state of the area of the outside-world image based on captured image data obtained by imaging the outside-world image.
- An image display method comprising: a step of superimposing, by an optical system, a display image displayed on a display element on an image of the outside world and guiding it to an observer's eye; and a step of changing the display state of the display image according to the state of the area of the outside-world image on which the display image is superimposed.
- An image display device comprising: a first optical system that superimposes a left-eye image displayed on a first display element on an image of the outside world and guides it to an observer's left eye; a second optical system that superimposes a right-eye image displayed on a second display element on the image of the outside world and guides it to the observer's right eye; and a display control unit that controls the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
- An image display method comprising: a step of superimposing, by a first optical system, a left-eye image displayed on a first display element on an image of the outside world and guiding it to an observer's left eye; a step of superimposing, by a second optical system, a right-eye image displayed on a second display element on the image of the outside world and guiding it to the observer's right eye; and a step of controlling the parallax between the left-eye image and the right-eye image so that the depth position of a stereoscopic image perceived by the observer from the two images is in front of the depth position of the area of the outside-world image on which the stereoscopic image is superimposed and displayed.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014529404A JP6520119B2 (ja) | 2012-08-06 | 2013-07-17 | 画像処理装置および画像処理方法 |
CN201380040731.2A CN104509108A (zh) | 2012-08-06 | 2013-07-17 | 图像显示设备和图像显示方法 |
US14/418,533 US9618759B2 (en) | 2012-08-06 | 2013-07-17 | Image display apparatus and image display method |
US15/457,859 US10670880B2 (en) | 2012-08-06 | 2017-03-13 | Image display apparatus and image display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012173556 | 2012-08-06 | ||
JP2012-173556 | 2012-08-06 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/418,533 A-371-Of-International US9618759B2 (en) | 2012-08-06 | 2013-07-17 | Image display apparatus and image display method |
US15/457,859 Continuation US10670880B2 (en) | 2012-08-06 | 2017-03-13 | Image display apparatus and image display method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014024649A1 true WO2014024649A1 (ja) | 2014-02-13 |
Family
ID=50067884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069387 WO2014024649A1 (ja) | 2012-08-06 | 2013-07-17 | 画像表示装置および画像表示方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US9618759B2 (ja) |
JP (1) | JP6520119B2 (ja) |
CN (1) | CN104509108A (ja) |
WO (1) | WO2014024649A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104967837A (zh) * | 2015-06-30 | 2015-10-07 | 西安三星电子研究有限公司 | 用于调整三维显示效果的设备和方法 |
WO2015159847A1 (ja) * | 2014-04-18 | 2015-10-22 | シャープ株式会社 | シースルー型表示装置およびその表示方法 |
JP2016024273A (ja) * | 2014-07-17 | 2016-02-08 | 株式会社ソニー・コンピュータエンタテインメント | 立体画像提示装置、立体画像提示方法、およびヘッドマウントディスプレイ |
JP2016104603A (ja) * | 2014-12-01 | 2016-06-09 | 日本精機株式会社 | 車両用表示システム |
JP2019134341A (ja) * | 2018-01-31 | 2019-08-08 | 株式会社ジャパンディスプレイ | 表示装置及び表示システム |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101590777B1 (ko) * | 2014-12-16 | 2016-02-11 | 경북대학교 산학협력단 | 스테레오 비젼에서의 시차 보정장치 및 그 방법 |
CN107211118B (zh) * | 2014-12-31 | 2020-02-07 | 诺基亚技术有限公司 | 立体成像 |
US10757399B2 (en) * | 2015-09-10 | 2020-08-25 | Google Llc | Stereo rendering system |
US20170115489A1 (en) * | 2015-10-26 | 2017-04-27 | Xinda Hu | Head mounted display device with multiple segment display and optics |
KR101766756B1 (ko) * | 2015-11-20 | 2017-08-10 | 경북대학교 산학협력단 | 스테레오 비전 시스템의 렉티피케이션 장치 및 그 방법 |
KR20180061956A (ko) * | 2016-11-30 | 2018-06-08 | 삼성전자주식회사 | 눈 위치 예측 방법 및 장치 |
CN109725418B (zh) * | 2017-10-30 | 2020-10-16 | 华为技术有限公司 | 显示设备、用于调整显示设备的图像呈现的方法及装置 |
US11057667B2 (en) * | 2017-11-17 | 2021-07-06 | Gfycat, Inc. | Selection of a prerecorded media file for superimposing into a video |
US11057601B2 (en) * | 2017-11-17 | 2021-07-06 | Gfycat, Inc. | Superimposing a prerecorded media file into a video |
US10523912B2 (en) * | 2018-02-01 | 2019-12-31 | Microsoft Technology Licensing, Llc | Displaying modified stereo visual content |
US10845872B2 (en) * | 2018-02-09 | 2020-11-24 | Ricoh Company, Ltd. | Eye-gaze tracker, eye-gaze tracking method, and recording medium |
JP2020043387A (ja) * | 2018-09-06 | 2020-03-19 | キヤノン株式会社 | 画像処理装置、画像処理方法、プログラム、及び、記憶媒体 |
US10945042B2 (en) | 2018-11-19 | 2021-03-09 | Gfycat, Inc. | Generating an interactive digital video content item |
CN111580273B (zh) * | 2019-02-18 | 2022-02-01 | 宏碁股份有限公司 | 视频穿透式头戴显示器及其控制方法 |
US11315328B2 (en) * | 2019-03-18 | 2022-04-26 | Facebook Technologies, Llc | Systems and methods of rendering real world objects using depth information |
US11315483B2 (en) * | 2019-04-18 | 2022-04-26 | Google Llc | Systems, devices, and methods for an infrared emitting display |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11989344B2 (en) * | 2020-02-21 | 2024-05-21 | Maxell, Ltd. | Information display device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006154890A (ja) * | 2004-11-25 | 2006-06-15 | Olympus Corp | 情報端末装置及びシステム |
JP2006267604A (ja) * | 2005-03-24 | 2006-10-05 | Canon Inc | 複合情報表示装置 |
JP2010177788A (ja) * | 2009-01-27 | 2010-08-12 | Brother Ind Ltd | ヘッドマウントディスプレイ |
JP2010237522A (ja) * | 2009-03-31 | 2010-10-21 | Brother Ind Ltd | 画像提示システム及びこの画像提示システムに用いられるヘッドマウントディスプレイ |
JP2011091789A (ja) * | 2009-09-24 | 2011-05-06 | Brother Industries Ltd | ヘッドマウントディスプレイ |
JP2012108379A (ja) * | 2010-11-18 | 2012-06-07 | Nec Corp | 情報表示システム、装置、方法及びプログラム |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05328408A (ja) | 1992-05-26 | 1993-12-10 | Olympus Optical Co Ltd | ヘッド・マウンテッド・ディスプレイ |
JP3744984B2 (ja) * | 1995-10-04 | 2006-02-15 | キヤノン株式会社 | 情報表示装置 |
JP3802630B2 (ja) | 1996-12-28 | 2006-07-26 | オリンパス株式会社 | 立体画像生成装置および立体画像生成方法 |
JP3450792B2 (ja) * | 1999-03-25 | 2003-09-29 | キヤノン株式会社 | 奥行き画像計測装置及び方法、並びに複合現実感提示システム |
JP3492942B2 (ja) * | 1999-06-30 | 2004-02-03 | 株式会社東芝 | 装着型情報呈示装置および方法および記憶媒体 |
US6522474B2 (en) * | 2001-06-11 | 2003-02-18 | Eastman Kodak Company | Head-mounted optical apparatus for stereoscopic display |
JP4009851B2 (ja) * | 2002-05-20 | 2007-11-21 | セイコーエプソン株式会社 | 投写型画像表示システム、プロジェクタ、プログラム、情報記憶媒体および画像投写方法 |
EP1571839A1 (en) * | 2004-03-04 | 2005-09-07 | C.R.F. Società Consortile per Azioni | Head-mounted system for projecting a virtual image within an observer's field of view |
CN100359363C (zh) * | 2004-05-06 | 2008-01-02 | 奥林巴斯株式会社 | 头戴式显示装置 |
US20090040233A1 (en) * | 2004-06-10 | 2009-02-12 | Kakuya Yamamoto | Wearable Type Information Presentation Device |
JP2006121231A (ja) * | 2004-10-19 | 2006-05-11 | Olympus Corp | プロジェクタ |
KR101108435B1 (ko) * | 2005-05-31 | 2012-02-16 | 서강대학교산학협력단 | 의사윤곽 제거 방법 및 이 방법이 적용되는 디스플레이장치 |
JP4901539B2 (ja) * | 2007-03-07 | 2012-03-21 | 株式会社東芝 | 立体映像表示システム |
CN201145774Y (zh) * | 2008-01-23 | 2008-11-05 | 胡超 | 轻便式立体显示装置 |
CN102232294B (zh) * | 2008-12-01 | 2014-12-10 | 图象公司 | 用于呈现具有内容自适应信息的三维动态影像的方法和系统 |
US20110075257A1 (en) * | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
KR20110057629A (ko) * | 2009-11-24 | 2011-06-01 | 엘지전자 주식회사 | Ui 제공 방법 및 디지털 방송 수신기 |
JP2011216937A (ja) * | 2010-03-31 | 2011-10-27 | Hitachi Consumer Electronics Co Ltd | 立体画像表示装置 |
JP5685499B2 (ja) * | 2010-07-09 | 2015-03-18 | 株式会社東芝 | 表示装置、画像データ生成装置、画像データ生成プログラム及び表示方法 |
-
2013
- 2013-07-17 CN CN201380040731.2A patent/CN104509108A/zh active Pending
- 2013-07-17 WO PCT/JP2013/069387 patent/WO2014024649A1/ja active Application Filing
- 2013-07-17 US US14/418,533 patent/US9618759B2/en active Active
- 2013-07-17 JP JP2014529404A patent/JP6520119B2/ja not_active Expired - Fee Related
-
2017
- 2017-03-13 US US15/457,859 patent/US10670880B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006154890A (ja) * | 2004-11-25 | 2006-06-15 | Olympus Corp | 情報端末装置及びシステム |
JP2006267604A (ja) * | 2005-03-24 | 2006-10-05 | Canon Inc | 複合情報表示装置 |
JP2010177788A (ja) * | 2009-01-27 | 2010-08-12 | Brother Ind Ltd | ヘッドマウントディスプレイ |
JP2010237522A (ja) * | 2009-03-31 | 2010-10-21 | Brother Ind Ltd | 画像提示システム及びこの画像提示システムに用いられるヘッドマウントディスプレイ |
JP2011091789A (ja) * | 2009-09-24 | 2011-05-06 | Brother Industries Ltd | ヘッドマウントディスプレイ |
JP2012108379A (ja) * | 2010-11-18 | 2012-06-07 | Nec Corp | 情報表示システム、装置、方法及びプログラム |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015159847A1 (ja) * | 2014-04-18 | 2015-10-22 | シャープ株式会社 | シースルー型表示装置およびその表示方法 |
JP2016024273A (ja) * | 2014-07-17 | 2016-02-08 | 株式会社ソニー・コンピュータエンタテインメント | 立体画像提示装置、立体画像提示方法、およびヘッドマウントディスプレイ |
US10306214B2 (en) | 2014-07-17 | 2019-05-28 | Sony Interactive Entertainment Inc. | Stereoscopic image presenting device, stereoscopic image presenting method, and head-mounted display |
JP2016104603A (ja) * | 2014-12-01 | 2016-06-09 | 日本精機株式会社 | 車両用表示システム |
CN104967837A (zh) * | 2015-06-30 | 2015-10-07 | 西安三星电子研究有限公司 | 用于调整三维显示效果的设备和方法 |
US10531066B2 (en) | 2015-06-30 | 2020-01-07 | Samsung Electronics Co., Ltd | Method for displaying 3D image and device for same |
JP2019134341A (ja) * | 2018-01-31 | 2019-08-08 | 株式会社ジャパンディスプレイ | 表示装置及び表示システム |
JP7008521B2 (ja) | 2018-01-31 | 2022-01-25 | 株式会社ジャパンディスプレイ | 表示装置及び表示システム |
Also Published As
Publication number | Publication date |
---|---|
US20170276956A1 (en) | 2017-09-28 |
CN104509108A (zh) | 2015-04-08 |
US20150261003A1 (en) | 2015-09-17 |
US9618759B2 (en) | 2017-04-11 |
JP6520119B2 (ja) | 2019-05-29 |
US10670880B2 (en) | 2020-06-02 |
JPWO2014024649A1 (ja) | 2016-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014024649A1 (ja) | 画像表示装置および画像表示方法 | |
EP3029935A1 (en) | Holographic displaying method and device based on human eyes tracking | |
US9338370B2 (en) | Visual system having multiple cameras | |
JP6644371B2 (ja) | 映像表示装置 | |
EP2701390B1 (en) | Apparatus for adjusting displayed picture, display apparatus and display method | |
JP6248931B2 (ja) | 画像表示装置および画像表示方法 | |
US8208008B2 (en) | Apparatus, method, and program for displaying stereoscopic images | |
JPWO2018008232A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
US10992917B2 (en) | Image processing device, image processing method, program, and image processing system that use parallax information | |
CN108632599B (zh) | 一种vr图像的显示控制系统及其显示控制方法 | |
CN106570852B (zh) | 一种实时3d图像态势感知方法 | |
JP2007052304A (ja) | 映像表示システム | |
CN104967837A (zh) | 用于调整三维显示效果的设备和方法 | |
WO2017133160A1 (zh) | 一种智能眼镜透视方法及系统 | |
JP2019025082A (ja) | 画像処理装置、カメラ装置及び画像処理方法 | |
JP2019029876A (ja) | 画像処理装置、カメラ装置及び出力制御方法 | |
US20120320038A1 (en) | Three-dimensional image processing apparatus, method for processing three-dimensional image, display apparatus, and computer program | |
TWI589150B (zh) | 3d自動對焦顯示方法及其系統 | |
KR100439341B1 (ko) | 시각 피로 감소를 위한 스테레오 영상의 초점심도 조정장치 및 그 방법 | |
JP2015228543A (ja) | 電子機器および表示処理方法 | |
JPH11155154A (ja) | 立体映像処理装置 | |
JP6887824B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
KR101747929B1 (ko) | 입체영상의 화질을 개선하기 위한 장치 및 방법 | |
WO2012176683A1 (ja) | 画像表示装置および画像表示システム | |
JP2019029875A (ja) | 画像処理装置、カメラ装置及び画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13827311 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014529404 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14418533 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13827311 Country of ref document: EP Kind code of ref document: A1 |