WO2010061689A1 - Display device, terminal device, and display method - Google Patents
Display device, terminal device, and display method
- Publication number
- WO2010061689A1 (PCT/JP2009/067469)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display device
- display
- image
- image data
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
- G02B30/28—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays involving active lenticular arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/30—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers
- G02B30/31—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers involving active parallax barriers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/31—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
- H04N13/315—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being time-variant
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
- H04N13/359—Switching between monoscopic and stereoscopic modes
Definitions
- the present invention relates to a display device and a display method for displaying an image, and particularly to a display device, a terminal device and a display method for displaying a stereoscopic image.
- stereoscopic display devices have attracted attention as a new added value for portable devices.
- as a means for displaying a stereoscopic image, a method of projecting images having binocular parallax onto the left and right eyes is generally used, and there are stereoscopic display devices in which a display panel is provided with a lenticular lens or a parallax barrier as image distribution means.
- there is also a time-division type stereoscopic display device that includes two light sources which direct light to the right eye and the left eye, and that alternately projects left and right parallax images onto the right eye and the left eye (see, for example, Patent Document 1).
- such stereoscopic display devices are suitable for mounting on portable devices because they do not require special glasses, eliminating the trouble of wearing them.
- Mobile phones equipped with a parallax barrier type stereoscopic display device have been commercialized (for example, see Non-Patent Document 1).
- because the parallax images are spatially separated and projected, the positions from which the observer can see a proper stereoscopic image are limited.
- the area that the observer sees as a stereoscopic image is called a stereoscopic viewing area, and the stereoscopic viewing area is determined when the stereoscopic display device is designed.
- outside the stereoscopic viewing area, the left-eye image and the right-eye image appear to overlap (a so-called double image), or an image with the sense of depth reversed (so-called reverse viewing) is observed.
- FIG. 1 shows an example of an optical model for projecting parallax images to the left and right eyes of an observer in a parallax barrier type stereoscopic display device.
- FIG. 1 is a cross-sectional view as viewed from above the observer's head, and the observer's eyes (right eye 55R and left eye 55L) are located on an observation surface 30 that is an optimum observation distance OD away from the display surface of the display device.
- the positional relationship is such that the center of the display panel coincides with the center of both eyes of the observer.
- the display panel (not shown) is composed of light modulation element groups (for example, a liquid crystal panel) that are pixels arranged in a matrix.
- FIG. 1 shows the right-eye pixels 4R and left-eye pixels 4L arranged alternately; only the pixels at both ends and the center of the panel are shown.
- the parallax barrier 6 that functions as image distribution means is disposed in the back of the display panel as viewed from the observer.
- the parallax barrier 6 is a barrier (light-shielding plate) in which a large number of thin vertical stripe-shaped slits 6a are formed.
- in the parallax barrier 6, the slits 6a run perpendicular to the direction in which the left-eye pixels 4L and the right-eye pixels 4R are arranged on the display panel.
- a light source (not shown: so-called backlight) is installed further behind the parallax barrier, and the light emitted from the light source passes through the slit 6a, and the intensity is modulated at the pixels in the display panel and projected toward the observer.
- the projection direction of the right-eye pixel 4R and the left-eye pixel 4L is limited by the presence of the slit 6a.
- when the locus of the light passing through the nearest pixel, among the light emitted from each slit 6a, is illustrated as the light ray 20, a right eye region 70R where the projected images of all the right-eye pixels 4R overlap and a left eye region 70L where the projected images of all the left-eye pixels 4L overlap are obtained. Only the projected image from the right-eye pixels 4R can be observed in the right eye region 70R, and only the projected image from the left-eye pixels 4L can be observed in the left eye region 70L. Therefore, when the observer's right eye 55R is located in the right eye region 70R and the left eye 55L is located in the left eye region 70L, the parallax images are projected to the left and right eyes and the observer visually recognizes a stereoscopic image. In other words, the observer can observe the desired stereoscopic image when the right eye 55R is located in the right eye region 70R and the left eye 55L is located in the left eye region 70L.
- on the observation surface 30, the projected images (width P ′) are all designed to overlap.
- the width P ′ of the projected image is determined mainly by the distance h between the slit 6a and the pixel, the pixel pitch P, and the optimum observation distance OD.
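The relation above can be sketched numerically. The following is a minimal sketch, assuming each slit acts as a point source a distance h behind the pixel plane and that OD is measured from the panel surface (the text does not fix these conventions, so they are illustrative assumptions):

```python
def projected_width(P, h, OD):
    """Width P' of one pixel's projected image at the observation
    plane, treating each barrier slit as a point source located a
    distance h behind the pixel plane (similar-triangle geometry).
    OD is the optimum observation distance from the panel surface."""
    return P * (OD + h) / h

def slit_pixel_gap(P, e, OD):
    """Gap h that makes P' equal the binocular interval e,
    obtained by solving P * (OD + h) / h = e for h."""
    return P * OD / (e - P)

# illustrative values only (mm): 0.1 mm pixel pitch,
# 65 mm binocular interval, 300 mm observation distance
h = slit_pixel_gap(0.1, 65.0, 300.0)
print(h)                               # slit-to-pixel distance
print(projected_width(0.1, h, 300.0))  # recovers P' = e = 65.0
```

The second call confirms that the computed gap h reproduces the design condition P ′ = e.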
- P ′ is designed to be equal to the binocular distance e.
- when P ′ is smaller than the binocular interval e, the region where stereoscopic viewing is possible is limited to the width P ′.
- when P ′ is larger than the binocular interval e, only the area where both eyes together fall within the right eye region 70R or the left eye region 70L increases, so this does not help stereoscopic viewing.
- the shortest distance ND and the longest distance FD from the panel at which the observer can perform stereoscopic viewing are likewise determined by the binocular interval e together with the right eye region 70R and the left eye region 70L.
- the region in which the observer visually recognizes a stereoscopic image through the projection of parallax images is therefore determined not only by the right eye region 70R and the left eye region 70L optically determined by the image distribution means, but also by the observer's binocular interval e.
- the stereoscopic viewing area may be expressed by the area of the midpoint M between the right eye 55R and the left eye 55L of the observer.
- the stereoscopic viewing area 71 in this case is a rhombus (diamond-shaped region).
- the stereoscopic viewing area 71 of FIG. 2 is effective only when the parallel relationship between the plane including the eyes of the observer and the display panel surface is maintained.
- FIG. 3 shows an optical model in the case where the parallax barrier 6 functioning as image distribution means is arranged in front of the display panel as viewed from the observer. As in the example where the barrier is arranged behind the display panel, the observer is positioned at the optimum observation distance OD, and the projected images (width P ′) of the left and right pixels (width P) are designed to overlap on the observation surface 30. When the locus of the light passing through the closest slit 6a, among the light emitted from each pixel, is illustrated as the light ray 20, a right eye region 70R where the projected images of all the right-eye pixels 4R overlap and a left eye region 70L where the projected images of all the left-eye pixels 4L overlap are obtained.
- FIG. 4 shows a stereoscopic viewing area when a lenticular lens is used as the image distribution means.
- FIG. 4 is the same as FIG. 3 except for the image distribution means.
- FIG. 5 shows an optical model using a lenticular lens in the case where the observer is out of the stereoscopic viewing zone.
- FIG. 5 is a cross-sectional view seen from above the observer when the observer moves to the right and deviates from the stereoscopic viewing area 71 expressed using the midpoint M between the right eye 55R and the left eye 55L.
- the observer's right eye 55R is located outside the right eye region 70R, and the left eye 55L is located within the right eye region 70R.
- the light beam 20, which passes through the principal point (vertex) of the cylindrical lens 3a closest to each left-eye pixel 4L and right-eye pixel 4R, does not reach the position of the observer's right eye 55R.
- when the locus of the light passing through the adjacent principal point is illustrated as the light beam 21, a second left-eye region 72 is obtained. That is, in FIG. 5, the observer observes the projected image from the left-eye pixels 4L with the right eye 55R, and observes the projected image from the right-eye pixels 4R with the left eye 55L. In this state, the pop-out and the depth appear reversed (so-called reverse viewing), and the desired stereoscopic image cannot be observed.
- JP 2001-66547 A (pages 3-4, FIG. 6), JP-A-9-152668, JP 2000-152285 A, JP-A-11-72697
- an observer can adjust the display device to an optimal position for stereoscopic viewing by hand, but the display device itself may be tilted or moved by external factors such as device operation or vehicle shake.
- the observer's binocular position may deviate from the stereoscopic viewing area.
- double images and reverse viewing are not only uncomfortable for the observer; in some cases, repeated alternation between regular images and double images or reverse viewing causes fatigue, and symptoms such as dizziness and motion sickness may result.
- a general viewpoint tracking method requires a camera, an image processing function for detecting the position of the viewpoint, and an infrared irradiation device; because of the resulting increase in device size and the demand for advanced image processing capability, such a method is not suitable for a portable stereoscopic display device.
- An object of the present invention is to provide a display device, a terminal device, and a display method that solve the above-described problems.
- the display device of the present invention is a display device for displaying an image, wherein the movement of the display device is detected and the image is displayed in one of stereoscopic display and flat display according to the detected movement.
- the display method of the present invention is a display method for displaying an image on a display device, comprising detecting the movement of the display device and displaying the image in one of stereoscopic display and flat display according to the detected movement.
- according to the present invention, the movement of the display device is detected and the image is displayed in either stereoscopic display or flat display according to the detected movement; therefore, even if the display device moves contrary to the observer's intention and the observer leaves the stereoscopic viewing zone, the observer is prevented from observing a reverse-viewed image or a double image, and discomfort and fatigue can be avoided.
- It is a figure showing an optical model explaining the case where the observer moves out of the stereoscopic viewing area.
- It is a front view of the display device according to the present invention. It is a cross-sectional view of the display device according to the present invention. It is a functional block diagram of the display controller in the first embodiment of the present invention. It is an image of the image data in the first and second embodiments of the present invention. It is an optical model when the observer observes a parallax image on the display device of the present invention at the optimum position.
- It is an optical model when the width of the projected image of a pixel is not equal to the observer's binocular interval.
- It is an optical model when the distance between the center slit and the end slit of the image distribution means is not equal to the binocular interval.
- It is an optical model when the display device of the present invention moves along the X-axis.
- FIG. 6 is a front view of an example of a display device to which the present invention is applied.
- FIG. 7 is a cross-sectional view of the housing taken along line b in FIG. 6, as viewed from above the observer's head.
- the display panel 11, the image distribution means 13, the display controller 12, and the operation switch 14 are accommodated in the housing 10.
- the display panel 11 is a transmissive liquid crystal panel in which a plurality of unit pixels are formed in a matrix.
- in the display panel 11, the unit pixels aligned in the horizontal direction, which is parallel to the direction in which the observer's eyes are aligned, are used alternately as left-eye pixels 4L and right-eye pixels 4R. In FIG. 7, the left-eye pixels 4L and right-eye pixels 4R other than those at both ends and the center of the display panel 11 are not shown.
- the image distribution means 13 is an electro-optical element that displays a parallax barrier pattern, and for example, a transmissive liquid crystal panel is applicable.
- the transmissive portions serving as slits extend in the vertical direction of the display panel 11 and are positioned between the right-eye pixels 4R and the left-eye pixels 4L.
- the distance to the display panel 11 and the slit pitch determine the observer's optimum observation position; the design is such that the projected images from all the right-eye pixels 4R of the display panel 11 are projected onto the right eye 55R of the observer 50, and the projected images from all the left-eye pixels 4L of the display panel 11 are projected onto the left eye 55L of the observer 50.
- when the parallax barrier pattern is not displayed, the element does not function as a barrier, so the projected images from both the right-eye and left-eye pixels are projected onto both eyes of the observer, as in a normal panel display.
- the image distribution means 13 controls the projection of the image displayed on the display panel 11 from the display panel 11 to the outside.
- the display controller 12 has a function of driving the display panel 11, a function of controlling a barrier, and a function of detecting the movement of the housing 10 and determining stereoscopic vision.
- the display controller 12 will be described with reference to FIG.
- FIG. 8 is a functional block diagram of the display controller 12 in the first embodiment of the present invention.
- the display controller 12 includes an image generation unit 100, a detection unit 80, a determination unit 90, a display panel drive circuit 110, and an image distribution control circuit 111.
- the detection unit 80 includes a sensor for detecting a displacement that occurs as a result of the movement of the housing 10.
- the displacement of the housing 10 is a change in the tilt angle or the amount of movement. For example, if a sensor such as an acceleration sensor or a geomagnetic sensor is used, it can be calculated based on gravitational acceleration or geomagnetism.
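As one illustration of how a tilt angle might be computed from gravitational acceleration, the following is a minimal sketch assuming a 3-axis accelerometer whose axes are aligned with the device frame; the sensor model, axis naming, and function name are illustrative assumptions, not taken from the text:

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate tilt from a 3-axis accelerometer reading (m/s^2),
    assuming the device is near-stationary so the measured vector
    is dominated by gravity.  Returns (rotation about Y, rotation
    about X) in degrees; the axis assignment is an assumption."""
    rot_y = math.degrees(math.atan2(ax, az))  # lean left/right
    rot_x = math.degrees(math.atan2(ay, az))  # lean toward/away
    return rot_y, rot_x

# device at rest with gravity entirely along the Z axis: no tilt
print(tilt_angles(0.0, 0.0, 9.81))  # (0.0, 0.0)
```

In practice the raw samples would first be filtered, as described later, before such an estimate is formed.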
- the determination unit 90 includes a memory 91 that holds information such as the tilt angle and movement amount obtained from the sensor of the detection unit 80 and information about the stereoscopic viewing area of the display panel 11, and an arithmetic unit 92 that determines whether or not both eyes of the observer are within the stereoscopic viewing area from the information appropriately obtained from the sensor of the detection unit 80 and the information stored in the memory 91.
- the image generation unit 100 has a function of generating image data to be sent to the display panel 11 and includes an arithmetic unit 101, a data storage unit 102, a memory 103, and an external IF (Interface) 104. Further, the image generation unit 100 has a function of generating image data having parallax (3D data) or generating image data having no parallax (2D data) in accordance with a signal from the determination unit 90.
- Image data is generated by reading display target data in the data storage unit 102 by the arithmetic unit 101 and performing image processing.
- it is preferable that the display target data be three-dimensional data including depth information and that two-dimensional image data be generated by applying rendering processing in the arithmetic unit 101.
- 3D data used for stereoscopic display that is, two-dimensional image data for left and right eyes having parallax, is generated by setting two virtual viewpoints corresponding to the left and right eyes of the observer and performing rendering processing on each.
- 2D data used for flat display that is, image data having no parallax, is generated by setting one viewpoint corresponding to the center of both eyes of the observer and performing a rendering process.
- the resolution of each piece of two-dimensional image data generated for stereoscopic display is half that of the display panel 11. When the images of the generated image data are illustrated, the 3D data is as shown in FIG. 9A and the 2D data is as shown in FIG. 9B.
- image data is preferably generated from three-dimensional data including depth information.
- alternatively, display target data that has been subjected to rendering processing in advance may be stored in the data storage unit 102 and selectively read out. That is, a method may be used in which the data is accumulated in the form of two-dimensional data corresponding to FIGS. 9A and 9B, without depth information, and the data for stereoscopic or flat display is selected and read out accordingly. Since this method does not require rendering processing, the processing capability and calculation speed of the arithmetic unit 101 may be lower than when rendering is required. This has the advantage that the image generation unit 100 can be configured at low cost.
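The switching behaviour described above can be sketched as follows; the function name, data representation, and barrier flag are hypothetical, serving only to illustrate how the determination signal selects between the two data sets:

```python
def select_image_data(in_viewing_zone, data_3d, data_2d):
    """Mimic the image generation unit: return the parallax (3D)
    data when the determination unit reports that both eyes are
    inside the stereoscopic viewing zone, otherwise the flat (2D)
    data.  Also return the barrier enable flag that would be sent
    to the image distribution control circuit."""
    if in_viewing_zone:
        return data_3d, True   # enable the parallax barrier
    return data_2d, False      # disable the barrier -> flat display

# observer has left the viewing zone: fall back to flat display
frame, barrier_on = select_image_data(False, "3D-frame", "2D-frame")
print(frame, barrier_on)
```

The same selection logic applies whether the frames are rendered on demand or read from pre-rendered storage.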
- the image generation unit 100 generates 2D / 3D data according to the signal from the determination unit 90 and outputs the 2D / 3D data to the display panel drive circuit 110.
- the image generation unit 100 has a function of sending a signal for enabling the barrier to the image distribution control circuit 111 during stereoscopic display and a signal for disabling the barrier during planar display.
- the display panel drive circuit 110 has a function of generating the signals (synchronization signals and the like) necessary for driving the display panel 11, and the image distribution control circuit 111 has a function of generating the signal for displaying the parallax barrier pattern.
- instead of an electro-optical element that turns the parallax barrier pattern on and off, the image distribution means 13 may be, for example, an electro-optical element in which a lenticular lens is formed using a plurality of liquid crystal lenses that can be switched on and off by an electric signal.
- the computing units 92 and 101 exist independently in the determination unit 90 and the image generation unit 100, but the same computing unit may be used. Further, a processing function may be provided in an arithmetic unit that processes other functions (for example, communication control) of the portable display device to be applied, or in another processor.
- the image distribution means uses an electro-optical element that displays a parallax barrier pattern, but may be a means that constitutes a lenticular lens as described above.
- the parallax barrier may be replaced with the lenticular lens, and the slit may be replaced with the principal point of the lens.
- FIG. 10 is an example of an optical model when the observer 50 performs a stereoscopic view on the display device of the present invention, that is, when a parallax image is observed.
- the XYZ rectangular coordinate system is defined as follows.
- the direction in which the eyes of the observer 50 are lined up and the horizontal direction of the display panel 11 is defined as the X axis.
- the direction perpendicular to the X-axis within the projection plane of the display device (the plane in which the pixels are formed in a matrix) is defined as the Y-axis.
- the axis that intersects the projection plane of the display device at a right angle is taken as the Z-axis.
- the positive and negative directions of the XYZ axes are as shown in FIG.
- the observer 50 and the display device are in the most suitable positional relationship for stereoscopic viewing, and the distance between the image distribution means 13 and the observer's eyes 55R and 55L is the optimum observation distance OD.
- the XY plane separated by a distance OD is set as the optimum observation surface.
- the image distribution means 13 is functioning as a parallax barrier, and the distance between the center slit and the end slit is WS.
- the display panel 11 includes a plurality of unit pixels used alternately as left-eye pixels 4L and right-eye pixels 4R in the X-axis direction, but only the left-eye pixels 4L and right-eye pixels 4R at both ends and in the center portion are shown.
- the pitch (width) of unit pixels is P
- the width of the projected image projected from the slit located closest to each pixel is P ′.
- the light rays forming the projection image P ′ from the left-eye pixel and the right-eye pixel at both ends and the center of the display panel 11 are 22R, 23R, 24R, 25R, 22L, 23L, 24L, and 25L, respectively.
- it is preferable that the display panel 11 and the image distribution means 13 be designed so that the projected images P ′ of all the right-eye pixels overlap on the optimum observation plane and the projected images P ′ of all the left-eye pixels likewise overlap. Furthermore, it is preferable that P ′ be set equal to the observer's binocular interval e.
- the optimum observation distance OD is a design value.
- FIG. 11 shows an optical model when the width P ′ of the projected image of the pixel is not equal to the observer's binocular distance e.
- the stereoscopic viewing area is narrowed.
- by making P ′ larger than e, the right eye region 70R and the left eye region 70L can be expanded; however, since the observer's eyes cannot be placed at arbitrary positions, this does not expand the stereoscopic viewing area.
- on the other hand, the distance between the pixel and the parallax barrier can be made longer, which has the advantage of increasing the choice of components when designing the display device.
- FIG. 10 shows an optical model in which WS and the width P ′ of the projected image are designed to be equal, that is, WS and the binocular interval e of the observer are equal.
- FIG. 12A shows a case where WS > e
- FIG. 12B shows a case where WS < e.
- the right eye region 70R where the right eye projection image overlaps and the left eye region 70L where the left eye projection image overlaps are determined by the design conditions.
- the right eye region 70R and the left eye region 70L derived from the design conditions, or the right eye region 70R and the left eye region 70L obtained by measurement of the completed display device, are stored as stereoscopic viewing area data; if the data of the observer's binocular positions are acquired and compared with it using an arithmetic unit, it can be determined whether or not both eyes of the observer are located within the stereoscopic viewing area.
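As one way such a comparison might be implemented, the following is a minimal sketch that models the stereoscopic viewing area 71 as the rhombus of FIG. 2 and tests the midpoint M of the observer's eyes against it; the zone dimensions and the top-view (X, Z) coordinates are illustrative assumptions:

```python
def in_rhombus(p, center, half_w, half_h):
    """Test whether point p = (x, z) lies inside a rhombus-shaped
    stereoscopic viewing zone centred at `center`, with half-diagonal
    half_w along X and half-diagonal half_h along Z.  The standard
    rhombus test is |dx|/half_w + |dz|/half_h <= 1."""
    dx = abs(p[0] - center[0])
    dz = abs(p[1] - center[1])
    return dx / half_w + dz / half_h <= 1.0

# midpoint M exactly at the optimum observation position (mm)
print(in_rhombus((0.0, 300.0), (0.0, 300.0), 32.5, 60.0))
```

A polygonal zone other than a rhombus could be handled the same way with a general point-in-polygon test.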
- the stereoscopic viewing area and the observer's binocular position are in a relative relationship. Therefore, if the observer's eyes do not move from the optimum observation position, whether or not stereoscopic viewing is possible is determined according to the movement of the display device housing 10.
- the display device of the present invention stores the data of the stereoscopic viewing area, and determines whether or not the observer's eyes are located in the stereoscopic viewing area by detecting the movement of the display device housing 10.
- the rhombus boundary information shown in FIGS. 10 to 12A and 12B is determined by the light rays 22R, 23R, 24R, 22L, 23L, and 24L shown in FIG.
- the boundary information may be a polygon other than the diamond.
- the determination can be made without the polygon boundary information which is the right eye and left eye region information. An example will be described below.
- the positional relationship between the housing 10 and the observer 50 shown in FIG. 10 is taken as the optimum observation position, and stereoscopic viewing when the housing 10 is moved while the observer 50 remains stationary is described with reference to the drawings.
- FIG. 13 is a diagram showing the limit of stereoscopic vision when the casing 10 moves in parallel along the X axis.
- FIG. 13A is a diagram of movement in the positive (+) direction of the X-axis
- FIG. 13B is a diagram of movement in the negative (−) direction of the X-axis.
- the observer 50 can appropriately perform stereoscopic viewing when the right eye 55R is in the right eye region 70R and the left eye 55L is in the left eye region 70L. Therefore, the limit of the amount of movement in the positive (+) direction of the X axis is when both eyes overlap with the light rays 22R and 23R emitted from the display device.
- the limit of the amount of movement in the negative ( ⁇ ) direction of the X-axis is when both eyes overlap with the light rays 22L and 23L emitted from the display device.
- FIG. 14 is a diagram showing the limit of stereoscopic vision when the casing 10 moves in parallel along the Z axis.
- FIG. 14A is a view when the housing 10 moves in the positive (+) direction of the Z axis.
- FIG. 14B is a diagram when the housing 10 is moved in the negative ( ⁇ ) direction of the Z-axis.
- the limit of the amount of movement in the positive (+) direction of the Z-axis is when both eyes overlap with the light beams 23R and 23L emitted from the display device.
- the limit of the amount of movement in the negative (−) direction of the Z-axis is when both eyes overlap with the light rays 22R and 22L emitted from the display device.
- FIG. 15 shows the limit of stereoscopic vision when the housing 10 moves in the X-axis direction and the Z-axis direction while maintaining the parallel relationship between the display panel 11 surface and the observation surface including both eyes of the observer 50.
- FIG. 15A shows the case where the housing 10 moves in the positive (+) direction of the X-axis and the positive (+) direction of the Z-axis; the limit is when the light ray 23R and the right eye 55R overlap.
- FIG. 15B shows the case where the housing 10 moves in the negative (−) direction of the X-axis and the negative (−) direction of the Z-axis; the limit is when the light ray 22R and the right eye 55R overlap.
- the inclinations of the light rays 22R, 23R, 24R, 22L, 23L, and 24L with respect to the display surface are determined when the stereoscopic display device is designed. For this reason, if the amount of movement of the housing 10 from the optimum observation position is obtained, the possibility of stereoscopic viewing can be calculated.
- the above condition is when the housing 10 is not tilted, that is, when the surface of the display panel 11 and the surface where the observer's eyes are located maintain a parallel relationship.
- the limit of stereoscopic vision when the housing 10 is tilted needs to be calculated in consideration of the tilt angle.
- FIG. 16 is a diagram illustrating the limit of stereoscopic viewing when the housing 10 is rotated with the Y-axis, lying in the surface of the display panel 11, as the rotation axis.
- FIG. 16A is a diagram of rotation in the counterclockwise direction as viewed in the positive (+) direction of the Y-axis; the limit at which the observer 50 can appropriately perform stereoscopic viewing is when the light beam 23L and the left eye 55L overlap.
- FIG. 16B is a diagram when the Y axis rotates in the clockwise direction when viewed in the positive (+) direction, and the limits that the observer 50 can appropriately stereoscopically view are the ray 23R and the right eye 55R. Is when they overlap. Since the inclinations of the light beams 23R and 23L are determined by the design of the display device, if the inclination angle from the optimum observation position of the housing 10 is obtained, the possibility of stereoscopic viewing can be calculated.
- the stereoscopic vision is determined based on the amount of movement and the tilt angle of the housing 10 from the optimum observation position and the angles of the light rays 22R, 22L, 23R, and 23L determined by design with the panel surface. be able to.
- a case where a triaxial acceleration sensor is used will be described as an example of detection of an inclination angle and a movement amount.
- The output of the acceleration sensor contains various signals in addition to the tilt angle and movement amount we want to detect.
- The main ones are the acceleration component along the earth's axis due to gravity, and noise components from the use environment, such as vibration received simultaneously by the housing and by the person holding it. It is effective to remove noise components such as environment-induced vibration with a filter.
- As the filter, a digital filter is well suited. Depending on the environment and the characteristics of the user, however, a filter that exploits frequency-domain characteristics obtained by Fourier transform or wavelet transform is often effective. Below, the method of detecting the signal after these filtering processes is described.
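As an illustrative sketch only (the patent does not specify a particular filter; `smooth` and its window size are hypothetical), one simple digital filter for suppressing vibration noise in the accelerometer signal is a moving average:

```python
def smooth(samples, window=5):
    """Moving-average low-pass filter: one possible digital filter for
    damping vibration noise in acceleration sensor output.  Each output
    sample is the mean of up to `window` most recent input samples."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

# A constant signal passes through unchanged; alternating jitter is averaged out.
print(smooth([1.0, 1.0, 1.0, 1.0]))  # -> [1.0, 1.0, 1.0, 1.0]
```

A frequency-domain filter (Fourier or wavelet based, as the text mentions) would replace this averaging step but serve the same purpose.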
- FIG. 17 shows a state in which the acceleration sensor is installed in the housing 10.
- The coordinate system of the lenticular lens 3 on the panel and the coordinate system of the acceleration sensor are defined as shown in FIG. 17. That is, the observer, located in the positive direction of the Z axis (the direction with the arrow in the figure), observes the panel, which lies in the negative direction of the Z axis. Further, the upward direction of the panel as seen from the observer is the positive direction of the Y axis, and the downward direction is the negative direction of the Y axis.
- The panel is often used tilted from the vertical, and this state is shown in FIG. 18.
- FIG. 18 is a view in the plane containing the Y axis and the Z axis of FIG. 17; the tilt angle of the panel from the vertical is denoted θ, and the gravitational acceleration vector is denoted G.
- The amount of movement can be calculated by time-integrating the acceleration sensor output to obtain the velocity, and then time-integrating that velocity.
- the first point is noise accumulation due to integration.
- the second point is the influence of gravitational acceleration.
- the following two methods are effective for this noise.
- the first method is to use a filter for smoothing noise.
- The second method is to keep the integration time short. That is, if the integration time is short, Δt is small, so fluctuations in the movement amount due to noise are small. By summing the small movement amounts obtained over short integration times, the movement amount at a desired time can be calculated.
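The double time integration described above can be sketched as follows (a minimal illustration, not the patent's implementation; `displacement` is a hypothetical name, and simple Euler integration is assumed):

```python
def displacement(accels, dt):
    """Estimate displacement by double time integration of acceleration
    samples taken at interval dt.  Summing many short-interval steps,
    as the text suggests, keeps noise-driven drift per step small."""
    v = 0.0   # running velocity
    x = 0.0   # accumulated displacement
    for a in accels:
        v += a * dt          # first integration: acceleration -> velocity
        x += v * dt          # second integration: velocity -> displacement
    return x

# Constant 1 m/s^2 for 1 s in 0.01 s steps: close to 0.5*a*t^2 = 0.5 m
# (the small excess is Euler discretization bias).
print(round(displacement([1.0] * 100, 0.01), 3))  # -> 0.505
```

In practice each sample would first pass through the noise filter, and the initial-value update described later clears the accumulated integration error.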
- For the tilt angle, there are several possible rotations.
- Pitch is rotation around the X axis; that is, as seen from the observer, either the upper end (+Y side) or the lower end (−Y side) of the panel comes closer.
- Roll is rotation around the Y axis; that is, as seen from the observer, either the right end (+X side) or the left end (−X side) of the panel comes closer.
- Yaw is rotation around the Z axis; that is, the panel rotates about the observer's line of sight within the plane observed from the front.
- the pitch can be obtained as follows.
- FIG. 18 shows a plane including the Y axis and the Z axis, and the gravitational acceleration G in the plane.
- the displacement is only rotation around the X axis.
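A sketch of the pitch recovery (illustrative only; `pitch_from_gravity` is a hypothetical name, and the sign conventions follow the −G·cos(θ) and −G·sin(θ) gravity components stated in the text for the Y and Z axes):

```python
import math

G = 9.8  # gravitational acceleration [m/s^2]

def pitch_from_gravity(a_y, a_z):
    """Recover the panel tilt theta (rotation about the X axis) from the
    static gravity components on the sensor's Y and Z axes, assuming
    a_y = -G*cos(theta) and a_z = -G*sin(theta) as in FIG. 18."""
    return math.atan2(-a_z, -a_y)

theta = math.radians(30.0)
a_y = -G * math.cos(theta)
a_z = -G * math.sin(theta)
print(round(math.degrees(pitch_from_gravity(a_y, a_z)), 1))  # -> 30.0
```

Using `atan2` rather than a bare arccosine keeps the correct sign when the panel tips toward or away from the observer.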
- The method for obtaining the roll, which greatly affects the visibility of stereoscopic vision, is the same as the method for obtaining the pitch.
- Here the displacement is only rotation around the Y axis.
- The gravitational acceleration component in the Y-axis direction is the same as in FIG. 18 and remains −G·cos(θ).
- The gravitational acceleration component in the Z-axis direction is −G·sin(θ) in the state of FIG. 18 where there is no rotation around the Y axis, but when rotation around the Y axis occurs, it is divided into components along the rotated Z-axis and X-axis directions. This situation is shown in FIG. 19.
- the X-axis and Z-axis indicated by broken lines are the axial directions before the rotation around the Y-axis occurs.
- the component of the gravitational acceleration perpendicular to the Y axis is in the negative direction of the Z axis and is ⁇ G ⁇ sin ( ⁇ ).
- the X axis and the Z axis move to positions X ′ and Z ′ in FIG. 19, respectively.
- the gravitational acceleration component on the Z ′ axis is ⁇ G ⁇ sin ( ⁇ ) ⁇ cos ( ⁇ ).
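The roll recovery implied by this relation can be sketched as follows (illustrative only; `roll_from_gravity` is a hypothetical name, and it assumes the pitch θ is already known from the pitch calculation and the Z′ component equals −G·sin(θ)·cos(φ) as stated above):

```python
import math

G = 9.8  # gravitational acceleration [m/s^2]

def roll_from_gravity(a_z_prime, theta):
    """Recover the roll phi (rotation about the Y axis) from the gravity
    component measured on the rotated Z' axis, inverting
    a_z' = -G*sin(theta)*cos(phi); theta is the known pitch."""
    c = a_z_prime / (-G * math.sin(theta))
    c = max(-1.0, min(1.0, c))   # guard against rounding outside [-1, 1]
    return math.acos(c)

theta = math.radians(30.0)
phi = math.radians(20.0)
a_z_prime = -G * math.sin(theta) * math.cos(phi)
print(round(math.degrees(roll_from_gravity(a_z_prime, theta)), 1))  # -> 20.0
```

Note that arccosine alone cannot distinguish left from right rotation; the sign would come from the X′-axis component in a fuller implementation.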
- A three-axis acceleration sensor has been shown as an example, but it is obvious that pitch and roll can also be detected with a two-axis acceleration sensor.
- a geomagnetic sensor may be used for detecting the tilt angle
- an acceleration sensor may be used for detecting the movement amount.
- the method of detecting the tilt angle by the triaxial geomagnetic sensor can be explained by replacing the gravitational acceleration with the geomagnetism in the description of the tilt angle by the acceleration sensor described above.
- An angular velocity sensor or a gyro sensor can be used for detecting the tilt angle, and a small camera or an ultrasonic transmission source and an ultrasonic sensor can be used for detecting the movement amount.
- a sensor for detecting the movement of the housing 10 is also activated.
- Stereoscopic display means that the function of the image distribution means is turned on (for example, the parallax barrier pattern is displayed), image data having parallax as shown in FIG. 3 is sent to the display panel 11, and each image is projected onto the corresponding left or right eye of the observer.
- In step 1, the observer adjusts the position and tilt of the housing 10 so that the display of the reference screen can be viewed well stereoscopically.
- In step 2, the output of the detection unit 80, with the position and inclination of the housing 10 adjusted by the observer, is recorded as an initial value, and the desired stereoscopic display content is reproduced.
- In step 3, the movement amount and inclination angle over a predetermined period ΔT are calculated from the output of the detection unit 80 and the recorded initial value.
- In step 4, whether stereoscopic viewing is possible is determined from the calculated movement amount and inclination angle. The determination checks whether the calculated values exceed their respective preset thresholds: when the calculated movement amount is smaller than the preset movement amount threshold and the calculated tilt angle is smaller than the preset tilt angle threshold, stereoscopic viewing is judged possible. If stereoscopic viewing is judged possible, stereoscopic display is performed in step 5, and the process proceeds to step 7.
- If stereoscopic viewing is judged impossible, the display is switched to flat display in step 6.
- Flat display means that the function of the image distribution means is turned off (for example, the parallax barrier pattern is not displayed), image data without parallax is sent to the display panel 11, and an image without parallax is projected to the observer.
- After switching to flat display, the process returns to step 3 to calculate the movement amount and inclination angle for the next period ΔT.
- In step 7, it is determined whether or not to update the initial value used as the reference for calculating the movement of the housing 10. If the determination in step 7 is No, the process returns to step 3. If Yes, the process returns to step 2, and the output of the detection unit 80 at that time replaces the previously recorded initial value; that is, the initial value is updated.
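The loop of steps 2 through 6 can be sketched as follows (an illustrative sketch, not the patent's implementation; `read_sensor` stands in for the detection unit 80, and the scalar thresholds stand in for the design-derived limit conditions):

```python
def stereo_loop(read_sensor, move_thresh, tilt_thresh, steps=10):
    """Record an initial sensor value (step 2), then for each period dT
    compare the movement and tilt relative to that initial value against
    preset thresholds (steps 3-4) and choose 3D or 2D display (step 5/6).
    read_sensor() -> (position, tilt) is a hypothetical interface."""
    init_pos, init_tilt = read_sensor()            # step 2: initial value
    modes = []
    for _ in range(steps):                          # steps 3-6 repeated
        pos, tilt = read_sensor()
        movement = abs(pos - init_pos)              # step 3
        inclination = abs(tilt - init_tilt)
        if movement < move_thresh and inclination < tilt_thresh:
            modes.append("3D")                      # step 5: stereoscopic
        else:
            modes.append("2D")                      # step 6: flat display
    return modes

readings = iter([(0.0, 0.0), (1.0, 0.0), (8.0, 0.0)])
print(stereo_loop(lambda: next(readings), move_thresh=5.0,
                  tilt_thresh=5.0, steps=2))  # -> ['3D', '2D']
```

Step 7's initial-value update would simply re-run the first line with fresh sensor output.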
- The predetermined period ΔT is preferably set between about the frame period of the display panel 11 and about 0.2 seconds.
- If ΔT is long, switching from stereoscopic to flat display lags behind the movement of the housing 10, so the observer sees a reverse view or a double image before the display switches to flat.
- If ΔT is short, on the other hand, even if stereoscopic and flat display are switched multiple times within one frame period of the display panel 11, there is no time to switch the image data over the entire display screen. That is, making ΔT shorter than the frame period contributes little to how well the display tracks the movement of the housing 10.
- Step 7 is a function for responding to changes in the position and tilt of the housing that occur when the observer changes posture or changes the way the display device is held. Therefore, the determination in step 7 need not be performed on every pass.
- The number of passes may be counted and, when an appropriate count value is reached, the observer may input the determination from an operation switch or the like of the display device, or the determination may be set to Yes automatically when a predetermined count value is reached.
- When an acceleration sensor is used for detecting the movement amount, updating the initial value appropriately is preferable because the update also clears the accumulated integration error.
- The determination condition for stereoscopic vision in step 4 is, as described above with reference to FIGS. 13A, 13B to 16A, 16B, the stereoscopic vision limit condition derived from the right eye region 70R and the left eye region 70L, which are determined by the design of the display device, and from the optimum positions of the observer's eyes set likewise at design time.
- Alternatively, the determination condition derived from the design conditions may be applied only as the initial setting, and a function may be provided for storing the limit condition of stereoscopic vision that the observer finds by moving and tilting the housing 10 while performing stereoscopic viewing in step 1 (that is, a function of recording the movement amount and tilt amount at the limit of stereoscopic vision, or the related sensor outputs).
- This places some burden of work on the observer, but the observer's actual binocular distance and preferred observation distance, rather than the binocular distance and observation distance assumed as design parameters of the display device, are reflected, so switching between stereoscopic display and flat display suited to each observer becomes possible.
- If the recorded limit condition of stereoscopic vision is retained even when the display device is powered off, the observer need not record the limit condition each time the display device is used.
- The second embodiment has the same configuration as the first embodiment described above, and the method for determining whether both eyes of the observer are in the stereoscopic viewing area is also the same. However, the operation from the moment the observer's eyes are judged to be outside the stereoscopic viewing area and the display is switched to flat display until stereoscopic display is performed again is different. Specifically, once the display has switched to flat display, stereoscopic display resumes only after the position and inclination of the housing 10 return to the vicinity of the recorded initial values.
- The value in the vicinity of the initial value at which stereoscopic display resumes (hereinafter the 2D→3D return value) is chosen, for example, from large/medium/small options on the display screen (for example, "large" is ±10% of the initial value, "medium" is ±5% of the initial value, and "small" is ±2% of the initial value). That is, compared with the first embodiment, the second embodiment adds a function for setting the 2D→3D return value, and its operation differs accordingly.
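The large/medium/small options and the return check can be sketched as follows (illustrative only; the mapping table and `within_return_value` are hypothetical names, using the percentages given above):

```python
# Mapping of the on-screen large/medium/small options to the 2D->3D
# return tolerance around the recorded initial value.
RETURN_TOLERANCE = {"large": 0.10, "medium": 0.05, "small": 0.02}

def within_return_value(value, initial, option):
    """True when value is within +/-tolerance of the initial value, i.e.
    close enough to the recorded position/tilt to resume 3D display."""
    tol = RETURN_TOLERANCE[option] * abs(initial)
    return abs(value - initial) <= tol

print(within_return_value(104.0, 100.0, "medium"))  # -> True  (within ±5%)
print(within_return_value(104.0, 100.0, "small"))   # -> False (outside ±2%)
```

A tighter option makes the return to 3D stricter, trading convenience for a better-centered viewing position when stereoscopic display resumes.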
- FIG. 21 is a functional block diagram according to the second embodiment of the present invention.
- Like the first embodiment, the second embodiment is composed of the display panel 11, the image distribution unit 13, and the display controller 12.
- The display controller 12 includes the image generation unit 100, the detection unit 80, the determination unit 90, the display panel drive circuit 110, and the image distribution control circuit 111.
- the configuration of the second embodiment is the same as that of the first embodiment except that a 2D→3D return value setting 93 function is added to the determination unit 90 as described above.
- the method for determining whether the observer's eyes are in the stereoscopic viewing zone is also as described in the first embodiment.
- A determination region for the return value may be obtained by reducing, according to the set ratio, the right eye region 70R and the left eye region 70L shown in FIG. and using the reduced regions for the determination.
- a sensor for detecting the movement of the housing 10 is also activated.
- the reference screen is displayed to guide the observer to the optimum observation position.
- Stereoscopic display means that the function of the image distribution means is turned on (for example, the parallax barrier pattern is displayed), image data having parallax as shown in FIG. 3 is sent to the display panel, and each image is projected onto the corresponding left or right eye of the observer.
- In step 11, the observer adjusts the position and inclination of the housing 10 so that the display of the reference screen can be viewed well stereoscopically.
- In addition, the 2D→3D return value used when switching from flat display back to stereoscopic display is set (2D→3D return value setting 93).
- In step 12, the output of the detection unit 80, with the position and inclination of the housing 10 adjusted by the observer, is recorded as an initial value, and the desired stereoscopic display content is reproduced.
- In step 13, the movement amount and inclination angle over a predetermined period ΔT are calculated from the output of the detection unit 80 and the recorded initial value.
- In step 14, whether stereoscopic viewing is possible is determined from the calculated movement amount and inclination angle. As in the first embodiment, the determination checks whether the calculated values exceed their respective preset thresholds: when the calculated movement amount is smaller than the preset movement amount threshold and the calculated tilt angle is smaller than the preset tilt angle threshold, stereoscopic viewing is judged possible. If so, stereoscopic display is performed in step 15, and the process proceeds to step 17.
- If stereoscopic viewing is judged impossible, the display is switched to flat display in step 16.
- Flat display means that the function of the image distribution means is turned off (for example, the parallax barrier pattern is not displayed), image data without parallax is sent to the display panel, and an image without parallax is projected to the observer.
- After switching to flat display, the process proceeds to step 18 to calculate the movement amount and inclination angle for the period ΔT.
- In step 19, it is determined whether or not the calculated movement amount and inclination angle are within the 2D→3D return value.
- If they are within the 2D→3D return value, the display is switched back to stereoscopic display in step 15. If not, the process returns to step 18 while remaining in flat display. That is, steps 18 and 19 are repeated, and stereoscopic display is not restored until the 2D→3D return value is reached.
- After the output of the detection unit 80 falls within the 2D→3D return value and the display has switched back to stereoscopic display, the process proceeds to step 17.
- In step 17, it is determined whether or not to update the initial value used as the reference for calculating the movement of the housing 10. If the determination in step 17 is No, the process returns to step 13. If Yes, the process returns to step 12, and the output of the detection unit at that time replaces the previously recorded initial value; that is, the initial value is updated.
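This asymmetric switching is a hysteresis: 3D drops to 2D as soon as the viewing zone is left, but 2D returns to 3D only inside the tighter return zone. A sketch (illustrative only; `next_mode` and the boolean zone flags are hypothetical abstractions of the step 14 and step 19 checks):

```python
def next_mode(mode, in_view_zone, in_return_zone):
    """One step of the second embodiment's hysteresis: leaving the
    stereoscopic viewing zone switches 3D -> 2D immediately (steps
    14/16), but 2D -> 3D happens only once the housing is back inside
    the 2D->3D return zone around the initial value (steps 18/19 -> 15)."""
    if mode == "3D":
        return "3D" if in_view_zone else "2D"
    return "3D" if in_return_zone else "2D"

# Moving out of the zone drops to 2D; merely re-entering the viewing
# zone is not enough to come back -- the return zone must be reached.
mode = next_mode("3D", in_view_zone=False, in_return_zone=False)  # '2D'
mode = next_mode(mode, in_view_zone=True, in_return_zone=False)   # stays '2D'
mode = next_mode(mode, in_view_zone=True, in_return_zone=True)    # '3D'
print(mode)  # -> 3D
```

This prevents rapid 2D/3D flicker when the observer hovers near the edge of the viewing zone.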
- As described in the operation of the first embodiment, the predetermined period ΔT is preferably set between about the frame period of the display panel 11 and about 0.2 seconds.
- As described in the operation of the first embodiment, the determination in step 17 need not be performed on every pass. The number of passes may be counted and, when an appropriate count value is reached, the observer may input the determination from an operation switch or the like of the display device, or the determination may be set to Yes automatically when a predetermined count value is reached.
- As for the determination condition for stereoscopic vision in step 14, a function may likewise be provided with which the observer determines and stores the limit condition for stereoscopic vision.
- The second embodiment involves more complex processing and additional functions than the first embodiment.
- In the third embodiment, instead of the electro-optical element driven by an electric signal (for example, a transmissive liquid crystal panel that displays a parallax barrier pattern) used as the image distribution means in the first and second embodiments described above, an ordinary fixed optical element (such as a parallax barrier or a lenticular lens) is used.
- the configuration other than the image distribution unit is the same as that of the first embodiment.
- FIG. 23 shows a functional block diagram of the third embodiment.
- Like the first embodiment, the third embodiment is composed of the display panel 11, the image distribution means 13, and the display controller 12.
- Since the image distribution unit is an ordinary optical element as described above, the display controller 12 has a configuration in which the image distribution control circuit 111 is omitted from that of the first embodiment (FIG. 8).
- The display controller 12 of the third embodiment includes the image generation unit 100, the detection unit 80, the determination unit 90, and the display panel drive circuit 110; since the role of each component is the same as in the first embodiment, the description is omitted.
- However, the 2D data for flat display generated by the image generation unit 100 differs from that of the first embodiment.
- Because the image distribution function cannot be turned off, the unit pixels of the display panel are used alternately as right-eye and left-eye pixels in the horizontal direction during flat display, just as in stereoscopic display. Consequently, the horizontal resolution of each two-dimensional image generated for flat display is also half that of the display panel.
- The image data is preferably generated by rendering three-dimensional data that includes depth information.
- The data for stereoscopic display is rendered as virtual two-dimensional images corresponding to the observer's eyes.
- FIG. 24A shows 3D data
- FIG. 24B shows 2D data.
- As noted, image data is preferably generated from three-dimensional data including depth information, but display target data rendered in advance, as in FIGS. 24A and 24B, may instead be stored in the data storage unit 102.
- In that case, since the rendering process is not necessary, the processing capability and calculation speed of the calculator 101 may be lower than with a method that requires rendering. For this reason, there is the advantage that the image generation unit 100 can be configured at low cost.
- In the third embodiment, the image distribution means is not controlled when switching to flat display in step 6.
- Flat display in the third embodiment means sending the right-eye image shown in FIG. 24A to both the left and right pixels, sending the left-eye image to both, or sending the image data shown in FIG. 24B, thereby projecting an image without parallax to the observer. That is, the image data sent to both the left and right pixels is the same data.
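Because the lens or barrier is fixed, the panel columns always alternate between left-eye and right-eye pixels; flat display is obtained purely by what data is written to them. A sketch (illustrative only; `interleave` is a hypothetical helper working on one pixel row):

```python
def interleave(left_row, right_row):
    """Build one display-panel row by alternating left-eye and right-eye
    pixels in the horizontal direction, matching the fixed lenticular or
    parallax-barrier optics of the third embodiment."""
    row = []
    for l, r in zip(left_row, right_row):
        row += [l, r]
    return row

left = ["L0", "L1", "L2"]
right = ["R0", "R1", "R2"]
print(interleave(left, right))    # 3D: differing left/right images
print(interleave(right, right))   # flat: same image to both pixel sets
```

Sending the same image to both pixel sets (second call) is exactly the "same data to both the left and right pixels" flat display described above.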
- The flat-display data in step 16 may also be selected according to the movement of the housing 10 at the time of the determination, using the coordinate system of the descriptions of FIGS. 10 and 13A, 13B to 16A, 16B.
- When the movement direction is the negative direction of the X axis, or when the tilt is counterclockwise,
- the right-eye data may be sent to both the left and right pixels.
- When the movement direction of the housing 10 at the time of the determination is the positive direction of the X axis, or when the tilt is clockwise,
- the left-eye data may be sent to both the left and right pixels.
- The operation of the second embodiment can also be applied to the third embodiment.
- the functional block diagram is obtained by adding a 2D ⁇ 3D return value setting 93 function to the determination unit 90 as shown in FIG.
- The configuration of the third embodiment differs from that of the second embodiment in that there is no image distribution control circuit 111 and in the image data generated by the image generation unit 100. For the operation, the flowchart shown in FIG. 22 can be applied; except that the image distribution means is not controlled when switching to flat display (step 16) and that the image data applied for flat display differs, the operation is the same as in the second embodiment. As described above, the image data applied for flat display may be selected according to the movement and tilt direction of the housing 10, or according to the observer's dominant eye.
- The fourth embodiment is characterized by using a display panel that can project different images to the observer's right and left eyes from the portion corresponding to one pixel, the minimum display unit constituting an image of a standard flat display panel.
- Such a display panel is, for example, the time-division stereoscopic display panel described in Patent Document 1, or, for example, a display panel having twice the number of pixels in the horizontal direction (the direction in which the observer's eyes are aligned) compared with a standard flat display panel.
- An example of a display device to which the fourth embodiment is applied has the front view shown in FIG. 6, as described in the first embodiment.
- As shown in FIG. 7, which is a cross-sectional view of the housing 10 taken along line b in FIG. 6, the display device of the fourth embodiment houses the display panel 11, the image distribution unit 13, the display controller 12, and the operation switch 14 in the housing 10.
- The display panel 11 is a transmissive liquid crystal panel formed of a plurality of unit pixels, but it has the following characteristics compared with a standard flat display panel.
- FIG. 26 (a) is a schematic diagram of the pixel structure of a standard flat display panel, composed of pixels 4 arranged in a matrix of six in the horizontal direction and three in the vertical direction. Each pixel can express an arbitrary gradation, and a 6×3 image can be displayed according to the input data.
- The pixels 4 are represented by squares for convenience of explanation, but any shape may be used as long as the aspect ratio of the image represented by the 6×3 pixels is preserved.
- FIG. 26B is a schematic diagram of the pixel structure of an example display panel used in the fourth embodiment, drawn for comparison with FIG. 26 (a).
- Pixels 41, each with the shape obtained by vertically halving a pixel 4 of the standard flat display panel, are arranged in a matrix of 12 in the horizontal direction and 3 in the vertical direction.
- Each pixel 41 can express an arbitrary gradation and can display a 12×3 image.
- The display screen size is the same as that of the standard flat display panel shown in FIG. 26 (a).
- The pixels 41 are represented by rectangles for convenience of explanation, but any shape may be used as long as the vertical-to-horizontal aspect ratio is 2:1.
- The lenticular lens serving as the image distribution means 13 is arranged as shown in FIG. 27 so that the pixels 41 function alternately as left-eye pixels 4L and right-eye pixels 4R in the horizontal direction.
- the functional block diagram of the fourth embodiment is the same as FIG. 23 of the third embodiment.
- the fourth embodiment is different from the third embodiment in image data generated by the image generation unit 100.
- the display target data of the data storage unit 102 is three-dimensional data including depth information as in the above-described embodiment, and two-dimensional image data is generated by rendering processing by the computing unit 101.
- The data used for stereoscopic display, that is, the two-dimensional image data for the left and right eyes having parallax, is generated by setting two virtual viewpoints corresponding to the observer's left and right eyes and performing rendering for each.
- The 2D data used for flat display may be generated by setting one viewpoint corresponding to the center of the observer's eyes and performing rendering. Alternatively, since the display panel of the fourth embodiment has double the resolution in the horizontal direction, the right-eye data rendered for stereoscopic display, or the left-eye data rendered for stereoscopic display, may be used as the data for both the left and right eyes.
- Images of the 3D data and 2D image data to be generated are shown in FIG. 28 (a) to FIG. 28 (d).
- Alternatively, display target data that has already been rendered may be stored in the data storage unit 102 in the form of two-dimensional data corresponding to FIG. 28A, which does not include depth information. This format is often used for live-action content shot with two cameras. Two-dimensional data corresponding to FIGS. 28A and 28D may also be accumulated. In these cases, since the rendering process is unnecessary, there is the advantage that the arithmetic unit 101 and the memory 103 can be configured at low cost.
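The choice of panel data per mode can be sketched as follows (illustrative only; `panel_data` and the `dominant` parameter are hypothetical names, the latter reflecting the dominant-eye selection mentioned later):

```python
def panel_data(mode, left_img, right_img, dominant="right"):
    """Select the (left-pixel, right-pixel) data for the fourth
    embodiment's double-horizontal-resolution panel: distinct eye images
    for 3D, and a single eye image duplicated for flat display, so the
    displayed resolution is unchanged between the two modes."""
    if mode == "3D":
        return left_img, right_img
    img = right_img if dominant == "right" else left_img
    return img, img

l, r = panel_data("2D", "left.png", "right.png", dominant="left")
print(l == r, l)  # -> True left.png
```

Because the same rendered eye image is simply duplicated, no extra center-viewpoint rendering pass is needed for flat display.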
- a lenticular lens is used for the image distribution means 13, but a parallax barrier can also be used.
- By using a parallax barrier, although the brightness is inferior to that of a lenticular lens, the display device can be manufactured at lower cost.
- the method for determining whether or not both eyes of the observer are located within the stereoscopic viewing area is as described in the first embodiment.
- the operation of the fourth embodiment can be described in the same manner as in the third embodiment.
- The difference between the operation of the fourth embodiment and that of the third embodiment is only the flat-display data applied in step 6 of the flowchart of FIG. 20.
- the flat display means that the image data sent to the right-eye pixel 4R and the left-eye pixel 4L are the same data as in the third embodiment.
- the horizontal resolution of the applied image data is different from that of the third embodiment, and the image of the image data may be any of FIGS. 28B to 28D.
- The image data applied to the flat display may be selected according to the movement and tilt direction of the housing 10, or according to the observer's dominant eye.
- In the stereoscopic display device of the fourth embodiment, since the display panel 11 whose schematic diagram is shown in FIG. 26B is used, the resolution does not change between flat display and stereoscopic display. For this reason, when the display switches to flat display, for example from step 4 to step 6 in FIG. 20, the sense of incongruity caused in the first and second embodiments by the change in horizontal resolution does not occur.
- a right-eye pixel and a left-eye pixel are provided in the unit pixel in the horizontal direction, but the present invention is not limited to this.
- the functional block diagram in the fifth embodiment is the same as FIG.
- The only difference between the fifth embodiment and the fourth embodiment is that the determination unit 90 of the fifth embodiment has the 2D→3D return value setting 93 function described in the second embodiment; the description is therefore omitted.
- the operation is almost the same as in the second embodiment.
- The differences between the fifth embodiment and the second embodiment are that there is no on/off control of the image distribution means and that the image data applied in step 16 differs.
- the image data applied to the flat display may be any one of FIG. 28B to FIG. 28D as described in the fourth embodiment.
- the image data to be applied may be selected according to the movement and tilt direction of the housing 10 or may be selected according to the dominant eye of the observer.
- the sixth embodiment is characterized in that a display panel in which at least three or more viewpoint pixels are provided in the horizontal direction is used.
- each unit pixel can be used as each viewpoint pixel.
- That is, a display panel is used in which at least three viewpoint pixels are provided in the horizontal direction within the portion corresponding to one pixel, the minimum display unit constituting an image of a standard flat display panel.
- In other words, when the number of viewpoints is N, the display panel is characterized by using N pixels in the portion corresponding to one pixel of a standard flat display panel.
- The following description uses N = 4 as an example.
- In the display device to which the sixth embodiment is applied, as shown in FIG. 6, the display panel 11, the image distribution means 13, the display controller 12, and the operation switch 14 are housed in the housing 10. As described above, the display panel 11 has four pixels in the portion corresponding to one pixel, the minimum display unit of a standard flat display panel.
- FIG. 29 is a schematic diagram of a pixel structure of a display panel used in the sixth embodiment.
- Each pixel in FIG. 29 can express an arbitrary gradation (for example, as in a liquid crystal panel), and a 24×3 image can be displayed; however, each pixel's horizontal width is 1/4 of its height, so the display screen size is the same as that of the standard flat display panel composed of 6×3 pixels shown in FIG. 26 (a). In FIG. 29 the pixels are represented by rectangles for convenience of explanation, but any shape may be used as long as the vertical-to-horizontal aspect ratio is 4:1.
- Here the display panel is composed of 24×3 pixels for convenience of explanation, but the total number of pixels may be determined according to the purpose.
- these pixels function as a first viewpoint pixel 4D, a second viewpoint pixel 4C, a third viewpoint pixel 4B, and a fourth viewpoint pixel 4A in the horizontal direction.
- a lenticular lens is arranged as the image distribution means 13.
- FIG. 30 is a cross-sectional view of an optical model that projects an image toward an observation surface 30 that is parallel to the display panel surface and separated from it by the optimal observation distance OD.
- the display panel (not shown) is composed of a group of light modulation elements (for example, a liquid crystal panel) serving as the pixels arranged in a matrix, as described above.
- FIG. 30 illustrates only the state in which the first viewpoint pixel 4D, the second viewpoint pixel 4C, the third viewpoint pixel 4B, and the fourth viewpoint pixel 4A are arranged in order.
- the lenticular lens 3, which functions as the image distribution means, is disposed on the front surface (observation surface 30 side) of the display panel, and a light source (not shown: a so-called backlight) is disposed on the back surface of the display panel (the side opposite the lenticular lens 3).
- the lenticular lens 3 is a lens array in which a large number of cylindrical lenses 3a are arranged one-dimensionally.
- the cylindrical lens 3a is a one-dimensional lens having a semi-cylindrical convex portion, has no lens effect in the longitudinal direction, and has a lens effect only in the arrangement direction which is an orthogonal direction thereof.
- the lenticular lens 3 is arranged so that the lens effect of the cylindrical lenses 3a acts in the direction in which the first viewpoint pixel 4D, the second viewpoint pixel 4C, the third viewpoint pixel 4B, and the fourth viewpoint pixel 4A are aligned.
- One cylindrical lens 3a is provided for each set of four pixels, with the pixels 4D, 4C, 4B, and 4A as one set.
- the light emitted from each pixel is deflected and projected by the lenticular lens 3.
- attention is paid to the light passing through the principal point (vertex) of the nearest cylindrical lens 3a, which is illustrated as a light ray.
- a region 74D is obtained where the projection images from all the first viewpoint pixels 4D overlap, and a region 74C is obtained where the projection images from all the second viewpoint pixels 4C overlap.
- similarly, a region 74B is obtained from the third viewpoint pixels 4B and a region 74A from the fourth viewpoint pixels 4A.
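The decision of which viewing region an eye occupies on the observation surface can be sketched in one dimension as follows. This is deliberately simplified: the patent defines the regions optically (FIG. 30/31), whereas here four equal-width lobes centered on the panel axis are merely assumed; the lobe width, ordering along +x, and all names are illustrative assumptions.

```python
# Simplified 1-D sketch: which viewing region (74A..74D) does an eye at
# position x fall into on the observation surface? We assume four
# equal-width lobes of width lobe_width, spanning [-2E, 2E) and ordered
# 74D, 74C, 74B, 74A along +x. This geometry is an assumption for
# illustration, not taken from the patent's optical derivation.

REGIONS = ["74D", "74C", "74B", "74A"]

def region_of_eye(x, lobe_width):
    """Return the region name for eye position x, or None if outside all lobes."""
    left_edge = -2.0 * lobe_width
    idx = int((x - left_edge) // lobe_width)
    if 0 <= idx < 4:
        return REGIONS[idx]
    return None

def is_stereo_pair(x_left, x_right, lobe_width):
    """True when the left and right eyes sit in adjacent regions
    in the correct order (e.g. left eye in 74C, right eye in 74B)."""
    rl = region_of_eye(x_left, lobe_width)
    rr = region_of_eye(x_right, lobe_width)
    if rl is None or rr is None:
        return False
    return REGIONS.index(rr) - REGIONS.index(rl) == 1
```

With a lobe width of 1.0, eyes at x = -0.5 and x = +0.5 fall into regions 74C and 74B respectively, the configuration shown in FIG. 30.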
- FIG. 30 is a model diagram when the right eye 55R of the observer 50 is positioned in the region 74B and the left eye 55L is positioned in the region 74C.
- in this state the observer recognizes a stereoscopic image. The observer's right eye may instead be positioned in the region 74A and the left eye in the region 74B. As shown in FIG. 31, in the sixth embodiment the observer can enjoy various combinations of parallax images between the region 74A and the region 74D.
- since the images projected into the regions 74A to 74D are obtained by rendering the display target from four viewpoints, the observer can enjoy the stereoscopic image from different angles by changing the observation position; at the same time, because motion parallax is also provided, an enhanced stereoscopic effect is obtained.
- the image data is generated in the same manner as in the above-described embodiment.
- the display target data in the data storage unit 102 is three-dimensional data including depth information.
- it is preferable that the arithmetic unit 101 perform rendering processing to generate two-dimensional image data. The 3D data used for stereoscopic display, that is, four pieces of two-dimensional image data having parallax, are generated by setting four virtual viewpoints and performing rendering processing.
- the 2D data used for planar display, that is, image data having no parallax,
- may be generated by setting one viewpoint corresponding to the center of the observer's two eyes and performing rendering processing. Since the display panel of the sixth embodiment has four times the resolution in the horizontal direction, one of the four images rendered for stereoscopic display may instead be used as the 2D data for planar display.
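The viewpoint setup just described can be sketched as follows. The patent states only that four virtual viewpoints are set for stereoscopic rendering and that either a single center viewpoint is rendered for 2D or one of the four stereo views is reused; the baseline spacing, the camera representation, and the "nearest to center" selection rule below are illustrative assumptions.

```python
# Sketch of setting up N virtual viewpoints for rendering. Offsets are
# horizontal camera positions centred on x = 0; the actual spacing would
# depend on the panel design and is assumed here for illustration.

def virtual_viewpoints(n, baseline):
    """Return n horizontal camera offsets centred on x = 0."""
    return [baseline * (i - (n - 1) / 2.0) for i in range(n)]

def pick_2d_view(views):
    """Reuse one stereo view as the planar (no-parallax) image: take the
    view rendered nearest the centre of both eyes, which avoids a fifth
    render pass (an assumed selection rule)."""
    return min(views, key=lambda v: abs(v["offset"]))
```

For four viewpoints with a baseline of 10.0, the offsets are [-15.0, -5.0, 5.0, 15.0], and `pick_2d_view` selects one of the two inner views.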
- FIG. 32(a) and FIG. 32(b) show images of the 3D data and the 2D data to be generated.
- the display target data may be stored in the data storage unit as two-dimensional data corresponding to FIG. 32(a) that has been rendered in advance and does not include depth information. This makes it possible, for example, to handle live-action content shot with four cameras. Two-dimensional data corresponding to both FIG. 32(a) and FIG. 32(b) may also be accumulated. In these cases, since rendering processing is not required, the arithmetic unit and the memory can be configured at low cost.
- the image generation unit 100 generates either 2D or 3D image data according to the signal from the determination unit 90 and outputs it to the display panel drive circuit 110.
- a lenticular lens is used as the image distribution means 13, but a parallax barrier can also be used.
- although a parallax barrier is inferior to a lenticular lens in brightness, it has the advantage that the display device can be manufactured at a lower cost.
- the condition for determining whether the observer's eyes are located in the stereoscopic viewing area may be the boundary information of the rhombic areas 74A to 74D shown in FIG.
- combinations of the areas 74A to 74D shown in FIG. 31 are allowed.
- a position from which both can be seen, as shown in FIG. 33, is also allowed. Therefore, in the present embodiment it is preferable to determine the stereoscopic viewing condition so that the observer can match it to his or her preference.
- in the sixth embodiment, planar display means that the image data sent to the first through fourth viewpoint pixels is all the same.
- an image of this data is illustrated in FIG. 32(a) or FIG. 32(b).
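The difference between stereoscopic and planar drive of this panel can be sketched as the composition of one panel row: in stereoscopic mode the columns cycle through the four viewpoint images, while in planar mode the same image is written to all four viewpoint pixels, as described above. The list-of-lists image layout and the function name are illustrative assumptions.

```python
# Compose one row of the 4-viewpoint panel. views is a list of 4
# per-viewpoint row buffers (first..fourth viewpoint); width is the
# number of panel columns (4x the flat-panel column count).

def compose_row(views, width, stereo):
    if stereo:
        # stereoscopic display: column c shows view (c mod 4), sample (c div 4)
        return [views[c % 4][c // 4] for c in range(width)]
    # planar display: replicate one chosen view into every viewpoint pixel,
    # so all four viewpoint pixels of a unit carry the same data
    flat = views[0]
    return [flat[c // 4] for c in range(width)]
```

With four two-sample views D, C, B, A and eight columns, stereoscopic mode yields D0 C0 B0 A0 D1 C1 B1 A1, while planar mode yields D0 D0 D0 D0 D1 D1 D1 D1.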
- the stereoscopic vision determination in Step 4 of FIG. 20 applies, as its initial settings, determination conditions derived from the design conditions. Alternatively, while viewing stereoscopically, the observer may move or tilt the housing 10 to search for the limit of stereoscopic viewing, and that limit condition may be stored. As described earlier regarding the conditions to be determined, this method is particularly effective in the sixth embodiment.
- in one case, the first viewpoint data is sent to the four types of pixels;
- when the moving direction of the housing at the time of determination is the positive direction of the X axis, or when the tilt is clockwise,
- the fourth viewpoint data may be sent to the four types of pixels instead.
- alternatively, regardless of the moving direction or the tilting direction of the housing 10, the first and second viewpoint data, or the third and fourth viewpoint data, may be used;
- switching to such viewpoint data can feel more natural to the observer.
- the data that the observer felt to be natural was the display data that matched the observer's dominant eye.
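The selection of which single viewpoint's data to show in planar mode can be sketched as follows. The description suggests selecting by the housing's movement or tilt direction, or by the observer's dominant eye; the exact direction-to-viewpoint mapping and the dominant-eye-to-viewpoint mapping below are illustrative assumptions, not fixed by the patent.

```python
# Choose which viewpoint's data (index 0..3 = first..fourth) feeds all
# four viewpoint pixels when switching to planar display. The mappings
# here are assumed readings of the description, for illustration only.

def select_planar_view(motion_dx=None, dominant_eye=None):
    """Return the viewpoint index to use for the 2D (no-parallax) image."""
    if dominant_eye is not None:
        # assumed: left-dominant observers get a left-side inner view,
        # right-dominant observers a right-side inner view
        return 1 if dominant_eye == "left" else 2
    if motion_dx is not None:
        # assumed: moving toward +x (or tilting clockwise) -> fourth
        # viewpoint data; otherwise -> first viewpoint data
        return 3 if motion_dx > 0 else 0
    return 0  # default: first viewpoint data
```

A dominant-eye setting, when available, takes precedence here over the motion direction, mirroring the text's suggestion that dominant-eye-matched data feels most natural.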
- although the sixth embodiment has been described as a panel having pixels for four viewpoints, the number of viewpoints may be N. In that case, the image data generated by the image generation unit 100 is also prepared for N viewpoints.
- in the seventh embodiment, the operation of the second embodiment is applied to the configuration of the sixth embodiment described above; the operation from switching to planar display until stereoscopic display is performed again differs.
- the configuration is the same as that of the sixth embodiment except that the determination unit 90 has a 2D→3D return value setting function 93; its description is therefore omitted.
- the display panel used in the seventh embodiment is described as a panel having pixels for four viewpoints, as in the sixth embodiment; however, the number of viewpoints may be N, with image data prepared for N viewpoints.
- the operation of the seventh embodiment can be explained by the flowchart shown in FIG. 22 in the same manner as the operations of the third and fifth embodiments.
- the planar display data applied in step 16 is either FIG. 32(a) or FIG. 32(b), as in the sixth embodiment.
- a function may be provided for selecting the data applied when performing planar display in step 16 according to the movement and tilt direction of the housing 10, or according to the observer's dominant eye.
- the observer can enjoy a stereoscopic image from different angles; at the same time, motion parallax is also provided, yielding an enhanced stereoscopic effect.
- the present invention can be applied to portable information terminals (terminal devices) such as mobile phones, portable personal computers, portable game machines, and portable media players.
- the stereoscopic display device of the present invention detects the movement of the housing containing the display device and, in situations where stereoscopic display is not appropriate, projects an image without parallax, so that the viewer does not experience discomfort or symptoms such as dizziness or motion sickness.
- since the stereoscopic viewing area is determined from detection and calculation of the movement of the housing,
- a display device can be provided that is simple and inexpensive compared with the conventional line-of-sight tracking type, which requires a camera with an image processing function for viewpoint position detection and an infrared irradiation device.
Abstract
Description
A display device for displaying an image,
wherein the movement of the display device is detected and, according to the detected movement, the image is displayed in either stereoscopic display or planar display.
A display method for displaying an image on a display device, comprising:
a process of detecting the movement of the display device; and
a process of displaying the image in either stereoscopic display or planar display according to the detected movement.
(First Embodiment)
[Description of Configuration]
FIG. 6 is a front view of an example of a display device to which the present invention is applied.
[Description of Operation]
Next, the operation of the embodiment of the present invention will be described with reference to the flowchart of FIG. 20.
(Second Embodiment)
The second embodiment has the same configuration as the first embodiment described above, and the method of determining whether the observer's two eyes are in the stereoscopic viewing area is also the same. However, the operation differs from the point at which the observer's eyes are determined to be outside the stereoscopic viewing area and the display is switched to planar display, until stereoscopic display is performed again. More specifically, once the display has been switched to planar display, stereoscopic display is resumed only after the position and tilt of the housing 10 return to the vicinity of the recorded initial values. The value near the initial values at which stereoscopic display resumes (hereinafter called the 2D→3D return value) is preferably set by the observer according to preference; for example, large/medium/small options may be shown on the display screen (e.g., "large" is ±10% of the initial value, "medium" is ±5%, and "small" is ±2%). That is, compared with the first embodiment, the second embodiment adds a function for setting the 2D→3D return value, and its operation differs accordingly.
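The second embodiment's 2D→3D return behavior can be sketched as a simple gate: once the display falls back to planar mode, stereoscopic display resumes only when the housing returns to within a tolerance band around the recorded initial position or tilt. The large/medium/small bands (±10%, ±5%, ±2% of the initial value) come from the description; the class framing and names are illustrative assumptions.

```python
# Sketch of the 2D->3D return value gate of the second embodiment.
# The tolerance bands are those stated in the description; everything
# else (class shape, method names) is an assumption for illustration.

RETURN_BANDS = {"large": 0.10, "medium": 0.05, "small": 0.02}

class ReturnValueGate:
    def __init__(self, initial_value, band="medium"):
        # initial_value: the housing position/tilt recorded at setup
        self.initial = initial_value
        self.tolerance = RETURN_BANDS[band] * abs(initial_value)

    def may_resume_3d(self, current_value):
        """True when the housing is back near its recorded initial value."""
        return abs(current_value - self.initial) <= self.tolerance
```

The observer-selectable band acts as hysteresis: a wider band ("large") resumes stereoscopic display sooner, while a narrower band ("small") demands a closer return to the initial pose.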
(Third Embodiment)
The third embodiment is characterized by using, as the image distribution means, an ordinary optical element (a parallax barrier, a lenticular lens, or the like) instead of the electro-optical element used in the first and second embodiments, which can be switched on and off by an electrical signal (for example, a transmissive liquid crystal panel displaying a parallax barrier pattern). The configuration other than the image distribution means is the same as in the first embodiment.
(Fourth Embodiment)
The configuration of the fourth embodiment is characterized by using a display panel that can project different images to the observer's right eye and left eye from the portion corresponding to one pixel, the minimum display unit constituting an image of a standard flat display panel. Such a display panel is, for example, the time-division stereoscopic display panel described in Patent Document 1, or a display panel having twice the number of pixels of a standard flat display panel in the horizontal direction, the direction in which the observer's two eyes are aligned.
[Effects]
In the stereoscopic display device of the fourth embodiment, since the display panel 11 shown schematically in FIG. 26(b) is used, the resolution does not change between planar display and stereoscopic display. For this reason, when switching to planar display in steps 4 to 6 of FIG. 20, for example, the first and second embodiments produce a sense of incongruity because the horizontal resolution changes, whereas this embodiment has the effect that no such incongruity arises.
(Fifth Embodiment)
The fifth embodiment applies the operation of the second embodiment to the configuration of the fourth embodiment described above.
[Effects]
In the stereoscopic display device of the fifth embodiment, as in the fourth embodiment, planar display and stereoscopic display have the same resolution, so no sense of incongruity arises from a change in resolution; in addition, as in the second embodiment, there is the effect of suppressing the discomfort caused by frequent switching between stereoscopic display and planar display.
(Sixth Embodiment)
The sixth embodiment is characterized by using a display panel in which pixels for at least three viewpoints are provided in the horizontal direction. Although each unit pixel can be used as a viewpoint pixel, the description of this embodiment uses a display panel in which pixels for at least three viewpoints are provided in the horizontal direction within the portion corresponding to one unit pixel, the minimum display unit constituting an image of a standard flat display panel. That is, when the number of viewpoints is N, a display panel is used that is characterized by providing N pixels in the portion corresponding to one pixel, the minimum display unit constituting an image of a standard flat display panel.
[Description of Operation]
The operation of the sixth embodiment, like that of the third and fourth embodiments, can be explained with the flowchart shown in FIG. 20. The difference is the planar display data applied in step 6. In the sixth embodiment, planar display means making the image data sent to the first through fourth viewpoint pixels all the same. The image data is illustrated in FIG. 32(a) or FIG. 32(b).
(Seventh Embodiment)
The seventh embodiment applies the operation of the second embodiment to the configuration of the sixth embodiment described above; the operation from switching to planar display until stereoscopic display is performed again differs.
Claims (25)
- A display device for displaying an image,
wherein the display device detects its own movement and, according to the detected movement, displays the image in either stereoscopic display or planar display. - The display device according to claim 1, comprising:
a detection unit that detects the movement of the display device;
an image generation unit that generates and outputs either image data having at least two parallaxes or image data having no parallax, based on the magnitude relationship between the movement of the display device detected by the detection unit and a preset threshold;
a display panel composed of a plurality of unit pixels, which displays the image data output by the image generation unit; and
image distribution means for controlling the projection, from the display panel to the outside, of the image data displayed on the display panel. - The display device according to claim 2, further comprising
a determination unit that compares a value indicating the movement of the display device detected by the detection unit with a preset threshold,
wherein the image generation unit generates the image data having parallax when the determination unit determines that the value indicating the movement of the display device is smaller than the threshold, and otherwise generates the image data having no parallax. - The display device according to claim 3,
wherein the image distribution means comprises an electro-optical element. - The display device according to claim 4,
wherein the image distribution means is turned on when the determination unit determines that the value indicating the movement of the display device is smaller than the threshold, and is otherwise turned off. - The display device according to claim 3,
wherein the image generation unit generates the image data having parallax regardless of the determination result of the determination unit, and
the display panel displays, using the unit pixels, the image data having at least two parallaxes when the determination unit determines that the value indicating the movement of the display device is smaller than the threshold, and otherwise displays, using the unit pixels, one of the image data having parallax. - The display device according to claim 3,
wherein the display panel takes each unit pixel as a right-eye pixel or a left-eye pixel and displays the image data using stereoscopic pixel units each composed of at least two unit pixels. - The display device according to claim 3,
wherein the display panel provides a right-eye pixel and a left-eye pixel in the horizontal direction within each unit pixel, and displays the image data using stereoscopic pixel units each composed of one unit pixel. - The display device according to claim 3,
wherein the display panel provides, in the horizontal direction, viewpoint pixels each composed of at least three unit pixels as stereoscopic pixel units, and displays the image data using the stereoscopic pixel units. - The display device according to claim 3,
wherein the display panel provides at least three viewpoint pixels in the horizontal direction within one unit pixel as a stereoscopic pixel unit, and displays the image data using stereoscopic pixel units each composed of one unit pixel. - The display device according to claim 3,
wherein the detection unit detects an amount of movement of the display device,
the determination unit compares the amount of movement detected by the detection unit with a preset movement-amount threshold, and
the image generation unit generates the image data having parallax when the determination unit determines that the amount of movement is smaller than the movement-amount threshold, and otherwise generates the image data having no parallax. - The display device according to claim 11,
wherein the detection unit is an acceleration sensor, an ultrasonic sensor, or a small camera. - The display device according to claim 3,
wherein the detection unit detects a tilt angle of the display device,
the determination unit compares the tilt angle detected by the detection unit with a preset tilt-angle threshold, and
the image generation unit generates the image data having parallax when the determination unit determines that the tilt angle is smaller than the tilt-angle threshold, and otherwise generates the image data having no parallax. - The display device according to claim 13,
wherein the detection unit is an acceleration sensor, a geomagnetic sensor, or an angular velocity sensor such as a gyro sensor. - The display device according to any one of claims 5 to 10,
wherein the image generation unit develops, for display target data having depth information, an amount of parallax corresponding to the depth information. - The display device according to any one of claims 5 to 8,
wherein, for display target data having three images for the left eye, the center, and the right eye, the image generation unit uses the left-eye image data and the right-eye image data as the image data having at least two parallaxes, and uses the center image data as the image data having no parallax. - The display device according to claim 1,
wherein the stereoscopic display and the planar display have the same resolution as each other. - The display device according to claim 1,
wherein the stereoscopic display performs display from at least three viewpoints. - The display device according to claim 3,
wherein, when generating the image data having no parallax, the image generation unit selects and outputs an arbitrary viewpoint image according to the direction of movement of the display device detected by the detection unit. - The display device according to claim 3,
wherein, when generating the image data having no parallax, the image generation unit selects and outputs an arbitrary viewpoint image according to the observer's dominant eye, which is set externally. - A terminal device using the display device according to any one of claims 1 to 20.
- A display method for displaying an image on a display device, comprising:
a process of detecting the movement of the display device; and
a process of displaying the image in either stereoscopic display or planar display according to the detected movement. - The display method according to claim 22, comprising:
a process of comparing the detected movement of the display device with a preset threshold;
a process of generating either image data having at least two parallaxes or image data having no parallax, based on a result of the comparison; and
a process of displaying the generated image data. - The display method according to claim 23, comprising:
a process of detecting an amount of movement of the display device;
a process of comparing the detected amount of movement with a preset movement-amount threshold;
a process of generating the image data having parallax when, as a result of the comparison, the amount of movement is smaller than the movement-amount threshold; and
a process of generating the image data having no parallax when, as a result of the comparison, the amount of movement is not smaller than the movement-amount threshold. - The display method according to claim 23, comprising:
a process of detecting a tilt angle of the display device;
a process of comparing the detected tilt angle with a preset tilt-angle threshold;
a process of generating the image data having parallax when, as a result of the comparison, the tilt angle is smaller than the tilt-angle threshold; and
a process of generating the image data having no parallax when, as a result of the comparison, the tilt angle is not smaller than the tilt-angle threshold.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09828938A EP2365701A1 (en) | 2008-11-26 | 2009-10-07 | Display device, terminal device, and display method |
JP2010540421A JP5170254B2 (ja) | 2008-11-26 | 2009-10-07 | 表示装置、端末装置および表示方法 |
US13/129,753 US9122064B2 (en) | 2008-11-26 | 2009-10-07 | Display device, terminal device, and display method |
CN200980155402.6A CN102292998B (zh) | 2008-11-26 | 2009-10-07 | 显示设备、终端设备和显示方法 |
US14/816,178 US9880395B2 (en) | 2008-11-26 | 2015-08-03 | Display device, terminal device, and display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008300965 | 2008-11-26 | ||
JP2008-300965 | 2008-11-26 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/129,753 A-371-Of-International US9122064B2 (en) | 2008-11-26 | 2009-10-07 | Display device, terminal device, and display method |
US14/816,178 Continuation US9880395B2 (en) | 2008-11-26 | 2015-08-03 | Display device, terminal device, and display method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010061689A1 true WO2010061689A1 (ja) | 2010-06-03 |
Family
ID=42225569
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/067469 WO2010061689A1 (ja) | 2008-11-26 | 2009-10-07 | 表示装置、端末装置および表示方法 |
Country Status (5)
Country | Link |
---|---|
US (2) | US9122064B2 (ja) |
EP (1) | EP2365701A1 (ja) |
JP (2) | JP5170254B2 (ja) |
CN (3) | CN102292998B (ja) |
WO (1) | WO2010061689A1 (ja) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2355526A3 (en) | 2010-01-14 | 2012-10-31 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
JP5898842B2 (ja) | 2010-01-14 | 2016-04-06 | 任天堂株式会社 | 携帯型情報処理装置、携帯型ゲーム装置 |
JP5800501B2 (ja) | 2010-03-12 | 2015-10-28 | 任天堂株式会社 | 表示制御プログラム、表示制御装置、表示制御システム、及び、表示制御方法 |
US8384770B2 (en) | 2010-06-02 | 2013-02-26 | Nintendo Co., Ltd. | Image display system, image display apparatus, and image display method |
JP6021296B2 (ja) | 2010-12-16 | 2016-11-09 | 任天堂株式会社 | 表示制御プログラム、表示制御装置、表示制御システム、および、表示制御方法 |
JP2015038650A (ja) * | 2010-12-24 | 2015-02-26 | 株式会社東芝 | 情報処理装置及び情報処理方法 |
JP5770018B2 (ja) * | 2011-06-03 | 2015-08-26 | 任天堂株式会社 | 表示制御プログラム、表示制御装置、表示制御方法及び表示制御システム |
TWI509289B (zh) * | 2012-08-27 | 2015-11-21 | Innocom Tech Shenzhen Co Ltd | 立體顯示裝置及其影像顯示方法 |
CN102798982B (zh) * | 2012-08-31 | 2015-11-25 | 深圳超多维光电子有限公司 | 一种立体显示设备及立体显示控制方法 |
TWI463181B (zh) * | 2012-09-25 | 2014-12-01 | Au Optronics Corp | 三維顯示器系統及其控制方法 |
TWI454968B (zh) | 2012-12-24 | 2014-10-01 | Ind Tech Res Inst | 三維互動裝置及其操控方法 |
US9118911B2 (en) * | 2013-02-07 | 2015-08-25 | Delphi Technologies, Inc. | Variable disparity three-dimensional (3D) display system and method of operating the same |
KR102094124B1 (ko) * | 2013-03-25 | 2020-03-27 | 조세프 제거 | 진동하는 격자 기반의 3차원 공간 시각화 장치 |
KR20150093014A (ko) * | 2014-02-06 | 2015-08-17 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
FR3019553B1 (fr) | 2014-04-02 | 2020-07-31 | Arkema France | Nouvelle composition thermoplastique modifiee choc presentant une plus grande fluidite a l'etat fondu |
US10156723B2 (en) * | 2016-05-12 | 2018-12-18 | Google Llc | Display pre-distortion methods and apparatus for head-mounted displays |
EP3425907B1 (en) * | 2017-07-03 | 2022-01-05 | Vestel Elektronik Sanayi ve Ticaret A.S. | Display device and method for rendering a three-dimensional image |
JP2019066753A (ja) * | 2017-10-04 | 2019-04-25 | 三菱電機株式会社 | 立体画像表示装置 |
US10795176B2 (en) * | 2018-08-24 | 2020-10-06 | 3D Media Ltd | Three-dimensional display adapted for viewers with a dominant eye |
JP2021131490A (ja) * | 2020-02-20 | 2021-09-09 | キヤノン株式会社 | 情報処理装置、情報処理方法、プログラム |
US11538214B2 (en) * | 2020-11-09 | 2022-12-27 | Meta Platforms Technologies, Llc | Systems and methods for displaying stereoscopic rendered image data captured from multiple perspectives |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09152668A (ja) | 1995-02-22 | 1997-06-10 | Sanyo Electric Co Ltd | 立体映像表示装置 |
JPH1172697A (ja) | 1997-08-28 | 1999-03-16 | Nec Corp | 視点位置検出方法及び検出装置 |
JP2000152285A (ja) | 1998-11-12 | 2000-05-30 | Mr System Kenkyusho:Kk | 立体画像表示装置 |
JP2001066547A (ja) | 1999-08-31 | 2001-03-16 | Toshiba Corp | 立体表示装置 |
JP2004356997A (ja) * | 2003-05-29 | 2004-12-16 | Fuji Photo Film Co Ltd | 立体視画像管理装置および方法並びにプログラム |
JP2005006114A (ja) * | 2003-06-12 | 2005-01-06 | Sharp Corp | 放送データ送信装置、放送データ送信方法および放送データ受信装置 |
JP2005151080A (ja) * | 2003-11-14 | 2005-06-09 | Sanyo Electric Co Ltd | 立体映像表示装置 |
JP2005266293A (ja) * | 2004-03-18 | 2005-09-29 | Mitsubishi Electric Corp | 液晶表示装置及び画像表示システム |
JP2007047294A (ja) * | 2005-08-08 | 2007-02-22 | Nippon Hoso Kyokai <Nhk> | 立体画像表示装置 |
JP2007318184A (ja) * | 2004-08-18 | 2007-12-06 | Sharp Corp | 立体画像生成装置及びその立体画像生成方法 |
JP2008300965A (ja) | 2007-05-29 | 2008-12-11 | Funai Electric Co Ltd | 電話機 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3387624B2 (ja) * | 1994-05-20 | 2003-03-17 | キヤノン株式会社 | 立体ディスプレイ装置 |
EP0830034B1 (en) * | 1996-09-11 | 2005-05-11 | Canon Kabushiki Kaisha | Image processing for three dimensional display of image data on the display of an image sensing apparatus |
JP3397602B2 (ja) | 1996-11-11 | 2003-04-21 | 富士通株式会社 | 画像表示装置及び方法 |
JPH11234703A (ja) | 1998-02-09 | 1999-08-27 | Toshiba Corp | 立体表示装置 |
JP3646969B2 (ja) * | 1998-05-25 | 2005-05-11 | 富士通株式会社 | 3次元画像表示装置 |
JP4149037B2 (ja) | 1998-06-04 | 2008-09-10 | オリンパス株式会社 | 映像システム |
JP3969252B2 (ja) * | 2002-08-27 | 2007-09-05 | 日本電気株式会社 | 立体画像平面画像切換表示装置及び携帯端末装置 |
WO2004029701A1 (ja) * | 2002-09-26 | 2004-04-08 | Sharp Kabushiki Kaisha | 2d/3d切替型液晶表示パネル、および2d/3d切替型液晶表示装置 |
KR101017231B1 (ko) * | 2002-10-30 | 2011-02-25 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | 표시 장치 및 전자 기기 |
JP2004294861A (ja) | 2003-03-27 | 2004-10-21 | Sanyo Electric Co Ltd | 立体映像表示装置 |
KR100768837B1 (ko) * | 2003-04-17 | 2007-10-19 | 샤프 가부시키가이샤 | 3차원 화상 작성 장치, 3차원 화상 재생 장치, 3차원 화상 처리 장치, 3차원 화상 처리 프로그램을 기록한 기록 매체 |
WO2004107763A1 (ja) * | 2003-05-28 | 2004-12-09 | Sanyo Electric Co., Ltd. | 立体映像表示装置及びプログラム |
JP2005167310A (ja) | 2003-11-28 | 2005-06-23 | Sharp Corp | 撮影装置 |
JP2006023599A (ja) | 2004-07-09 | 2006-01-26 | Ts Photon:Kk | 2d/3d切換式ディスプレイシステム |
JP2006047507A (ja) | 2004-08-02 | 2006-02-16 | Sharp Corp | 表示装置及び表示方法 |
JP4619216B2 (ja) | 2005-07-05 | 2011-01-26 | 株式会社エヌ・ティ・ティ・ドコモ | 立体画像表示装置及び立体画像表示方法 |
JP4977995B2 (ja) | 2005-10-26 | 2012-07-18 | 日本電気株式会社 | 携帯表示装置 |
JP5006587B2 (ja) * | 2006-07-05 | 2012-08-22 | 株式会社エヌ・ティ・ティ・ドコモ | 画像提示装置および画像提示方法 |
KR100823197B1 (ko) * | 2007-03-02 | 2008-04-18 | 삼성에스디아이 주식회사 | 전자 영상 기기 및 그 구동방법 |
KR20080093637A (ko) * | 2007-04-17 | 2008-10-22 | 삼성전자주식회사 | 입체 영상 표시 장치 및 방법 |
-
2009
- 2009-10-07 US US13/129,753 patent/US9122064B2/en active Active
- 2009-10-07 CN CN200980155402.6A patent/CN102292998B/zh active Active
- 2009-10-07 CN CN201410718630.2A patent/CN104409032B/zh active Active
- 2009-10-07 CN CN201510160882.2A patent/CN104914587B/zh active Active
- 2009-10-07 EP EP09828938A patent/EP2365701A1/en not_active Withdrawn
- 2009-10-07 WO PCT/JP2009/067469 patent/WO2010061689A1/ja active Application Filing
- 2009-10-07 JP JP2010540421A patent/JP5170254B2/ja active Active
-
2012
- 2012-12-20 JP JP2012278159A patent/JP5497882B2/ja active Active
-
2015
- 2015-08-03 US US14/816,178 patent/US9880395B2/en active Active
Non-Patent Citations (1)
Title |
---|
NIKKEI ELECTRONICS, no. 838, 6 January 2003 (2003-01-06), pages 26 - 27 |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2395762A3 (en) * | 2010-06-10 | 2013-10-30 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
JP2012090256A (ja) * | 2010-09-22 | 2012-05-10 | Nikon Corp | 画像表示装置及び撮像装置 |
US9076245B2 (en) | 2010-09-22 | 2015-07-07 | Nikon Corporation | Image display apparatus and imaging apparatus |
CN102447919A (zh) * | 2010-10-08 | 2012-05-09 | 宏碁股份有限公司 | 三维视频图像调整方法及装置 |
US20120176369A1 (en) * | 2011-01-07 | 2012-07-12 | Nintendo Co., Ltd. | Computer-readable storage medium having information processing program stored therein, information processing method, information processing apparatus, and information processing system |
JP2011108256A (ja) * | 2011-01-07 | 2011-06-02 | Nintendo Co Ltd | 情報処理プログラム、情報処理方法、情報処理装置及び情報処理システム |
US9325961B2 (en) | 2011-01-07 | 2016-04-26 | Nintendo Co., Ltd. | Computer-readable storage medium having information processing program stored therein, information processing method, information processing apparatus, and information processing system |
CN102591487A (zh) * | 2011-01-07 | 2012-07-18 | 任天堂株式会社 | 信息处理方法、信息处理装置及信息处理系统 |
EP2475178A3 (en) * | 2011-01-07 | 2013-12-11 | Nintendo Co., Ltd. | Computer-readable storage medium having information processing program stored therein, information processing method, information processing apparatus, and information processing system |
CN102591487B (zh) * | 2011-01-07 | 2016-04-06 | 任天堂株式会社 | 信息处理方法、信息处理装置及信息处理系统 |
CN102790898A (zh) * | 2011-05-18 | 2012-11-21 | 索尼公司 | 显示控制设备,显示控制方法,程序和及记录介质 |
US9914056B2 (en) | 2011-06-03 | 2018-03-13 | Nintendo Co., Ltd. | Storage medium having stored therein an image generation program, image generation method, image generation apparatus and image generation system |
US9259645B2 (en) | 2011-06-03 | 2016-02-16 | Nintendo Co., Ltd. | Storage medium having stored therein an image generation program, image generation method, image generation apparatus and image generation system |
WO2013014024A1 (en) * | 2011-07-22 | 2013-01-31 | St-Ericsson Sa | Method for a mobile electronic device and such a mobile electronic device |
EP2549720A1 (en) * | 2011-07-22 | 2013-01-23 | ST-Ericsson SA | Method for a mobile electronic device and such a mobile electronic device |
US9398290B2 (en) | 2012-07-31 | 2016-07-19 | Nlt Technologies, Ltd. | Stereoscopic image display device, image processing device, and stereoscopic image processing method |
EP2693758A2 (en) | 2012-07-31 | 2014-02-05 | NLT Technologies, Ltd. | Stereoscopic image display device, image processing device, and stereoscopic image processing method |
JP2013069301A (ja) * | 2012-10-24 | 2013-04-18 | Nintendo Co Ltd | 情報処理プログラム、情報処理方法、情報処理装置及び情報処理システム |
US10567741B2 (en) | 2013-09-26 | 2020-02-18 | Tianma Microelectronics Co., Ltd. | Stereoscopic image display device, terminal device, stereoscopic image display method, and program thereof |
JP2018182745A (ja) * | 2013-09-26 | 2018-11-15 | Tianma Japan株式会社 | 立体画像表示装置及び端末装置 |
CN106454326A (zh) * | 2016-10-13 | 2017-02-22 | 张家港康得新光电材料有限公司 | 串扰值的测试装置 |
WO2019150880A1 (ja) * | 2018-01-30 | 2019-08-08 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
US11327317B2 (en) | 2018-01-30 | 2022-05-10 | Sony Corporation | Information processing apparatus and information processing method |
WO2020004275A1 (ja) * | 2018-06-26 | 2020-01-02 | 京セラ株式会社 | 3次元表示装置、制御コントローラ、3次元表示方法、3次元表示システム、および移動体 |
JPWO2020004275A1 (ja) * | 2018-06-26 | 2021-07-08 | 京セラ株式会社 | 3次元表示装置、制御コントローラ、3次元表示方法、3次元表示システム、および移動体 |
JP7145214B2 (ja) | 2018-06-26 | 2022-09-30 | 京セラ株式会社 | 3次元表示装置、制御コントローラ、3次元表示方法、3次元表示システム、および移動体 |
US11924400B2 (en) | 2018-06-26 | 2024-03-05 | Kyocera Corporation | Three-dimensional display device, control controller, three-dimensional display method, three-dimensional display system, and moving body |
JP2020072455A (ja) * | 2018-11-02 | 2020-05-07 | 京セラ株式会社 | 3次元表示装置、ヘッドアップディスプレイ、移動体、およびプログラム |
WO2020090712A1 (ja) * | 2018-11-02 | 2020-05-07 | 京セラ株式会社 | 3次元表示装置、ヘッドアップディスプレイ、移動体、およびプログラム |
JP7105173B2 (ja) | 2018-11-02 | 2022-07-22 | 京セラ株式会社 | 3次元表示装置、ヘッドアップディスプレイ、移動体、およびプログラム |
JP7433902B2 (ja) | 2019-04-26 | 2024-02-20 | Tianma Japan株式会社 | 表示装置 |
Also Published As
Publication number | Publication date |
---|---|
US20110221750A1 (en) | 2011-09-15 |
US9880395B2 (en) | 2018-01-30 |
CN104409032B (zh) | 2017-05-10 |
CN102292998A (zh) | 2011-12-21 |
CN104409032A (zh) | 2015-03-11 |
JP5170254B2 (ja) | 2013-03-27 |
CN104914587A (zh) | 2015-09-16 |
CN102292998B (zh) | 2015-05-06 |
EP2365701A1 (en) | 2011-09-14 |
JP2013128285A (ja) | 2013-06-27 |
US9122064B2 (en) | 2015-09-01 |
JPWO2010061689A1 (ja) | 2012-04-26 |
JP5497882B2 (ja) | 2014-05-21 |
CN104914587B (zh) | 2017-12-01 |
US20150338669A1 (en) | 2015-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5497882B2 (ja) | 表示装置、端末装置および表示方法 | |
JP5006587B2 (ja) | 画像提示装置および画像提示方法 | |
US9280951B2 (en) | Stereoscopic image display device, image processing device, and stereoscopic image processing method | |
JP4857732B2 (ja) | 仮想現実感生成システム | |
CN103595987A (zh) | 立体图像显示装置、图像处理装置及图像处理方法 | |
JP2015210297A (ja) | 立体画像表示装置,立体画像表示方法,及び立体画像表示プログラム | |
JP2008146221A (ja) | 画像表示システム | |
US20120120065A1 (en) | Image providing apparatus and image providing method based on user's location | |
SG181708A1 (en) | A system and method for producing stereoscopic images | |
JP6238532B2 (ja) | 画像表示装置及び画像表示方法 | |
US11012682B2 (en) | Linearly actuated display | |
JP4853241B2 (ja) | 車両用表示装置 | |
JP4929768B2 (ja) | 視覚情報呈示装置及び視覚情報呈示方法 | |
US20190141314A1 (en) | Stereoscopic image display system and method for displaying stereoscopic images | |
JP2008541165A (ja) | フラットパネルディスプレイを利用した3次元映像表示装置 | |
JPWO2005088386A1 (ja) | 立体表示装置及び立体表示方法 | |
JP2015037282A (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP6677387B2 (ja) | 立体画像表示装置及び立体画像表示方法 | |
KR101093929B1 (ko) | 깊이 지도를 이용하여 3차원 영상을 표시하는 방법 및 시스템 | |
WO2013031864A1 (ja) | 表示装置 | |
US20170052684A1 (en) | Display control apparatus, display control method, and program | |
JP5780224B2 (ja) | 立体表示装置および立体表示システム | |
JP2005099425A (ja) | 三次元表示装置 | |
JPH11355805A (ja) | 3次元画像の表示制御装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980155402.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09828938 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13129753 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2010540421 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2009828938 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009828938 Country of ref document: EP |