WO2013110297A1 - Autostereoscopic display and method of displaying a 3d image - Google Patents
- Publication number
- WO2013110297A1 (application PCT/EP2012/001886 / EP2012001886W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- strips
- pixels
- named
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/305—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
- H04N13/317—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
- H04N13/349—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
- H04N13/351—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
- H04N13/366—Image reproducers using viewer tracking
- H04N13/398—Synchronisation thereof; Control thereof
Definitions
- the invention relates to an autostereoscopic display for simultaneously displaying more than two different images in accordance with the preamble of the main claim and to a method of displaying a 3D image in accordance with the preamble of the independent claim which can be carried out using such a display.
- a generic display includes a pixel matrix having a multitude of pixels which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips belonging to the different subsets are interleaved such that strips and/or pixels of the different subsets alternate cyclically in the row direction.
- such a display comprises an optical element which is arranged in front of or behind the pixel matrix, which has a grid-like structure orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels such that, at a nominal distance in front of the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones, which are laterally offset relative to one another and of which each is associated with exactly one of the subsets, are defined such that the light emanating or transmitted from each of the subsets of pixels is directed in the viewing zone associated with this subset.
- Displays of this kind are known per se as so-called multiview displays.
- a respective one of a number, corresponding to the named plurality, of stereoscopic half-images is displayed on the named subgroups of pixels and a respective two of said half-images which are displayed on subgroups having directly adjacent strips combine pairwise to form a stereoscopic image.
- not only a single viewer, but also several viewers positioned next to one another in front of the display can each autostereoscopically perceive an image of the same scene which appears three-dimensional.
- a viewer can move in a lateral direction in front of the display without losing the three-dimensional impression. He will rather see the same scene from a perspective changing according to his movement.
- the object of the invention to propose a corresponding method of displaying 3D images on an autostereoscopic display which satisfies these demands.
- the proposed display therefore has a control unit for controlling the pixel matrix in dependence on image data which define a 3D image. This is configured to carry out the respective following steps for each of the strips of pixels for controlling the pixel matrix for an autostereoscopic viewing of the 3D image from a viewing distance in front of the screen differing from the nominal distance:
- the determination of the respective value of the location coordinate in dependence on the viewing distance is possible in this respect by a simple arithmetic operation and only represents a use of projective geometry.
- Various coordinate systems can be used as the basis for the definition of the location coordinate.
- the named location coordinate can also be selected as the coordinate x.
- the viewing distance can be represented as z and a height by the coordinate y.
- any parameterization, preferably a constant parameterization, of the named line can be used as the location coordinate.
- the named line is typically a section of a horizontal straight line orientated parallel to the display. It is, however, also possible that the line is slanted or curved.
- let the named distance designate a distance between the display and a defined point, typically a central point, on the line.
- let the line be defined as limited in its length - that is only covering a space of such a limited width in front of the display - such that the value of the location coordinate determined in the described manner is unambiguous. That is, light rays should not be taken into consideration which are conducted through the optical element such that they are not led through one of the viewing zones typically lying centrally in front of the display, but rather through so-called secondary zones.
- Each value of the location coordinate corresponds to a viewing position on the named line which is here only called a position.
- the respective view is defined as the two-dimensional view of the 3D image or of the scene displayed by the 3D image which results from this viewing position or from the direction of gaze thereby predefined.
- An actual - or virtual - camera position can be associated with each value of the named location coordinate.
- the view which corresponds to a direction of gaze from the position defined by the respective value of the location coordinate means a view which results or would result by a taking of the named scene from the camera position associated with this value of the location coordinate.
- the intensity values which are spoken of here can also be called brightness values or control values for the individual pixels. Therefore, they represent image information on the individual picture elements of the respective view to be displayed by the pixels.
- they can additionally contain color information or, in the case of a pixel matrix having pixels of different elementary colors, can depend on a color of the respective pixel - then usually called a subpixel.
- various processes known per se can be considered of which some will be outlined further below.
- the term "stereoscopic half-images" in the present document should designate respective views of a scene of which two combine to one stereoscopic image of this scene in that they correspond to views from - actual or virtual - camera positions or eye positions which are laterally offset to one another by approximately an average distance between eyes.
- the individual views should therefore also be called stereoscopic half-images.
- the display described can be a simple multiview display which is only equipped with a special control unit or an especially programmed control unit so that, in addition to the nominal distance, other viewing distances are also possible at least within certain limits which are freely choosable.
- the viewing zones in typical embodiments are expediently dimensioned so that their lateral distance in each case approximately corresponds to an average distance between the eyes - e.g. 65 mm. The named plurality can e.g. be 9 or even more.
- the pixel matrix can be provided e.g. by an LCD or by an OLED display.
- the optical element can in particular be a parallax barrier or a lenticular screen. A combination of these screen types is also possible.
- the grid-like structure is typically formed by a group of parallel cylindrical lenses.
- Barrier screens, in particular slot screens, can be used as the parallax barrier.
- the optical element can also be a Fresnel structure or an LC structure which reproduces a slot screen or another screen type.
- the pixels can be multicolor pixels or subpixels of different elementary colors - e.g. red, green and blue. In the last named case, typically three respective pixels or subpixels from three mutually following rows will combine to form one color-neutral or true-color picture element.
- the described configuration of the control unit of a corresponding display is particularly expedient if each of the strips from each of the rows of the pixel matrix contains at most one pixel, that is has a width of only one pixel. There is then namely no possibility of carrying out a lateral displacement of centers of brightness within the individual strips to adapt the control to the changed viewing distance.
- An advantageous method is also proposed for displaying a 3D image on an autostereoscopic display of the described type which achieves the object set.
- This method is a particular use of a display having a pixel matrix and an optical element arranged in front of or behind the pixel matrix, wherein the pixel matrix has a multitude of pixels which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips of the different subsets alternate cyclically in the row direction and wherein preferably each of the strips from each of the rows of the pixel matrix contains at most one pixel, and wherein the optical element has a grid-like structure orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels such that, at a nominal spacing from the display predefined by a geometry of the display, a number, corresponding to the
- the pixel matrix is controlled in the method in dependence on image data which define a 3D image for an autostereoscopic viewing of the 3D image from a viewing distance in front of the display which differs from the nominal distance.
- the method includes the respective following steps for each of the strips of pixels:
- a location coordinate which describes a lateral position of locations on a line orientated in the row direction and disposed at a defined height in the viewing distance in front of the display, wherein the value is respectively determined for the location at which light emanating or transmitted from the pixels of this strip is incident on the named line with the propagation direction imposed by the optical element;
- the location coordinate can be determined - in a scale which is as finely graduated as possible or even has no graduations - in each case so exactly that it adopts a number of different values for the different strips which is larger than the named plurality.
- the control unit of the proposed display can accordingly be configured to determine the location coordinate on a scale which is as finely graduated as possible or even has no graduations so exactly that it adopts a number of different values for the different strips which is larger than the named plurality.
- the location coordinate namely adopts a number, corresponding to the named plurality, of possible values of the location coordinate for those locations whose lateral positions correspond to the previously named viewing zones or which are disposed, viewed from the display, exactly in front of or behind these viewing zones.
- the location coordinate should therefore additionally also adopt or be able to adopt intermediate values between these discrete values.
- it can, however, be advantageous if in each case only a limited number of discrete intermediate values is permitted and if the location coordinate is rounded up or down to the respective next closest permitted value or intermediate value. It can thus be achieved that the number of the views required in total, or more exactly of the views of which at least individual image strips are required or may be required, remains manageable.
- the named views are typically defined so that, for a number, corresponding to the named plurality, of values, they correspond to a number of stereoscopic half-images corresponding to this plurality of which in each case two, which correspond to values closest from one another from this number of values, combine to one stereoscopic image. They are the half-images which are displayed on the named subgroups on a conventional control of the display or when the viewing distance corresponds to the nominal distance. In the present case of a viewing distance differing from the nominal distance, at least one of the views, which corresponds to an intermediate value of the location coordinate, has to be selected corresponding to a direction of gaze disposed between the directions of gaze of these two stereoscopic half-images.
- the location coordinate is determined in an expedient embodiment of the method such that, for a number, corresponding to the named plurality, of directly adjacent strips which extend centrally over the pixel matrix, it adopts the number of values named in the previous paragraph, whereas it adopts intermediate values for at least some of the strips disposed further outwardly.
- the control unit can accordingly be configured to determine the location coordinate in this manner. The control of the pixel matrix at the image center then does not differ or only differs insignificantly from the control provided for the nominal distance
- any desired rendering processes known per se can be used to determine the required intensity values for the different views. It is advantageously sufficient in each case in this respect if, for each of the views, the intensity values are only determined for the image strip or for those image strips for which the value of the location coordinate corresponding to this view has been determined.
- the required computation power therefore remains within limits which also makes the method usable for image sequences which are not defined in advance, but are defined in real time - for example in computer games or in the presentation of live shots which are filmed with stereoscopic cameras.
- the intensity values are determined for the different views in that image information for the required image strip or strips of the respective view are determined from a depth map - or from several depth maps - defined by the named image data and from texture values defined by the image data for area segments of a surface represented by the depth map. Details on such a process for acquiring image information of a view not present in advance can be seen e.g. from document DE 10 2010 028 668 A1.
- the intensity values for the different views can be determined in that disparities between at least two stereoscopic half-images, which are defined by the image data, are determined and image information is determined for the required image strip or strips in that the view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate by interpolation and/or by transformation. Instructions on how this can be done can be found e.g. in the document US 6,366,281 B1. Such processes are also called "morphing".
- a further possibility comprises the fact that the intensity values for the different views are determined in that disparities between at least two stereoscopic half-images, which are defined by the image data, are determined, from which disparities a depth map is calculated and, using this depth map, image information is determined for the required image strip or strips of the respective view.
- the control unit of the proposed display can be configured, for determining the intensity values for the different views, to determine the required image information in one of the three manners just described: from a depth map and texture values defined by the image data; by defining the respective view as an intermediate image between stereoscopic half-images in dependence on disparities; or using a depth map calculated from such disparities.
- the display can include a tracking device for determining a distance between the eyes of at least one viewer and the display and the control unit can be configured to control the pixel matrix for the viewing distance corresponding to this distance.
- the control unit is therefore then configured to control the pixel matrix - if the measured distance differs from the nominal distance - in the previously described manner so that the named viewing distance used as the basis for the control corresponds to the distance determined by the tracking device.
- a distance of an eye pair of at least one viewer from the display is therefore detected, wherein the viewing distance is selected as corresponding to the spacing thus detected for determining the values of the location coordinate for the different strips.
- images of a space in front of the display taken by a stereoscopic camera can e.g. be evaluated using a suitable image evaluation process.
- Fig. 1 in a schematic representation, a plan view of an autostereoscopic display and a viewing space in front of this display;
- Fig. 2 a detail of a pixel matrix of the display of Fig. 1 in a front view; and
- Fig. 3 in a representation corresponding to Fig. 1, the same display, with here some components of the display having been omitted and only some beam paths being drawn by way of example to explain the proposed control of the display.
- An autostereoscopic display is shown in Fig. 1 which is in particular suitable as a multiview display to display a plurality of different images, nine in the present case, simultaneously.
- This display has a pixel matrix 11 and an optical element 12 arranged in front of the pixel matrix 11.
- the display includes a control unit 13 for controlling the pixel matrix 11 in dependence on image data 14 which define a 3D image. Typically, this 3D image will vary over time so that it is more precisely an image sequence.
- the image data 14 can in this respect be stored e.g. on a data carrier and can be read from there or can be defined by a computer game in dependence on its course.
- the pixel matrix 11 is an LCD or an OLED display having a multitude of pixels 15 which are arranged in different rows.
- A detail of this pixel matrix 11 is shown in Fig. 2.
- the individual pixels 15 are each shown by rectangles there.
- the pixels 15 are subpixels of the elementary colors red, green and blue - marked in Fig. 2 by the letters R, G and B respectively.
- a plurality of disjoint subsets of pixels 15, nine in the present case - the plurality could naturally also be larger or smaller - are defined on the pixel matrix 11 such that each of these subsets forms a group of parallel strips.
- the subsets are numbered continuously from 1 to 9 and in Fig. 2 the pixels 15 are each provided with the number of the subset to which the pixel 15 belongs.
- the named strips include a non-zero angle with the rows, with the strips of the different subsets alternating cyclically in the row direction and with each of the strips not containing more than one pixel 15 in each of the rows.
- the optical element 12 can e.g. be designed as a slot screen or as a lenticular screen and has a grid-like structure which is oriented parallel to the strips and which is indicated by dashed lines in Fig. 2.
- d = 9b Dn/(Dn+a) applies to a period d of this structure in the lateral direction - corresponding to the row direction - where b is a lateral distance between the area centers of adjacent pixels 15, a designates a distance between the pixel matrix 11 and the optical element 12 and Dn stands for a so-called nominal distance.
- the optical element 12 in each case thereby defines a respective defined propagation direction for light emanating or transmitted from the pixels 15.
- a number, corresponding to the previously named plurality, of nine viewing zones 16 offset laterally relative to one another are defined so that each of the viewing zones 16 is associated with exactly one of the subsets, and such that light emanating or transmitted from each of the subgroups of pixels 15 is directed into the viewing zone 16 associated with this subset.
- This is illustrated in Fig. 1 by a respective dashed line for two extremely outwardly disposed pixels 15 of the subgroup 2. Modifications in which the optical element 12 is arranged behind the pixel matrix 11 are just as possible.
- the viewing zones 16 are shown with their diamond-shaped cross-section in Figure 1 and are numbered continuously from 1 to 9 in accordance with the subgroups.
- the mutually adjacent viewing zones 16 are each mutually offset laterally by about 65 mm.
- a respective one of nine stereoscopic half-images is displayed on each of the subgroups of pixels 15 so that one of these stereoscopic half-images is visible from each of the viewing zones 16.
- the stereoscopic half-images are then selected so that the two stereoscopic half-images visible from directly adjacent viewing zones 16 each combine to form one stereoscopic image corresponding to a view of the 3D image thus displayed.
- One or more viewers can then each see one of these views of the 3D image with a depth effect from a viewing plane 17 which is disposed at the nominal distance Dn in front of the display.
- the pixel matrix 11 is controlled for an autostereoscopic viewing of the 3D image from a viewing distance D differing from the nominal distance Dn.
- the display in the present embodiment has a tracking device which is here given by a stereoscopic camera 18 directed to the viewing space in front of the display and by an evaluation unit 19 for carrying out an image evaluation process.
- a head position of at least one viewer is detected using this tracking device and the viewing distance D is measured as the distance between an eye pair of this viewer and the display.
- the control unit 13 now carries out the steps explained in more detail in the following - by a corresponding technical program device and in dependence on the image data 14 and on the viewing distance D determined by the tracking device - for each of the strips of pixels 15 to control the pixel matrix 11 for an autostereoscopic viewing of the 3D image from the viewing distance D in front of the display differing from the nominal distance Dn (a consolidated code sketch of these per-strip steps is given at the end of this section).
- a respective value of a location coordinate x is determined for each of the strips according to a rule which can be described as follows.
- a specific height - which can be selected largely as desired - an imaginary line 20 orientated in the row direction - that is horizontally - is defined at a spacing in front of the display which corresponds to the determined viewing distance D.
- the location coordinate is defined such that it describes a lateral position of locations on the line 20.
- the line 20 is a section of a straight line. It could, however, also extend in a slanted manner or be curved.
- let the named distance designate a distance between the display and a defined point, typically a central point disposed in front of the display, on the line 20.
- the value of the location coordinate is now determined by a simple mathematical operation for each of the strips for the location at which light emanating or transmitted from the pixels of this strip is incident with the propagation direction imposed by the optical element 12 onto the named line 20.
- This is illustrated by way of example in Fig. 1 for one of the strips by means of a dashed line, and indeed for a strip of pixels 15 which belongs to the subgroup 7 and which is disposed near the left hand margin of the display.
- the location coordinate x adopts nine discrete possible values for such locations which are disposed exactly in front of the viewing zones 16, seen from the center of the display.
- the location coordinate x is scaled in the present example such that these are the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9.
- the value of the location coordinate x is in each case determined on a scale which is finely graduated or is even at least quasi without graduations exactly so that it also adopts intermediate values between these discrete values and adopts a number of different values for the different strips which is considerably larger than the previously named plurality of nine.
- x = 3.5 applies rather precisely e.g. to the strip for which the determination of the value of the location coordinate x is illustrated in Fig. 1.
- as can also be recognized in Fig. 1, the location coordinate x is defined so that it adopts the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9 for the nine directly adjacent strips which extend centrally over the pixel matrix, whereas it also adopts intermediate values disposed therebetween for at least some of the further outwardly disposed strips.
- Each value of the location coordinate x therefore stands for a specific position on the line and thus for a specific viewing position with which in turn a specific direction of gaze or perspective on the scene defined by the 3D image can be associated.
- intensity values are now determined for each of the strips and are defined by the image data 14 for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value.
- the respective view is in this respect defined as the two-dimensional view of the 3D image or of the scene displayed by the 3D image which results from this viewing position or from the direction of gaze thereby predefined.
- the pixels 15 of the respective strip are controlled using the thus determined intensity values which in the present case naturally also depend on color information contained in the image data 14 and on the color of the individual pixels 15.
- the pixel matrix 11 could naturally also have multicolor pixels which are then each controlled using correspondingly determined intensity values and color values.
- the named views are defined as the nine stereoscopic half-images which were named above in connection with the conventional mode of operation of the display for a viewing from the nominal distance Dn.
- most views, of which only individual strips are needed, are defined for intermediate values of the location coordinate x which each correspond to a direction of gaze disposed between the directions of gaze of those nine stereoscopic half-images.
- rays are drawn by way of example in Fig. 3 for the x values 1, 5 and 9 which show the points on the pixel matrix 11 from which light must emanate to be incident through the optical element 12 onto the location on the line 20 defined by the respective x value.
- the pixels 15 of the subgroup 9 at the right hand end of the pixel matrix 11 are controlled using the intensity values which belong in the normal case - that is on a viewing from the nominal spacing Dn - to the stereoscopic half-image visible in the fifth viewing zone 16.
- the pixels 15 are controlled using intensity values of this view because none of the pixels 15 there is disposed where the corresponding image information would have to be imaged.
- what is meant by this are the views which would result from taking the displayed scene from camera positions which are disposed at corresponding locations between the camera positions of the fourth and fifth or of the fifth and sixth stereoscopic half-images of the nine stereoscopic half-images named further above.
- the line 20 is fixed in such a limited manner in its length or width - that is in the x direction - that the values of the location coordinate x can be clearly determined in the above-described manner.
- the parameterization of the line 20 by the location coordinate x can naturally also have a different scaling than in the case shown in Fig. 1. By stretching the distances between the positions which here correspond to the discrete x values 1, 2, 3, 4, 5, 6, 7, 8 and 9 - while keeping the association of the views with specific values of x unchanged - it can also be achieved that two views which are visible from two positions on the line 20 spaced apart from one another by an average distance between the eyes of about 65 mm correspond to two respective perspectives from camera positions correspondingly spaced apart - and not further apart, for instance, with a smaller D.
- the parallax between two views or more exactly between the views which are approached by the described control and which a viewer can see with his two eyes from the viewing distance D should therefore correspond as exactly as possible to the parallax which results on the viewing of the displayed scene by the average eye distance.
- the value of the location coordinate x is respectively rounded up or down to a next closest intermediate value from a limited number of discrete intermediate values. It would e.g. be possible to determine the location coordinate respectively only up to the first decimal point.
- Nine respective possible intermediate images are then disposed between the stereoscopic half-images which correspond to the x values 1, 2, 3, 4, 5, 6, 7, 8 and 9.
- the number of the views needed as a maximum - more precisely the number of views of which at least individual image strips can be needed - is then reduced to at most 90.
- the calculation effort can be advantageously restricted by this restriction to a discrete number of views - which is, however, larger than the original number of nine views.
- different possibilities exist for the programming of the control unit 13 to construct the needed views - or more precisely the required image strips thereof - and to determine the intensity values for the different views.
- the image data 14 can e.g. define a depth map and texture values for area segments of a surface reproduced by the depth map.
- the intensity values can then be determined for the different views in that image information for the required image strip or strips of the respective view are determined from the depth map and from the texture values for the area segments of the surface reproduced by the depth map.
- the image data define two or more stereoscopic half-images.
- the intensity values can then be determined for the different views in that disparities between the already defined stereoscopic half-images are determined and image information are determined for the required image strip or strips of the respective view in that this view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate x by interpolation and/or by transformation - by so-called "morphing".
- a depth map can also be calculated from the disparities which result from the already present half-images. Image information for the required image strip or strips of the respective view can then in turn be determined using this depth map.
- the tracking device is naturally also configured to determine a lateral position of the at least one eye pair using the detected head position of the at least one viewer.
- the control unit 13 can therefore additionally be configured to control the pixel matrix 11 in dependence on the lateral position determined at least by the tracking device so that a region from which the 3D image is autostereoscopically visible also includes the eye pair or the eye pairs of the tracked viewer or of the tracked viewers. If required, this can be done by a lateral displacement of the line 20 or of the viewing zones 16.
- the viewing distance D is determined by the tracking device and the pixel matrix 11 is controlled in dependence on the viewing distance D thus determined.
- the viewing distance D could naturally also be selected by a user - e.g. in dependence on dimensions of a room in which the display is installed - and can be predefined by an input at the control unit 13.
- the optical element 12 can be controllable and form lens elements with refractive properties variable in dependence on a control of the optical element 12.
- Liquid crystals can be used for realizing such structures known per se.
- independently of whether the viewing distance D is fixed arbitrarily by an input or in dependence on output signals of a tracking device, the control unit 13 can be configured in this case to control the optical element 12 in dependence on the viewing distance D so that the refractive properties of the lens elements are adapted to this viewing distance D.
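As announced further above, the per-strip steps carried out by the control unit 13 can be summarized in the following consolidated Python sketch. The Strip record and the four callables are hypothetical placeholders standing in for a concrete implementation, not the patent's reference code; the loop merely mirrors the three steps named in the text (determine x, determine the intensity values of the matching image strip, control the pixels), together with the rounding to permitted values and the rendering of only those image strips that are actually needed.

```python
# Consolidated, heavily simplified sketch of the per-strip control of the
# pixel matrix 11 for a viewing distance D differing from Dn.  All names are
# illustrative placeholders.
from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class Strip:
    index: int          # position of the strip in the row direction
    pixels: Sequence    # the pixels 15 belonging to this strip


def control_pixel_matrix(strips: Sequence[Strip],
                         image_data,
                         viewing_distance_mm: float,
                         determine_x: Callable[[Strip, float], float],
                         quantize_x: Callable[[float], float],
                         render_image_strip: Callable[[object, float, int], Sequence],
                         drive_pixels: Callable[[Strip, Sequence], None]) -> None:
    cache = {}                                  # (x value, strip index) -> intensity values
    for strip in strips:
        # Step 1: location coordinate for this strip, rounded to a permitted value.
        x = quantize_x(determine_x(strip, viewing_distance_mm))
        key = (x, strip.index)
        if key not in cache:
            # Step 2: only the image strip of the view that is actually needed
            # is determined from the image data.
            cache[key] = render_image_strip(image_data, x, strip.index)
        # Step 3: control the pixels of this strip with the determined values.
        drive_pixels(strip, cache[key])
```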
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention relates to an autostereoscopic display for simultaneously displaying more than two different images, comprising a pixel matrix (11) having a multitude of pixels (15) distributed over different subgroups; an optical element (12) which is arranged in front of or behind the pixel matrix (11), which has a grid-like structure and imposes a respective defined propagation direction on light emanating or transmitted from the pixels (15) so that a plurality of viewing zones (16) laterally offset relative to one another is defined so that each of the viewing zones (16) is associated with exactly one of the subsets and so that the light emanating or transmitted from each of the subgroups of pixels (15) is directed into the viewing zone (16) associated with this subset; and a control unit (13) for controlling the pixel matrix (11) in dependence on image data (14) defining a 3D image. In this respect, the control unit (13) is configured to carry out the respective following steps for controlling the pixel matrix (11) for an autostereoscopic viewing of the 3D image from a viewing distance (D) differing from the nominal spacing (Dn) in front of the display for each of a plurality of strips of pixels (15): - determining a value of a location coordinate (x); - determining intensity values which are defined by the image data (14) for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value; and - controlling the pixels (15) of this strip using the intensity values determined in this manner.
Description
Autostereoscopic display and
method of displaying a 3D image
The invention relates to an autostereoscopic display for simultaneously displaying more than two different images in accordance with the preamble of the main claim and to a method of displaying a 3D image in accordance with the preamble of the independent claim which can be carried out using such a display.
A generic display includes a pixel matrix having a multitude of pixels which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips belonging to the different subsets are interleaved such that strips and/or pixels of the different subsets alternate cyclically in the row direction. In addition, such a display comprises an optical element which is arranged in front of or behind the pixel matrix, which has a grid-like structure
orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels such that, at a nominal distance in front of the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones, which are laterally offset relative to one another and of which each is associated with exactly one of the subsets, are defined such that the light emanating or transmitted from each of the subsets of pixels is directed in the viewing zone associated with this subset.
Displays of this kind are known per se as so-called multiview displays. On a use of these displays known from the prior art as intended, a respective one of a number, corresponding to the named plurality, of stereoscopic half-images is displayed on the named subgroups of pixels and a respective two of said half-images which are displayed on subgroups having directly adjacent strips combine pairwise to form a stereoscopic image. In this manner, not only a single viewer, but also several viewers positioned next to one another in front of the display can each autostereoscopically perceive an image of the same scene which appears three-dimensional. In addition, a viewer can move in a lateral direction in front of the display without losing the three-dimensional impression. He will rather see the same scene from a perspective changing according to his movement.
It is, however, disadvantageous in this respect that the viewer or each of the viewers can only see a 3D image of satisfactory quality if his eyes maintain the nominal distance from the display predefined by the geometry of the display. Otherwise each eye of the viewer namely sees portions of different regions of the display and, in part, overlaps of different half-images.
It is the underlying object of the invention to develop an autostereoscopic display on which a respective image of three-dimensional effect of a displayed scene can be seen from distances which should be as freely selectable as possible, wherein it should be possible as in the described prior art that several viewers simultaneously look at the display and there each see an image of three-dimensional effect of the scene and that a viewer moves laterally without him losing the three-dimensional impression. It is
furthermore the object of the invention to propose a corresponding method
of displaying 3D images on an autostereoscopic display which satisfies these demands.
This object is satisfied in accordance with the invention by an
autostereoscopic display having the characterizing features of the main claim in conjunction with the features of the dependent claim as well as by a method having the features of the independent claim. Advantageous embodiments result from the features of the dependent claims.
The proposed display therefore has a control unit for controlling the pixel matrix in dependence on image data which define a 3D image. This is configured to carry out the respective following steps for each of the strips of pixels for controlling the pixel matrix for an autostereoscopic viewing of the 3D image from a viewing distance in front of the screen differing from the nominal distance:
- determining a value of a location coordinate which describes a lateral position of locations on a line orientated in the row direction and disposed at a defined height in the viewing distance in front of the display, wherein the value is determined for the location at which light emanating or transmitted from the pixels of this strip - more precisely from centers of area of the respective pixels - is incident on the named line with the propagation direction imposed by the optical element; - determining intensity values which are defined by the image data for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling pixels of this strip using the intensity values determined in this manner.
The determination of the respective value of the location coordinate in dependence on the viewing distance is possible in this respect by a simple arithmetic operation and only represents a use of projective geometry.
Various coordinate systems can be used as the basis for the definition of the location coordinate. In an expedient coordinate system having an x axis orientated in the row direction, a vertical y axis and a z axis orientated in the direction of a normal of the display plane, the named location coordinate can also be selected as the coordinate x. In this coordinate system, the viewing distance can be represented as z and a height by the coordinate y. Instead, however, any parameterization, preferably a constant parameterization, of the named line can be used as the location coordinate. The named line is typically a section of a horizontal straight line orientated parallel to the display. It is, however, also possible that the line is slanted or curved. In this case, let the named distance designate a distance between the display and a defined point, typically a central point, on the line. In addition, let the line be defined as limited in its length - that is only covering a space of such a limited width in front of the display - such that the value of the location coordinate determined in the described manner is unambiguous. That is, light rays should not be taken into consideration which are conducted through the optical element such that they are not led through one of the viewing zones typically lying centrally in front of the display, but rather through so-called secondary zones.
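The projective construction just described can be illustrated by a short, purely illustrative Python sketch. It intersects the ray from a pixel's area centre through the centre of its associated grid element with the line at the viewing distance D and rescales the result so that the centres of the viewing zones fall on the integer values 1 to 9. The assumption that D is measured from the optical element, the metric origin at the display centre, all parameter values and the function name are illustrative choices, not details prescribed by the patent.

```python
# Minimal sketch (not the patent's implementation) of determining the location
# coordinate x for one strip of pixels.  Lateral positions are in millimetres,
# measured from the display centre; D is assumed to be measured from the
# optical element.  All default values are illustrative assumptions.

def location_coordinate(p_pixel_mm, a_mm, d_mm, D_mm, zone_pitch_mm=65.0, plurality=9):
    """x value for the strip whose pixel centres lie at lateral position
    p_pixel_mm, with a_mm the gap between pixel matrix and optical element,
    d_mm the lateral period of the grid-like structure and D_mm the viewing
    distance of the line in front of the display."""
    # Centre of the grid element (lens or slot) associated with this pixel.
    p_grid_mm = round(p_pixel_mm / d_mm) * d_mm
    # Extend the ray pixel centre -> grid centre to the plane at distance D
    # (similar triangles, i.e. simple projective geometry).
    p_line_mm = p_grid_mm + (p_grid_mm - p_pixel_mm) * D_mm / a_mm
    # Rescale so that the viewing-zone centres map onto the values 1 .. plurality.
    return p_line_mm / zone_pitch_mm + (plurality + 1) / 2.0


# Example call with assumed geometry values (gap 5 mm, grid period 1.8 mm,
# viewing distance 2 m); the result is in general a non-integer value of x.
print(location_coordinate(p_pixel_mm=-150.0, a_mm=5.0, d_mm=1.8, D_mm=2000.0))
```

Under such a mapping the central strips yield values close to the integer zone numbers while the outer strips drift towards intermediate values when the viewing distance deviates from the nominal distance, which mirrors the behaviour described further below.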
Each value of the location coordinate corresponds to a viewing position on the named line which is here only called a position. The respective view is defined as the two-dimensional view of the 3D image or of the scene displayed by the 3D image which results from this viewing position or from the direction of gaze thereby predefined. An actual - or virtual - camera position can be associated with each value of the named location coordinate. The view which corresponds to a direction of gaze from the position defined by the respective value of the location coordinate means a view which results or would result by a taking of the named scene from the camera position associated with this value of the location coordinate.
The intensity values which are spoken of here can also be called brightness values or control values for the individual pixels. Therefore, they represent image information on the individual picture elements of the respective view to be displayed by the pixels. In the case of multicolor pixels, they can
additionally contain color information or, in the case of a pixel matrix having pixels of different elementary colors, can depend on a color of the respective pixel - then usually called a subpixel. At least some of the views - or more precisely the image strips of the views belonging to the respective strips of pixels, and thus at least parts of the views - have to be calculated in dependence on the named image data to determine the required intensity values. These image data admittedly define the 3D image, but do not contain a priori all the image information of all possible views. They are rather only defined indirectly by the image data and are calculated as required - that is depending on the values of the location coordinate determined for the different strips. In addition, various processes known per se can be considered of which some will be outlined further below.
It is achieved by the proposed measures that a viewer who looks at the correspondingly controlled display from the named viewing distance will also see an image of three-dimensional effect of good image quality when the viewing distance differs from the nominal distance. It is in particular achieved by the described control of the pixel matrix that the viewer sees two mutually complementary stereoscopic half-images which combine to a stereoscopic image at least in a very good approximation despite the viewing distance actually not matching the geometry of the display, with it simultaneously being avoided that conspicuous and disturbing irregularities or jumps occur on a lateral movement. The latter would not be able to be avoided if, instead of the proposed determination of the intensity values for the different strips of correspondingly defined views, ideally defined in each case, only the intensity values of a plurality of stereoscopic half-images defined in an unchanged manner were redistributed between the pixels in response to the changed viewing distance. The display can therefore be used for completely different viewing distances. An adaptation of the display itself to the viewing distance which under certain circumstances may be predefined by a specific use - e.g. by the positioning in a room of predefined size and division - is not necessary in this respect.
The term "stereoscopic half-images" in the present document should designate respective views of a scene of which two combine to one
stereoscopic image of this scene in that they correspond to views from -
actual or virtual - camera positions or eye positions which are laterally offset to one another by approximately an average distance between eyes. In the case of a band of more than two views having these properties, the individual views should therefore also be called stereoscopic half-images.
The display described can be a simple multiview display which is only equipped with a special control unit or an especially programmed control unit so that, in addition to the nominal distance, other viewing distances are also possible at least within certain limits which are freely choosable. The viewing zones in typical embodiments are expediently dimensioned so that their lateral distance in each case approximately corresponds to an average distance between the eyes - e.g. 65 mm. The named plurality can e.g. be 9 or even more. The pixel matrix can be provided e.g. by an LCD or by an OLED display. The optical element can in particular be a parallax barrier or a lenticular screen. A combination of these screen types is also possible. In the case of a lenticular screen, the grid-like structure is typically formed by a group of parallel cylindrical lenses. Barrier screens, in particular slot screens, can be used as the parallax barrier. Finally, the optical element can also be a Fresnel structure or an LC structure which reproduces a slot screen or another screen type. The pixels can be multicolor pixels or subpixels of different elementary colors - e.g. red, green and blue. In the last named case, typically three respective pixels or subpixels from three mutually following rows will combine to form one color-neutral or true-color picture element. The described configuration of the control unit of a corresponding display is particularly expedient if each of the strips from each of the rows of the pixel matrix contains at most one pixel, that is has a width of only one pixel. There is then namely no possibility of carrying out a lateral displacement of centers of brightness within the individual strips to adapt the control to the changed viewing distance.
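To make the interleaving of strips of width one subpixel concrete, the following small sketch assigns each subpixel to one of the nine subsets. The slant of exactly one subpixel column per row and the function name are assumptions chosen only for illustration; the patent does not fix a particular slant, only that the strips include a non-zero angle with the rows and alternate cyclically.

```python
# Illustrative sketch (not taken from the patent) of assigning subpixels to
# `plurality` disjoint subsets forming slanted strips of width one subpixel.
# The slant of one column per row is an assumption made for the example.

def subset_number(row, col, plurality=9):
    """Subset (1 .. plurality) of the subpixel at position (row, col)."""
    return (col + row) % plurality + 1

# Print a small patch of the resulting cyclic pattern, comparable to Fig. 2.
for row in range(6):
    print(" ".join(str(subset_number(row, col)) for col in range(12)))
```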
An advantageous method is also proposed for displaying a 3D image on an autostereoscopic display of the described type which achieves the object set. This method is a particular use of a display having a pixel matrix and an optical element arranged in front of or behind the pixel matrix, wherein the pixel matrix has a multitude of pixels which are arranged in different rows, wherein
a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips of the different subsets alternate cyclically in the row direction and wherein preferably each of the strips from each of the rows of the pixel matrix contains at most one pixel, and wherein the optical element has a grid-like structure orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels such that, at a nominal spacing from the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones which are laterally offset relative to one another are defined such that each of the viewing zones is associated with exactly one of the subsets and such that the light emanating or transmitted from each of the subsets of pixels is directed in the viewing zone associated with this subset.
The pixel matrix is controlled in the method in dependence on image data which define a 3D image for an autostereoscopic viewing of the 3D image from a viewing distance in front of the display which differs from the nominal distance. For this purpose, the method includes the respective following steps for each of the strips of pixels:
- determining a value of a location coordinate which describes a lateral position of locations on a line orientated in the row direction and disposed at a defined height in the viewing distance in front of the display, wherein the value is respectively determined for the location at which light emanating or transmitted from the pixels of this strip is incident on the named line with the propagation direction imposed by the optical element;
- determining intensity values which are defined by the image data for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling pixels of this strip using the intensity values determined in this manner.
What was said on the display above applies accordingly to these method steps. So that an image quality which is as good as possible also results in the viewing distance different from the nominal distance, the location coordinate can be determined - in a scale which is as finely graduated as possible or even has no graduations - in each case so exactly that it adopts a number of different values for the different strips which is larger than the named plurality. The control unit of the proposed display can accordingly be configured to determine the location coordinate on a scale which is as finely graduated as possible or even has no graduations so exactly that it adopts a number of different values for the different strips which is larger than the named plurality.
The location coordinate namely adopts a number, corresponding to the named plurality, of possible values of the location coordinate for those locations whose lateral positions correspond to the previously named viewing zones or which are disposed, viewed from the display, exactly in front of or behind these viewing zones. In the present case, the location coordinate should therefore additionally also adopt or be able to adopt intermediate values between these discrete values. To keep the calculation effort within limits, it can, however, be advantageous if in each case only a limited number of discrete intermediate values is permitted and if the location coordinate is rounded up or down to the respective next closest permitted value or intermediate value. It can thus be achieved that the number of the views required in total, or more exactly of the views of which at least individual image strips are required or may be required, remains manageable.
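The rounding to a limited set of permitted intermediate values can be sketched as follows. The step of 0.1 between the discrete values 1 to 9 corresponds to determining x "up to the first decimal point" as mentioned elsewhere in this document; the step size, the clamping range and the function name are illustrative assumptions.

```python
# Sketch of snapping the exactly determined location coordinate to the nearest
# permitted value so that only a manageable number of views (or rather of
# image strips of views) has to be rendered.

def quantize_location_coordinate(x, step=0.1, x_min=1.0, x_max=9.0):
    """Round x to the nearest permitted intermediate value within [x_min, x_max]."""
    x = min(max(x, x_min), x_max)
    return round(round(x / step) * step, 10)   # outer round() trims float noise

# Only the distinct quantised values have to be rendered at all:
needed = sorted({quantize_location_coordinate(x) for x in (3.47, 3.52, 5.0, 8.96)})
print(needed)   # [3.5, 5.0, 9.0]
```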
The named views are typically defined so that, for a number, corresponding to the named plurality, of values, they correspond to a number of stereoscopic half-images corresponding to this plurality of which in each case two, which correspond to values closest from one another from this number of values, combine to one stereoscopic image. They are the half-images which are displayed on the named subgroups on a conventional control of the display or when the viewing distance corresponds to the nominal distance. In the present case of a viewing distance differing from the nominal distance, at least one of the views, which corresponds to an intermediate value of the location coordinate, has to be selected corresponding to a direction of gaze
disposed between the directions of gaze of these two stereoscopic half-images.
The location coordinate is determined in an expedient embodiment of the method such that, for a number, corresponding to the named plurality, of directly adjacent strips which extend centrally over the pixel matrix, it adopts the number of values named in the previous paragraph, whereas it adopts intermediate values for at least some of the strips disposed further outwardly. The control unit can accordingly be configured to determine the location coordinate in this manner. The control of the pixel matrix at the image center then does not differ or only differs insignificantly from the control provided for the nominal distance
Any desired rendering processes known per se can be used to determine the required intensity values for the different views. It is advantageously sufficient in each case in this respect if, for each of the views, the intensity values are only determined for the image strip or for those image strips for which the value of the location coordinate corresponding to this view has been determined. The required computation power therefore remains within limits which also makes the method usable for image sequences which are not defined in advance, but are defined in real time - for example in computer games or in the presentation of live shots which are filmed with stereoscopic cameras. One possibility is that the intensity values are determined for the different views in that image information for the required image strip or strips of the respective view are determined from a depth map - or from several depth maps - defined by the named image data and from texture values defined by the image data for area segments of a surface represented by the depth map. Details on such a process for acquiring image information of a view not present in advance can be seen e.g. from document DE 10 2010 028 668 A1.
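Purely as an illustration of this first variant - the cited DE 10 2010 028 668 A1 describes a complete process - the following sketch samples the texture of a reference view with a depth-dependent horizontal shift to obtain one pixel column (image strip) of an intermediate view. The linear relation between depth and shift, the array conventions and all names are assumptions made for the example, not the patent's method.

```python
import numpy as np

# Very simplified, assumption-laden sketch of rendering one image strip of an
# intermediate view from a depth map and texture values.

def render_strip(texture, depth, x_view, x_center=5.0, shift_per_view=2.0, strip_col=8):
    """Return pixel column `strip_col` of the view belonging to location
    coordinate `x_view`.

    texture -- (rows, cols) array of intensity values of a reference view
    depth   -- (rows, cols) array of normalised depth values in [0, 1]
    """
    rows, cols = texture.shape
    strip = np.empty(rows, dtype=texture.dtype)
    for r in range(rows):
        # Horizontal parallax grows with depth and with the offset of the
        # virtual camera from the central view.
        shift = int(round((x_view - x_center) * shift_per_view * depth[r, strip_col]))
        src = min(max(strip_col + shift, 0), cols - 1)
        strip[r] = texture[r, src]
    return strip

# Example with random data, view x = 3.5, pixel column 8:
tex = np.random.rand(8, 16)
dep = np.random.rand(8, 16)
print(render_strip(tex, dep, x_view=3.5))
```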
In another embodiment, the intensity values for the different views can be determined in that disparities between at least two stereoscopic half images, which are defined by the image data, are determined and image information is determined for the required image strip or strips in that the view is defined
as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate by interpolation and/or by transformation. Instructions on how this can be done can be found e.g. in the document US 6,366,281 B1. Such processes are also called "morphing".
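As a hedged illustration of this "morphing" variant - US 6,366,281 B1 describes complete procedures - the following sketch interpolates a single pixel of an intermediate view from two half-images and a disparity map. The disparity convention (a feature at column c of the left image appears at column c minus the disparity in the right image) and the backward-warping shortcut are simplifying assumptions.

```python
import numpy as np

# Intensity at (row, col) of a view at fraction alpha between the left
# (alpha = 0) and right (alpha = 1) half-image.

def intermediate_pixel(left, right, disparity, row, col, alpha):
    cols = left.shape[1]
    d = disparity[row, col]
    col_l = min(max(int(round(col + alpha * d)), 0), cols - 1)           # sample left
    col_r = min(max(int(round(col - (1.0 - alpha) * d)), 0), cols - 1)   # sample right
    return (1.0 - alpha) * left[row, col_l] + alpha * right[row, col_r]

# Toy example: the right image is the left image shifted by one column,
# i.e. a constant disparity of 1; alpha = 0.5 corresponds e.g. to a location
# coordinate x = 3.5 between the half-images 3 and 4.
left = np.random.rand(4, 8)
right = np.roll(left, -1, axis=1)
disp = np.full((4, 8), 1.0)
print(intermediate_pixel(left, right, disp, row=2, col=3, alpha=0.5))
```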
A further possibility comprises the fact that the intensity values for the different views are determined in that disparities between at least two stereoscopic half-images, which are defined by the image data, are
determined, from which disparities a depth map is calculated and, using this depth map, image information is determined for the required image strip or strips of the respective view.
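For rectified stereoscopic half-images the depth map mentioned here can be obtained from the disparities with the usual triangulation relation depth = focal length x baseline / disparity. The camera parameters in the sketch below are assumed values not given in the patent.

```python
# Sketch of deriving depth from disparity under the standard rectified-stereo
# relation; focal length and baseline are illustrative assumptions.

def depth_from_disparity(disparity_px, focal_length_px=1200.0, baseline_mm=65.0):
    """Depth in millimetres for a disparity given in pixels (0 -> infinity)."""
    if disparity_px <= 0:
        return float("inf")
    return focal_length_px * baseline_mm / disparity_px

print(depth_from_disparity(30.0))   # 2600.0 mm for the assumed parameters
```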
Accordingly, the control unit of the proposed display can be configured, for determining the intensity values for the different views,
- to determine image information for the required image strip or strips of the respective view from a depth map defined by the named image data and from texture values defined by the image data for surface segments of a surface represented by the depth map; or
- to determine disparities between at least two stereoscopic half-images which are defined by the image data and to determine image data for the image strip or strips of the respective view in that the view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate by interpolation and/or by transformation; or
- to determine disparities between at least two stereoscopic half-images which are defined by the image data, to calculate a depth map from the disparities and to determine image information for the required image strip or strips of the respective view using this depth map.
So that the control can be adapted automatically to a current distance of a viewer from the display, the display can include a tracking device for determining a distance between the eyes of at least one viewer and the display, and the control unit can be configured to control the pixel matrix for the viewing distance corresponding to this distance. The control unit is then configured to control the pixel matrix - if the measured distance differs from the nominal distance - in the previously described manner, so that the named viewing distance used as the basis for the control corresponds to the distance determined by the tracking device. In the correspondingly advantageously designed method, a distance between an eye pair of at least one viewer and the display is therefore detected, and the viewing distance is selected to correspond to the distance thus detected when determining the values of the location coordinate for the different strips. For this purpose, images of the space in front of the display taken by a stereoscopic camera can e.g. be evaluated using a suitable image evaluation process.
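By way of example only, the distance could be obtained from the stereoscopic camera images with the classical triangulation relation; the focal length and camera baseline used below are assumed values, not parameters taken from this document.

```python
def viewer_distance_mm(disparity_px, focal_length_px=1400.0, baseline_mm=60.0):
    """Estimate the distance of a detected eye pair from a stereo camera pair.

    disparity_px    : horizontal offset (in pixels) of the eye pair between
                      the two camera images.
    focal_length_px : camera focal length in pixels (assumed value).
    baseline_mm     : spacing of the two camera centers (assumed value).

    Uses the standard pinhole relation Z = f * B / d.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# A disparity of 35 px corresponds to a viewing distance of 2.4 m here.
print(viewer_distance_mm(35.0))  # 2400.0
```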
An embodiment of the invention will be explained in the following with reference to Figures 1 to 3. There is shown
Fig. 1 in a schematic representation, a plan view of an autostereoscopic display and a viewing space in front of this display;
Fig. 2 a detail of a pixel matrix of the display of Fig. 1 in a front view; and
Fig. 3 in a representation corresponding to Fig. 1, the same display, with here some components of the display having been omitted and only some beam paths being drawn by way of example to explain the proposed control of the display.
An autostereoscopic display is shown in Fig. 1 which is in particular suitable as a multiview display for displaying a plurality of different images, nine in the present case, simultaneously. This display has a pixel matrix 11 and an optical element 12 arranged in front of the pixel matrix 11. In addition, the display includes a control unit 13 for controlling the pixel matrix 11 in dependence on image data 14 which define a 3D image. Typically, this 3D image will vary over time so that it is more precisely an image sequence. The image data 14 can in this respect be stored e.g. on a data carrier and read from there, or can be defined by a computer game in dependence on its course.
The pixel matrix 11 is an LCD or an OLED display having a multitude of pixels 15 which are arranged in different rows. A detail of this pixel matrix 11 is shown in Fig. 2. The individual pixels 15 are each shown by rectangles there. In the present case, the pixels 15 are subpixels of the elementary colors red, green and blue - marked in Fig. 2 by the letters R, G and B respectively.
A plurality of disjoint subsets of pixels 15, nine in the present case - the plurality could naturally also be larger or smaller - are defined on the pixel matrix 11 such that each of these subsets forms a group of parallel strips. The subsets are numbered continuously from 1 to 9 and in Fig. 2 the pixels 15 are each provided with the number of the subset to which the pixel 15 belongs. As can be recognized in Fig. 2, the named strips include a non-zero angle with the rows, with the strips of the different subsets alternating cyclically in the row direction and with each of the strips not containing more than one pixel 15 in each of the rows.
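Purely for illustration, one interleaving that satisfies these constraints can be generated as follows; the slant of one subpixel column per row is an assumption made for this example and need not be the exact arrangement shown in Fig. 2.

```python
def subset_number(row, col, n_subsets=9, slant=1):
    """Subset (1..n_subsets) of the subpixel at (row, col).

    With slant = 1 the strips advance by one subpixel column per row, so they
    include a non-zero angle with the rows, the subsets alternate cyclically
    in the row direction, and each strip contains at most one pixel per row.
    """
    return (col - slant * row) % n_subsets + 1

# Print a small patch of the resulting numbering (compare with Fig. 2).
for r in range(4):
    print([subset_number(r, c) for c in range(12)])
```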
The optical element 12 can e.g. be designed as a slot screen or as a lenticular screen and has a grid-like structure which is oriented parallel to the strips and which is indicated by dashed lines in Fig. 2. In the present case, d = 9·b·Dn/(Dn + a) applies to a period d of this structure in the lateral direction - corresponding to the row direction - where b designates a lateral distance between the area centers of adjacent pixels 15, a designates a distance between the pixel matrix 11 and the optical element 12 and Dn stands for a so-called nominal distance. The optical element 12 thereby defines a respective defined propagation direction for light emanating or transmitted from the pixels 15. This is done such that, at the nominal distance Dn in front of the display, a number, corresponding to the previously named plurality, of nine viewing zones 16 offset laterally relative to one another is defined so that each of the viewing zones 16 is associated with exactly one of the subsets, and such that light emanating or transmitted from each of the subgroups of pixels 15 is directed into the viewing zone 16 associated with this subset. This is illustrated in Fig. 1 by a respective dashed line for two outermost pixels 15 of the subgroup 2. Modifications in which the optical
element 12 is arranged behind the pixel matrix 11 are just as possible. The viewing zones 16 are shown with their diamond-shaped cross-section in Figure 1 and are numbered continuously from 1 to 9 in accordance with the subgroups. The mutually adjacent viewing zones 16 are each mutually offset laterally by about 65 mm.
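A short numerical check of the period relation d = 9·b·Dn/(Dn + a) stated above, with purely illustrative values for the subpixel pitch b, the gap a and the nominal distance Dn, shows that d comes out slightly smaller than nine subpixel pitches:

```python
def grid_period_mm(b_mm, a_mm, dn_mm, n_views=9):
    """Lateral period d of the grid-like structure: d = n * b * Dn / (Dn + a)."""
    return n_views * b_mm * dn_mm / (dn_mm + a_mm)

# Illustrative values: subpixel pitch b = 0.1 mm, gap a = 1.5 mm, Dn = 3 m.
print(grid_period_mm(0.1, 1.5, 3000.0))  # ~0.8996 mm, slightly below 9 * b
```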
In a conventional mode of operation of the display, a respective one of nine stereoscopic half-images is displayed on each of the subgroups of pixels 15 so that one of these stereoscopic half-images is visible from each of the viewing zones 16. The stereoscopic half-images are then selected so that the two stereoscopic half-images visible from directly adjacent viewing zones 16 each combine to form one stereoscopic image corresponding to a view of the 3D image thus displayed. One or more viewers can then each see the thus displayed 3D image with a depth effect from a viewing plane 17 which is disposed at the nominal distance Dn in front of the display.
Another mode of operation of the display will now be described here in which the pixel matrix 11 is controlled for an autostereoscopic viewing of the 3D image from a viewing distance D differing from the nominal distance Dn.
To measure the viewing distance D, the display in the present embodiment has a tracking device which is here given by a stereoscopic camera 18 directed to the viewing space in front of the display and by an evaluation unit 19 for carrying out an image evaluation process. A head position of at least one viewer is detected using this tracking device and the viewing distance D is measured as the distance between an eye pair of this viewer and the display.
By a corresponding programming, the control unit 13 now carries out the steps explained in more detail in the following for each of the strips of pixels 15, in dependence on the image data 14 and on the viewing distance D determined by the tracking device, in order to control the pixel matrix 11 for an autostereoscopic viewing of the 3D image from the viewing distance D in front of the display differing from the nominal distance Dn.
First, a respective value of a location coordinate x is determined for each of the strips according to a rule which can be described as follows. At a specific height - which can be selected largely as desired - an imaginary line 20 orientated in the row direction - that is horizontally - is defined at a spacing in front of the display which corresponds to the determined viewing distance D. The location coordinate x is defined such that it describes a lateral position of locations on the line 20. In the present case, the line 20 is a section of a straight line. It could, however, also extend in a slanted manner or be curved. In that case, the named distance designates a distance between the display and a defined point on the line 20, typically a central point disposed in front of the display. The value of the location coordinate is now determined by a simple mathematical operation for each of the strips, namely for the location at which light emanating or transmitted from the pixels of this strip, with the propagation direction imposed by the optical element 12, is incident on the named line 20. This is illustrated by way of example in Fig. 1 for one of the strips by means of a dashed line, namely for a strip of pixels 15 which belongs to the subgroup 7 and which is disposed near the left-hand margin of the display.
The location coordinate x adopts nine discrete possible values for locations which are disposed, seen from the center of the display, exactly in front of the viewing zones 16. The location coordinate x is scaled in the present example such that these are the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9. The value of the location coordinate x is in each case determined so exactly, on a finely graduated or even quasi-continuous scale, that it also adopts intermediate values between these discrete values and adopts a number of different values for the different strips which is considerably larger than the previously named plurality of nine. Thus, x = 3.5 applies rather precisely e.g. to the strip for which the determination of the value of the location coordinate x is illustrated in Fig. 1. As can also be recognized in Fig. 1, the location coordinate x is defined so that it adopts the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9 for the nine directly adjacent strips which extend centrally over the pixel matrix, whereas it also adopts intermediate values disposed therebetween for at least some of the further outwardly disposed strips.
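The determination of x can be sketched with elementary ray geometry. The model below assumes that the light of a strip of subset k is aimed at the center of viewing zone k in the nominal plane, that the small gap a between pixel matrix and optical element is neglected, and that the scale of x is chosen so that the nine central strips keep the integer values 1 to 9; it is a simplified reading of the "simple mathematical operation" mentioned above, with assumed numerical defaults, not the exact formula of this document.

```python
def location_coordinate(p_mm, k, d_mm, dn_mm=3000.0, zone_spacing_mm=65.0):
    """Location coordinate x for one strip of pixels.

    p_mm            : lateral position of the strip on the pixel matrix,
                      measured from the display center (mm).
    k               : subset number (1..9) of the strip.
    d_mm            : actual viewing distance D.
    dn_mm           : nominal distance Dn (assumed default).
    zone_spacing_mm : lateral offset of adjacent viewing zones (about 65 mm).

    The ray from the strip towards the center of viewing zone k is followed
    to the line 20 at distance D; the intersection is then expressed on a
    scale on which the nine central strips map to x = 1 .. 9.
    """
    v_k = (k - 5) * zone_spacing_mm            # zone center in the nominal plane
    hit = p_mm + (v_k - p_mm) * d_mm / dn_mm   # intersection with line 20 at D
    scale = zone_spacing_mm * d_mm / dn_mm     # keeps the central strips integer
    return 5.0 + hit / scale

# A central strip of subset 7 keeps its integer value ...
print(round(location_coordinate(0.0, 7, d_mm=2000.0), 2))     # 7.0
# ... while a strip of subset 7 near the left edge yields an intermediate value.
print(round(location_coordinate(-300.0, 7, d_mm=2000.0), 2))  # 4.69
```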
Each value of the location coordinate x therefore stands for a specific position on the line and thus for a specific viewing position with which, in turn, a specific direction of gaze or perspective on the scene defined by the 3D image can be associated. In a further step, intensity values are now determined for each of the strips, namely the intensity values defined by the image data 14 for an image strip, corresponding to this strip, of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value. The respective view is in this respect defined as the two-dimensional view of the 3D image, or of the scene displayed by the 3D image, which results from this viewing position or from the direction of gaze thereby predefined.
Finally, the pixels 15 of the respective strip are controlled using the thus determined intensity values, which in the present case naturally also depend on color information contained in the image data 14 and on the color of the individual pixels 15. In a modification, the pixel matrix 11 could naturally also have multicolor pixels which are then each controlled using correspondingly determined intensity values and color values.
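Taken together, the control of the pixel matrix for a viewing distance D differing from Dn can be summarized, per strip, by the following sketch; the strip descriptors and the three callables are assumed interfaces (the second callable could, for instance, be built on one of the rendering sketches above).

```python
def control_pixel_matrix(strips, d_mm, compute_x, render_view_strip, write_strip):
    """Per-strip control of the pixel matrix for a viewing distance D != Dn.

    strips            : iterable of strip descriptors carrying whatever data
                        compute_x needs (e.g. lateral position, subset number).
    d_mm              : measured or selected viewing distance D.
    compute_x         : callable(strip, d_mm) -> location coordinate x.
    render_view_strip : callable(x, strip) -> intensity values of the image
                        strip of the view belonging to position x.
    write_strip       : callable(strip, intensities) that loads the values
                        into the display driver (assumed interface).
    """
    for strip in strips:
        # 1. Where does the light of this strip meet the line 20 at distance D?
        x = compute_x(strip, d_mm)
        # 2. Intensity values of the matching image strip of the view that
        #    belongs to this (possibly intermediate) position.
        intensities = render_view_strip(x, strip)
        # 3. Drive the pixels of the strip with these values.
        write_strip(strip, intensities)
```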
For the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9, the named views are defined as the nine stereoscopic half-images which were named above in connection with the conventional mode of operation of the display for a viewing from the nominal distance Dn. In the operating mode focused on here, however, most views, of which only individual strips are needed, are defined for intermediate values of the location coordinate x and each correspond to a direction of gaze disposed between the directions of gaze of those nine stereoscopic half-images. To illustrate this, rays are drawn by way of example in Fig. 1 for the x values 1, 5 and 9 which show the points on the pixel matrix 11 from which light would have to emanate in order to be incident through the optical element 12 onto the location on the line 20 defined by the respective x value. As can easily be recognized in Fig. 1, these points, however, only coincide with the center of one of the pixels 15 from which light actually emanates in exceptional cases. The light emanating from the actually present pixels 15 - this always means the area centers of the pixels 15 - is, in contrast, in most cases incident on the line 20 at locations which correspond to intermediate values of the x coordinate. By the control of these pixels 15 proposed here, using image information which corresponds to correspondingly selected intermediate views, a representation of the best possible quality is realized for a viewer positioned at the viewing distance D in front of the display.
Fig. 3, in which recurring features are again provided with the same reference numerals, illustrates these relationships again. In a manner of representation otherwise corresponding to Fig. 1, respective light bundles are drawn by way of example here for three different regions of the pixel matrix 11, said light bundles emanating there from three to four respective adjacent subpixels 15 and illuminating a region on the line 20 about the location which is defined by the coordinate value x = 5. So that a viewer's eye can see a view from this location with as few disturbances as possible, said view corresponding to a direction of gaze from a camera position defined by the value x = 5, the pixels 15 of the different subgroups have to be controlled as follows due to the geometrical relationships which are easily recognizable here. At the left-hand end of the pixel matrix 11 the pixels 15 of the subgroup 9, and at the right-hand end of the pixel matrix 11 the pixels 15 of the subgroup 1, are controlled using the intensity values which belong in the normal case - that is on a viewing from the nominal distance Dn - to the stereoscopic half-image visible in the fifth viewing zone 16. In the region of the pixel matrix identified by the reference numeral 21 in Fig. 3, in contrast, none of the pixels 15 is controlled using intensity values of this view because none of the pixels 15 there is disposed where the corresponding image information would have to be imaged. Instead, the pixels 15 of the subgroup 3 are controlled there using image information of a calculated view which belongs to a value x = 4.6, whereas the pixels 15 of the subgroup 4 in the region 21 are controlled using image information of another calculated view which belongs to a value x = 5.6. What is meant by this are the views which would result on taking the displayed scene from camera positions which are disposed at corresponding locations between the camera positions of the fourth and fifth or of the fifth and sixth of the nine stereoscopic half-images named further above.

The line 20 is limited in its length or width - that is in the x direction - such that the values of the location coordinate x can be determined unambiguously in the above-described manner. The parameterization of the line 20 by the location coordinate x can naturally also have a different scaling than in the case shown in Fig. 1. By stretching the distances between the positions which here correspond to the discrete x values 1, 2, 3, 4, 5, 6, 7, 8 and 9, while keeping the association of the views with specific values of x unchanged, it can also be achieved that two views which are visible from two positions on the line 20 spaced apart from one another by an average eye distance of about 65 mm also correspond to two perspectives from camera positions correspondingly spaced apart from one another - and not further apart, for instance, at a smaller D. The parallax between two views - more exactly between the views which are approximated by the described control and which a viewer can see with his two eyes from the viewing distance D - should therefore correspond as exactly as possible to the parallax which results on a viewing of the displayed scene with the average eye distance.
Provision can be made that the value of the location coordinate x is respectively rounded up or down to the next closest intermediate value from a limited number of discrete intermediate values. It would e.g. be possible to determine the location coordinate respectively only to one decimal place. Nine possible intermediate images are then disposed between each two of the stereoscopic half-images which correspond to the x values 1, 2, 3, 4, 5, 6, 7, 8 and 9. The number of views needed as a maximum - more precisely the number of views of which at least individual image strips can be needed - is then reduced to at most 90. The calculation effort can advantageously be limited by this restriction to a discrete number of views - which is, however, larger than the original number of nine views.
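Rounding the location coordinate to such a limited set of intermediate values and reusing the corresponding views could look roughly like the following sketch; the view_for placeholder merely stands in for whichever rendering process is actually used.

```python
from functools import lru_cache

def quantize_x(x, step=0.1):
    """Round the location coordinate to the nearest permitted intermediate
    value (here one decimal place, i.e. nine intermediates per interval)."""
    return round(round(x / step) * step, 1)

@lru_cache(maxsize=128)
def view_for(x_quantized):
    # Placeholder for the actual rendering of the view belonging to this
    # quantized position; the cache ensures each view is built only once.
    return f"view at x = {x_quantized}"

print(view_for(quantize_x(3.4721)))  # view at x = 3.5
print(view_for(quantize_x(3.5299)))  # reuses the cached view at x = 3.5
```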
Different processes known per se can be used, by a corresponding programming of the control unit 13, to construct the needed views - or, more precisely, the required image strips thereof - and to determine the intensity values for the different views.
In particular the following cases are possible:
The image data 14 can e.g. define a depth map and texture values for area segments of a surface reproduced by the depth map. The intensity values can then be determined for the different views in that image information for the required image strip or strips of the respective view are determined from the depth map and from the texture values for the area segments of the surface reproduced by the depth map.
In other cases, the image data define two or more stereoscopic half-images. The intensity values can then be determined for the different views in that disparities between the already defined stereoscopic half-images are determined and image information for the required image strip or strips of the respective view is determined in that this view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate x by interpolation and/or by transformation - by so-called "morphing". Alternatively, a depth map can also be calculated from the disparities which result from the already present half-images. Image information for the required image strip or strips of the respective view can then in turn be determined using this depth map.

It must additionally be pointed out that the tracking device is naturally also configured to determine a lateral position of the at least one eye pair using the detected head position. The control unit 13 can therefore additionally be configured to control the pixel matrix 11 in dependence on the at least one lateral position determined by the tracking device so that a region from which the 3D image is autostereoscopically visible also includes the eye pair or the eye pairs of the tracked viewer or viewers. If required, this can be done by a lateral displacement of the line 20 or of the viewing zones 16.

In the present embodiment, the viewing distance D is determined by the tracking device and the pixel matrix 11 is controlled in dependence on the viewing distance D thus determined. Instead, the viewing distance D could naturally also be selected by a user - e.g. in dependence on the dimensions of a room in which the display is installed - and be predefined by an input at the control unit 13.
In a special embodiment of the display, the optical element 12 can be controllable and form lens elements with refractive properties variable in dependence on a control of the optical element 12. Liquid crystals can be used for realizing such structures known per se. Independently of whether the viewing distance D is fixed arbitrarily by an input or in dependence on output
signals of a tracking device, the control unit 13 can be configured in this case to control the optical element 12 in dependence on the viewing distance D so that the refractive properties of the lens elements are adapted to this viewing distance D.
Claims
1. An autostereoscopic display for simultaneously displaying more than two different images, comprising a pixel matrix (11) having a multitude of pixels (15) which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels (15) are defined on the pixel matrix (11) so that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips of the different subsets alternate cyclically in the row direction; an optical element (12) which is arranged in front of or behind the pixel matrix (11), which has a grid-like structure orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels (15) such that, at a nominal spacing (Dn) from the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones (16), which are laterally offset relative to one another, is defined so that each of the viewing zones (16) is associated with exactly one of the subsets and such that the light emanating or transmitted from each of the subsets of pixels (15) is directed in the viewing zone (16) associated with this subset; and a control unit (13) for controlling the pixel matrix (11) in dependence on image data (14) which define a 3D image, characterized in that the control unit (13) is configured to carry out the respective following steps for controlling the pixel matrix (11) for an autostereoscopic viewing of the 3D image from a viewing distance (D) differing from the nominal spacing (Dn) in front of the display for each of the strips of pixels (15):
- determining a value of a location coordinate (x) which describes a lateral position of locations on a line (20) orientated in the row direction and disposed at a defined height at the viewing distance (D) in front of the display, wherein the value is respectively determined for the location at which light emanating or transmitted from the pixels (15) of this strip is incident on the named line (20) with the propagation direction imposed by the optical element (12);
- determining intensity values which are defined by the image data (14) for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling the pixels (15) of this strip using the intensity values determined in this manner.
2. The display of claim 1, characterized in that the control unit (13) is configured to determine the location coordinate (x) so exactly that it adopts a number of different values for the different strips which is larger than the named plurality.
3. The display of any of the claims 1 or 2, characterized in that the views for a number, corresponding to the named plurality, of values are a number, corresponding to this plurality, of stereoscopic half-images of which a respective two, which correspond to mutually next closest values of this number of values, combine to form a stereoscopic image, whereas at least one of the views which corresponds to an intermediate value of the location coordinate (x) corresponds to a direction of gaze disposed between the directions of gaze of these stereoscopic half-images.
4. The display of claim 3, characterized in that the control unit (13) is configured to determine the location coordinate (x) such that it adopts the named number of values for a number, corresponding to the named plurality, of directly adjacent strips which extend centrally over the pixel matrix (11), whereas it adopts intermediate values for at least some of the further outwardly disposed strips.
5. The display of any of the claims 1 to 4, characterized in that the control unit (13) is configured to determine the intensity values for the different views in that
- image information for the required image strip or strips of the respective view are determined from at least one depth map defined by the named image data (14) and from texture values defined by the image data (14) for surface segments of a surface represented by the depth map; or
- disparities are determined between at least two stereoscopic half- images which are defined by the image data (14) and image data for the required image strip or strips of the respective view are
determined in that the view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate (x) by interpolation and/or by transformation; or
- disparities are determined between at least two stereoscopic half- images which are defined by the image data (14), a depth map is calculated from the disparities and image information for the required image strip or strips of the respective view is determined using this depth map.
6. The display of any of the claims 1 to 5, characterized in that each of the strips contains at most one pixel (15) from each of the rows of the pixel matrix (11).
7. The display of any of the claims 1 to 6, characterized in that it includes a tracking device for determining a distance between an eye pair of at least one viewer and the display, wherein the control unit (13) is configured to control the pixel matrix (11) so that the named viewing distance (D) corresponds to the distance determined by the tracking device.
8. The display of claim 7, characterized in that the tracking device is configured also to determine a lateral location of the at least one eye pair, wherein the control unit (13) is configured to control the pixel matrix (11) in dependence on the at least one lateral location determined by the tracking device so that the at least one eye pair is located in a region from which the 3D image is autostereoscopically visible.
9. The display of any of the claims 1 to 8, characterized in that the optical element (12) is controllable and forms lens elements having refractive properties variable in dependence on a control of the optical element (12), wherein the control unit (13) is configured to control the optical element (12) to adapt the refractive properties of the lens elements to the viewing distance (D).
10. A method for displaying a 3D image on an autostereoscopic display having a pixel matrix (11) and an optical element (12) arranged in front of or behind the pixel matrix (11); wherein the pixel matrix (11) has a multitude of pixels (15) which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels (15) are defined on the pixel matrix (11) so that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips of the different subsets alternate cyclically in the row direction; and wherein the optical element (12) has a grid-like structure orientated parallel to the strips and imposes a respective defined propagation direction on light emanating or transmitted from the pixels (15) such that, at a nominal spacing (Dn) in front of the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones (16), which are laterally offset relative to one another, is defined so that each of the viewing zones (16) is associated with exactly one of the subsets and such that the light emanating or transmitted from each of the subsets of pixels (15) is directed in the viewing zone (16) associated with this subset; wherein the pixel matrix (11) is controlled in dependence on image data (14) which define a 3D image, characterized in that the pixel matrix (11) is controlled for an autostereoscopic viewing of the 3D image from a viewing distance (D) in front of the display differing from the nominal distance (Dn), wherein the method includes the following steps for each of the strips of pixels (15):
- determining a value of a location coordinate (x) which describes a lateral position of locations on a line (20) orientated in the row direction and disposed at a defined height in the viewing distance (D) in front of the display, wherein the value is respectively determined for the location at which light emanating or transmitted from the pixels (15) of this strip is incident on the named line (20) with the
propagation direction imposed by the optical element (12);
- determining intensity values which are defined by the image data (14) for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling the pixels (15) of this strip using the thus determined intensity values.
11. The method of claim 10, characterized in that the location coordinate (x) is respectively determined so exactly that it adopts a number of different values for the different strips which is larger than the named plurality.
12. The method of any of the claims 10 or 11, characterized in that the views for a number, corresponding to the named plurality, of values are a number, corresponding to this plurality, of stereoscopic half- images of which a respective two, which correspond to mutually next closest values of this number of values, combine to form a
stereoscopic image, whereas at least one of the views which corresponds to an intermediate value of the location coordinate (x) corresponds to a direction of gaze disposed between the directions of gaze of these stereoscopic half-images.
13. The method of claim 12, characterized in that the location coordinate (x) is determined such that it adopts the named number of values for a number, corresponding to the named plurality, of directly adjacent strips which extend centrally over the pixel matrix (11), whereas it adopts intermediate values for at least some of the further outwardly disposed strips.
14. The method of any of the claims 10 to 13, characterized in that the intensity values for the different views are determined in that
- the intensity values for the required image strip or strips of the respective view are determined from at least one depth map defined by the named image data (14) and from texture values defined by the image data (14) for area segments of a surface represented by the depth map; or
- disparities are determined between at least two stereoscopic half- images which are defined by the image data (14) and intensity values for the required image strip or strips of the respective view are determined in that the view is defined as an intermediate image between the named stereoscopic half-images in dependence on the disparities and on the respective value of the named location coordinate (x) by interpolation and/or by transformation; or
- disparities are determined between at least two stereoscopic half- images which are defined by the image data (14), a depth map is calculated from the disparities and the intensity values for the required image strip or strips of the respective view are determined using this depth map.
15. The method of any of the claims 10 to 14, characterized in that a distance between an eye pair of at least one viewer and the display is detected, wherein the viewing distance (D) is selected to correspond to the thus detected distance for determining the values of the location coordinate (x) for the different strips.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020147021110A KR101953112B1 (en) | 2012-01-26 | 2013-01-25 | Autostereoscopic display and method of displaying a 3d image |
CN201380017170.4A CN104221372B (en) | 2012-01-26 | 2013-01-25 | Automatic stereoscopic display device and the method for display 3D rendering |
PCT/EP2013/051487 WO2013110779A1 (en) | 2012-01-26 | 2013-01-25 | Autostereoscopic display and method of displaying a 3d image |
JP2014553737A JP6012764B2 (en) | 2012-01-26 | 2013-01-25 | Autostereoscopic display and 3D image display method |
US14/374,523 US9628784B2 (en) | 2012-01-26 | 2013-01-25 | Autostereoscopic display and method of displaying a 3D image |
EP13701764.6A EP2807828A1 (en) | 2012-01-26 | 2013-01-25 | Autostereoscopic display and method of displaying a 3d image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012001902 | 2012-01-26 | ||
DE102012001902.5 | 2012-01-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013110297A1 true WO2013110297A1 (en) | 2013-08-01 |
Family
ID=46026768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2012/001886 WO2013110297A1 (en) | 2012-01-26 | 2012-04-25 | Autostereoscopic display and method of displaying a 3d image |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013110297A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9628784B2 (en) | 2012-01-26 | 2017-04-18 | Fraunhofer-Gesellschaft zur Foerderung der angewandt Forschung e.V. | Autostereoscopic display and method of displaying a 3D image |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0637815A2 (en) * | 1993-08-04 | 1995-02-08 | Canon Kabushiki Kaisha | Image processing method and image processing apparatus |
JPH07296195A (en) * | 1994-04-22 | 1995-11-10 | Canon Inc | Device and method for image processing |
EP0769881A2 (en) * | 1995-10-18 | 1997-04-23 | Sharp Kabushiki Kaisha | Method of calibrating a display which tracks an observer |
US6366281B1 (en) | 1996-12-06 | 2002-04-02 | Stereographics Corporation | Synthetic panoramagram |
WO2007043988A1 (en) * | 2005-09-16 | 2007-04-19 | Stereographics Corporation | Method and apparatus for optimizing the viewing of a lenticular stereogram |
DE102010028668A1 (en) | 2009-05-06 | 2010-11-18 | Visumotion Gmbh | Method for spatial representation |
Non-Patent Citations (2)
Title |
---|
KONRAD J ET AL: "3-D Displays and Signal Processing", IEEE SIGNAL PROCESSING MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 24, no. 6, 1 November 2007 (2007-11-01), pages 97 - 111, XP011197678, ISSN: 1053-5888, DOI: 10.1109/MSP.2007.905706 * |
WOODGATE G J ET AL: "Autostereoscopic 3D display systems with observer tracking", SIGNAL PROCESSING. IMAGE COMMUNICATION, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 14, no. 1-2, 6 November 1998 (1998-11-06), pages 131 - 145, XP027357224, ISSN: 0923-5965, [retrieved on 19981106] * |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12718605; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12718605; Country of ref document: EP; Kind code of ref document: A1