EP2540089A2 - Method for visualizing three-dimensional images on a 3D display device, and 3D display device - Google Patents

Method for visualizing three-dimensional images on a 3D display device, and 3D display device

Info

Publication number
EP2540089A2
EP2540089A2 (application EP11720977A)
Authority
EP
European Patent Office
Prior art keywords
display device
image
subpixels
display
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11720977A
Other languages
German (de)
English (en)
French (fr)
Inventor
Ivo-Henning Naske
Sigrid Kamins-Naske
Valerie Antonia Naske
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Expert Treuhand GmbH
Original Assignee
Expert Treuhand GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Expert Treuhand GmbH filed Critical Expert Treuhand GmbH
Publication of EP2540089A2
Legal status: Withdrawn

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/349Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/04Structural and physical details of display devices
    • G09G2300/0439Pixel structures
    • G09G2300/0452Details of colour pixel setup, e.g. pixel composed of a red, a blue and two green components
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10TTECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
    • Y10T29/00Metal working
    • Y10T29/49Method of mechanical manufacture
    • Y10T29/49002Electrical device making
    • Y10T29/49004Electrical device making including measuring or testing of device or component part

Definitions

  • the invention relates to a method for visualizing three-dimensional images on a 3D display device, wherein an image to be visualized is supplied as an input image. Furthermore, the invention relates to a 3D display device, in particular a stereoscopic or autostereoscopic display, for the visualization of three-dimensional images.
  • autostereoscopic visualization systems that allow one or more viewers, who are in front of an autostereoscopic display, to view a three-dimensional image without visual aids.
  • e.g. parallax barrier systems or lenticular lens systems positioned in front of the display panel. Since one or more observers may be at different angles relative to the direction perpendicular to the display, more than two perspectives must always be generated and fed to the respective left and right eyes in order to give each viewer as natural a three-dimensional image impression as possible.
  • These systems are also referred to as multi-viewer systems or multiview systems.
  • the present invention is therefore based on the object of designing and further developing a method for the visualization of three-dimensional images on a 3D display device of the type mentioned such that the visualization of three-dimensional images is improved with simple constructive means. Furthermore, a corresponding 3D display device is to be specified.
  • the above object is solved by the features of claim 1.
  • the present method for visualizing three-dimensional images on a 3D display device is characterized in that at least one feature matrix is determined using the input image, wherein the feature matrices define light/dark information, and in that a display image for reproduction on the 3D display device is generated from the input image using the light/dark information.
  • the 3D display device in question, in particular a stereoscopic or autostereoscopic display for the visualization of three-dimensional images, is characterized in that it has means which determine at least one feature matrix using a supplied input image, wherein the feature matrices define light/dark information, and in that these means generate a display image for reproduction on the 3D display device from the input image using the light/dark information.
  • the input image may comprise two perspectives - partial images - which correspond to a left and a right partial image.
  • the sub-images may be rectified, i.e. brought to stereo normal form.
  • features can be extracted from the input image(s).
  • the features can describe local properties. For example, shapes, textures and / or edges can be considered as local properties.
  • as feature extraction, Sobel feature extraction can be used, for example.
  • a feature matrix comprises edge information.
  • the human brain essentially uses edges on objects to build up the three-dimensional space image in the brain. Consequently, the edge information greatly facilitates the work of the observer's brain and improves the adaptation to the anatomy of the eye and the downstream information processing in the brain.
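As an illustration of such an edge-information feature matrix, here is a minimal Sobel-based sketch in Python/numpy; the function names and the gradient-magnitude choice are assumptions for illustration, not the patent's own implementation:

```python
import numpy as np

# Standard horizontal and vertical Sobel kernels.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve3x3(img, kernel):
    """Centered 3x3 cross-correlation with edge-replicated padding."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for di in range(3):
        for dj in range(3):
            out += kernel[di, dj] * padded[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

def edge_feature_matrix(gray):
    """Gradient magnitude per pixel: one possible edge-information feature matrix M."""
    gx = convolve3x3(gray, SOBEL_X)
    gy = convolve3x3(gray, SOBEL_Y)
    return np.hypot(gx, gy)
```

On a flat image the matrix is zero everywhere; a step edge produces a strong response along the boundary, which is exactly the light/dark information the text describes.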
  • feature extraction can also be carried out using Speeded-Up Robust Features (SURF) feature extraction.
  • H. Bay, T. Tuytelaars and L. Van Gool, "Speeded Up Robust Features", Computer Vision and Image Understanding 110(3), 2008, pp. 346-359.
  • a feature matrix may include information about prominent pixels.
  • a feature value can be assigned by a feature matrix to a pixel of a perspective of the input image as light/dark information. Consequently, feature values can be assigned to each pixel of a perspective of the input image as light/dark information.
  • the feature value can be added to and/or multiplied with the subpixels of the associated pixel in the display image.
  • the feature value can be weighted with a scaling factor.
  • the scaling factor can be changed interactively by a control unit, preferably with a remote control.
  • in a display device with RGB subpixels, the features of an edge operator can be used to emphasize the edges in the RGB subpixels.
  • a pixel e.g. consisting of the RGB subpixels, can be adapted to the anatomy of the eye as follows:
  • R_new(i,j) := R(i,j) + M(i,j,1) · s,
  • G_new(i,j) := G(i,j) + M(i,j,1) · s,
  • B_new(i,j) := B(i,j) + M(i,j,1) · s, where R(i,j), G(i,j) and B(i,j) define the respective colors red, green and blue.
  • M(i,j,1) is the value of the edge operator or edge-information feature matrix for the pixel (i,j).
  • s is a freely adjustable scaling parameter. When controlled via a remote control, each viewer can adjust the edge enhancement to their own preference. This method slightly emphasizes the color values of the stereo image at the edges (without color distortion) and makes them more easily recognizable for the light/dark receptors.
  • if several feature matrices M_l are used, their values can be accumulated with individual weights s_l:
  • R_new(i,j) := R(i,j) + s · Σ_l M_l(i,j) · s_l,
  • G_new(i,j) := G(i,j) + s · Σ_l M_l(i,j) · s_l, and analogously for B_new(i,j).
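The per-channel enhancement above (C_new = C + M · s applied identically to R, G and B) can be sketched as follows; clipping to the valid range is an added assumption, as is the function name:

```python
import numpy as np

def enhance_edges(rgb, M, s=0.5):
    """Add the scaled edge feature to every color channel:
    C_new(i,j) = C(i,j) + M(i,j) * s, clipped to [0, 1].
    rgb: (H, W, 3) float array in [0, 1]; M: (H, W) edge feature matrix;
    s: the freely adjustable scaling parameter from the text."""
    out = rgb + (M * s)[..., None]   # same brightening for R, G, B: no color shift
    return np.clip(out, 0.0, 1.0)
```

Because all three channels receive the same offset, the hue is preserved and only the brightness along edges increases, matching the "not color distorted" property claimed in the text.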
  • the light/dark information can be displayed with additionally supplemented light/dark subpixels in the display image.
  • the light/dark subpixels can significantly improve the viewer's spatial image impression by displaying edge information.
  • an autostereoscopic display could be characterized by a panel of subpixels and an upstream optical element.
  • the subpixels can be both color subpixels such as RGB or CMY and light/dark subpixels.
  • the color information of the subpixels of the perspectives to be displayed can be displayed.
  • the light/dark subpixels can contain, for example, grayscale image features that support the spatial impression of the image.
  • the human eye has about 110 million light/dark receptors and only about 6.5 million color receptors.
  • edge information can be displayed via light/dark subpixels, so that this image information is taken over by the much larger number of light/dark receptors. The work of the brain is relieved.
  • a pseudoholographic display may e.g. contain at least 10 to 20 times as many subpixels as are present in an input image supplied as a stereo image. This greater number of subpixels makes it possible to represent a larger number of pixels per perspective out of the many perspectives that are synthesized.
  • High-definition images and videos of today's generation have about 1920x1080 pixels with 5760 subpixels per line.
  • a display may advantageously have at least 76,800x1080 subpixels. It is taken into account that in the autostereoscopic displays, the assignment of perspectives takes place at the subpixel level. A summary of pixels is not relevant there.
  • a stereoscopic display or an autostereoscopic display can be used as the 3D display device.
  • the input image may have a first perspective and a second perspective, wherein the second perspective is generated by shifting the first perspective by an amount m> 0.
  • the supplied 2D image can be used as a left partial image.
  • the same 2D image, shifted to the right by an amount m > 0, can be used as the right partial image.
  • from this, a disparity matrix can be generated; for a pure horizontal shift it is constant, with the value m at every pixel.
  • the viewer could be allowed to choose the amount m interactively, e.g. by means of a remote control, so that the viewer can adjust the "pop-out" or "pop-in" effect at any time.
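The 2D-to-stereo construction described above (left view = supplied image, right view = a copy shifted right by m, constant disparity m) can be sketched as follows; the wrap-around border behavior of `np.roll` is a simplifying assumption:

```python
import numpy as np

def shifted_stereo_pair(img2d, m):
    """Use a 2D image as the left view and a copy shifted right by m pixels
    as the right view; the resulting disparity is the constant m everywhere."""
    left = img2d
    right = np.roll(img2d, m, axis=1)   # horizontal shift (wraps at the border)
    disparity = np.full(img2d.shape[:2], m, dtype=int)
    return left, right, disparity
```

Changing m (e.g. from a remote control, as the text suggests) uniformly changes the apparent depth of the whole scene, producing the "pop-out"/"pop-in" effect.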
  • the 3D display device may comprise subpixels for representing a color of a predeterminable color system as well as light/dark subpixels for displaying feature information.
  • subpixels can be advantageously designed as independent elements.
  • the subpixels can have the same extent in the horizontal and vertical directions.
  • the subpixels are square, whereby a higher resolution can be achieved.
  • a round design of the subpixels is also conceivable. Thus, the hitherto customary requirement that all subpixels of a pixel together form a square is dropped; rather, each subpixel is an independent element.
  • each of these subpixels has a color of the selected color system and has the same extent in the horizontal and vertical directions. With OLED or nano technology, this is technically easy to implement.
  • FIG. 1 shows a block diagram for illustrating the overall system according to an embodiment of the method according to the invention and the 3D display device according to the invention
  • FIG. 2 is a flowchart of the overall system of FIG. 1,
  • Fig. 3 is a conventional subpixel layout compared to a new one
  • FIG. 5 shows the subpixel layout of FIG. 4, wherein a larger number of different perspectives are activated.
  • FIG. 1 shows a block diagram for illustrating the overall system according to an exemplary embodiment of the method according to the invention and the 3D display device according to the invention.
  • the exemplary embodiment according to FIG. 1 relates to a method for the visualization of three-dimensional images on an autostereoscopic display as 3D display device, on which a plurality of perspectives, generally more than 100, generated from a supplied stereo image in any 3D format, are displayed interlaced.
  • the display consists of an optical element and an image-forming unit. The multitude of perspectives is generated in such a way that only those pixels of a perspective are generated, which also have to be displayed.
  • the imaging unit of the display consists of subpixels each emitting a color, e.g. red, green or blue.
  • the autostereoscopic display is capable of receiving a 3D image or a 3D image sequence in any format, such as a stereo image or a stereo image sequence. Other formats, such as a stereo image including a disparity map, can also be received and processed.
  • a received stereo image is first rectified, i.e. brought to stereo normal form or epipolar standard configuration. If this is already the case, the rectification reduces to the identity mapping.
  • the disparity map of the stereo image is calculated. It contains an assignment of the pixels of the left and right field, which are present in both received perspectives. In addition, the left and right occlusions are identified.
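A brute-force block-matching sketch of such a disparity computation (sum of absolute differences over a small window on rectified grayscale views; a stand-in for whatever matcher the patent intends, and deliberately unoptimized):

```python
import numpy as np

def sad_disparity(left, right, block=3, max_disp=8):
    """For each pixel of the left view, find the horizontal offset d that
    minimizes the sum of absolute differences over a block x block window
    against the right view. Border pixels keep disparity 0."""
    h, w = left.shape
    r = block // 2
    disp = np.zeros((h, w), dtype=int)
    for i in range(r, h - r):
        for j in range(r, w - r):
            best, best_d = np.inf, 0
            for d in range(0, min(max_disp, j - r) + 1):
                cost = np.abs(left[i - r:i + r + 1, j - r:j + r + 1]
                              - right[i - r:i + r + 1, j - d - r:j - d + r + 1]).sum()
                if cost < best:
                    best, best_d = cost, d
            disp[i, j] = best_d
    return disp
```

Occlusion handling (the left/right occlusions mentioned in the text) is omitted here; a real implementation would additionally cross-check left-to-right and right-to-left matches.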
  • any number of perspectives are synthesized. This is done in such a way that only those subpixels are synthesized which actually have to be displayed on the display. Thus, for 100 perspectives to be displayed, only 1% of the subpixels of each perspective are calculated.
  • the information about which perspective is to be displayed on which subpixel is defined in the perspective map P.
  • the perspective map is defined and stored in the production of the display by a calibration process between subpixels and optical system. Adjacent subpixels are generally associated with different perspectives. The storage of different subpixels from the different perspectives in the pseudoholographic image B is referred to as mating.
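The mating step — selecting, per subpixel position, the value from the perspective designated by the perspective map P — can be sketched as follows (array shapes and names are assumptions; for simplicity all views are fully given, even though the text notes only the selected subpixels need to be synthesized):

```python
import numpy as np

def mate(perspectives, pmap):
    """Compose the pseudoholographic image B: each subpixel position takes its
    value from the perspective that the perspective map P assigns to it.
    perspectives: (K, H, W) subpixel values for K views;
    pmap: (H, W) integers in [0, K), fixed at display calibration time."""
    K, H, W = perspectives.shape
    rows, cols = np.indices((H, W))
    return perspectives[pmap, rows, cols]
```

Because `pmap` is fixed by the calibration between subpixels and the optical system, it can be precomputed once and reused for every frame.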
  • the autostereoscopic display is characterized by a panel of subpixels and an upstream optical element.
  • the subpixels are color subpixels, e.g. RGB or CMY.
  • RGB or CMY color subpixels
  • the color information of the subpixels of the perspectives to be displayed is displayed.
  • a pseudoholographic display according to FIG. 1 has at least 10 to 20 times as many subpixels as are present in the received stereo image. This greater number of subpixels makes it possible to represent a larger number of pixels per perspective out of the many perspectives that are synthesized.
  • FIG. 2 shows a flowchart with the individual steps to the overall system from FIG. 1.
  • the steps according to FIG. 1 and FIG. 2 are described in more detail below.
  • a picture sequence of stereo images is received, decoded and made available in the memory areas I_l and I_r by a receiving module, for example via an antenna or the Internet.
  • a display with a resolution of 19,200 x 10,800 pixels can be considered high-resolution.
  • this corresponds to a stereo HD image enlarged tenfold horizontally and vertically.
  • in a first step, the rectification is performed.
  • These methods are known from the literature.
  • nine prominent points, distributed uniformly over the image, are searched for by the SURF method.
  • the coordinate of each prominent point is used as the center of a search block in the right field I_r.
  • within this search block, the most similar point in the right field I_r is searched.
  • for calculating features, e.g. the SURF (Speeded-Up Robust Features) method or the Sobel edge detector can be used.
  • the SURF method is based on approximating the determinant of the Hessian matrix for each pixel.
  • the procedure is as follows: let NZ be the number of rows and NS the number of columns. For each pixel (i,j), the determinant of the Hessian is approximated as
  • M_r(i,j,1) := D_xx(i,j) · D_yy(i,j) − 0.81 · D_xy(i,j) · D_xy(i,j), where D_xx, D_yy and D_xy are the second-derivative filter responses and 0.81 = 0.9² is the usual SURF weighting factor.
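SURF approximates the determinant of the Hessian per pixel as D_xx · D_yy − 0.81 · D_xy², with 0.81 = 0.9². A minimal numpy sketch, with plain finite differences standing in for SURF's box filters (an assumption, not the patent's implementation):

```python
import numpy as np

def hessian_determinant(gray):
    """Approximate per-pixel Hessian determinant in the spirit of SURF:
    det(H) ~ Dxx * Dyy - 0.81 * Dxy * Dxy   (0.81 = 0.9**2).
    Finite differences are used here instead of SURF's box filters."""
    Dxx = np.zeros_like(gray); Dyy = np.zeros_like(gray); Dxy = np.zeros_like(gray)
    Dxx[:, 1:-1] = gray[:, 2:] - 2 * gray[:, 1:-1] + gray[:, :-2]
    Dyy[1:-1, :] = gray[2:, :] - 2 * gray[1:-1, :] + gray[:-2, :]
    Dxy[1:-1, 1:-1] = (gray[2:, 2:] - gray[2:, :-2]
                       - gray[:-2, 2:] + gray[:-2, :-2]) / 4.0
    return Dxx * Dyy - 0.81 * Dxy * Dxy
```

Pixels where the determinant is large in magnitude are the prominent (blob-like) points that the rectification step searches for.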
  • the Sobel operator is just one of a large number of edge operators and is therefore described as an example.
  • an edge operator is of particular importance, as it helps to assign more importance to edges than to smooth surfaces. Since an edge is always a regional property, this procedure within a row also allows the properties of local regions to be taken into account.
  • the Sobel-Prewitt operator works with 3x3 matrices that detect edges in different directions; essentially, horizontal, vertical, left-diagonal and right-diagonal edges are distinguished. For their detection, the standard 3x3 direction kernels are used.
  • each row i is assigned a row calculation unit i.
  • the computation-local field edge(1) to edge(9) is filled from the rectified right partial image R_r as follows:
  • Δ := 2 · edge(2) + 2 · edge(5) + 2 · edge(8)
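The per-direction responses on a 3x3 neighborhood edge(1)..edge(9) might look as follows; the concrete kernels are one standard Sobel-Prewitt choice, since the patent text does not reproduce the matrices here:

```python
import numpy as np

# Standard Sobel direction kernels for horizontal, vertical and the two
# diagonal edge orientations (a common choice; an assumption, not the
# patent's exact matrices).
KERNELS = {
    "horizontal": np.array([[ 1,  2,  1], [ 0, 0,  0], [-1, -2, -1]], dtype=float),
    "vertical":   np.array([[ 1,  0, -1], [ 2, 0, -2], [ 1,  0, -1]], dtype=float),
    "diag_left":  np.array([[ 2,  1,  0], [ 1, 0, -1], [ 0, -1, -2]], dtype=float),
    "diag_right": np.array([[ 0,  1,  2], [-1, 0,  1], [-2, -1,  0]], dtype=float),
}

def direction_responses(edge):
    """edge: the 3x3 neighborhood edge(1)..edge(9) of one pixel, numbered
    row-wise. Returns the absolute response for each direction kernel."""
    block = np.asarray(edge, dtype=float).reshape(3, 3)
    return {name: abs((k * block).sum()) for name, k in KERNELS.items()}
```

Each row-calculation unit mentioned in the text would evaluate these responses for every pixel of its row and keep the strongest one as the edge feature.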
  • FIG. 3 shows on the left a conventional pixel layout with the three subpixels R (red), G (green) and B (blue). With these subpixels, the three perspectives 1, 2 and 3 are driven using a lenticular lens as optical element O.
  • FIG. 3 shows a new subpixel layout on the right, in which the autonomous subpixels have a square design according to an embodiment of the 3D display device according to the invention as an autostereoscopic display. With 9 subpixels, 9 perspectives can be driven by the optical element O.
  • FIG. 4 once again shows a conventional pixel layout on the left.
  • Fig. 4 right shows another embodiment of a 3D display device according to the invention as an autostereoscopic display.
  • a much finer and more detailed subpixel structure is created.
  • in the subpixel layout of the embodiment, 144 subpixels are generated.
  • to the subpixels R (red), G (green) and B (blue), a further subpixel W (e.g. white or yellow) for the presentation of light/dark information is added.
  • with these 144 subpixels, 36 perspectives are driven in the illustrated embodiment.
  • FIG. 5 shows the subpixel layout of FIG. 4, wherein the 144 individual, independent subpixels are used to drive 144 perspectives.
  • the procedure for an autostereoscopic display can be as follows.
  • the resolution of the human eye is in general between 0.5' and 1.5' (arc minutes). Therefore, today's displays usually have a dot pitch of 0.2 to 0.3 mm; that is, from a distance of about 1 m, the pixels of the display are no longer visible.
  • the lens width of the lens grid used is in the range of 0.2 mm. That is about 125 LPI (lenses per inch).
  • the lens structure is no longer recognizable from a viewing distance of about 1 m.
  • the number of subpixels behind a lens is on the order of 10 subpixels per lens. That is, the dot pitch of a pseudoholographic display is on the order of about 0.06 mm. While conventional displays have e.g. 1920 x 1080 pixels (HD-TV), a pseudoholographic display presented here consists of at least 19,200 x 1080 pixels.
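The quoted figures are consistent with simple arithmetic (input values taken from the text above; Python is used here merely as a calculator):

```python
# A ~0.2 mm lens with about 10 subpixels behind it gives a subpixel pitch of
# about 0.02 mm, i.e. a pixel pitch (3 subpixels) of about 0.06 mm, and 10x
# the horizontal pixel count of an HD panel.
lens_width_mm = 0.2
subpixels_per_lens = 10
subpixel_pitch_mm = lens_width_mm / subpixels_per_lens   # ~0.02 mm
pixel_pitch_mm = 3 * subpixel_pitch_mm                   # ~0.06 mm
lenses_per_inch = 25.4 / lens_width_mm                   # ~127, i.e. about 125 LPI
hd_columns = 1920
pseudoholographic_columns = 10 * hd_columns              # 19,200
print(pixel_pitch_mm, lenses_per_inch, pseudoholographic_columns)
```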
  • the lenticular may consist, for example, of lenticular lenses or hexagonal lenses, which in this case have a diameter of 0.2 mm.
  • the human brain uses edge information to a great extent for the generation of the inner spatial image. Edges give information about the front/back relationship of objects via the existing right and left occlusions.
  • particularly advantageously, the known RGB or CMY subpixels are supplemented by light/dark subpixels which display the edge information generated in the feature-extraction phase by the edge operators.
  • homogeneous surfaces contain no edges; in this case the light/dark subpixels display no information in the image.
  • Edges are brightened in the light / dark subpixels according to the intensity of the detected edge. As a result, the edges present in the image are highlighted and more easily recognized by the 100 million light / dark receptors.
  • this makes it easier for the brain to create the inner spatial image. Patterns on homogeneous surfaces are recognized as such in the human brain due to the learning effect and do not affect the spatial impression.
  • the geometric arrangement of the light/dark subpixels can be varied according to the invention.
  • in Fig. 3, Fig. 4 and Fig. 5, various divisions are shown.
  • further features for spatial image generation can be added.
  • the SURF operator is mentioned here as one example, but not the only one.
  • each color or light/dark subpixel is in itself a self-contained light element associated with a particular perspective, and has the same extent in the horizontal and vertical directions. This is already taken into account in FIGS. 3, 4 and 5 on the right-hand side. Backward compatibility is still given, so that all 2D images and videos can be displayed without difficulty.
  • the production of such an autostereoscopic display is possible based on the OLED technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
EP11720977A 2010-02-25 2011-02-25 Verfahren zur visualisierung von dreidimensionalen bildern auf einer 3d-anzeigevorrichtung und 3d-anzeigevorrichtung Withdrawn EP2540089A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102010009291A DE102010009291A1 (de) 2010-02-25 2010-02-25 Verfahren und Vorrichtung für ein anatomie-adaptiertes pseudoholographisches Display
PCT/DE2011/000187 WO2011103866A2 (de) 2010-02-25 2011-02-25 Verfahren zur visualisierung von dreidimensionalen bildern auf einer 3d-anzeigevorrichtung und 3d-anzeigevorrichtung

Publications (1)

Publication Number Publication Date
EP2540089A2 true EP2540089A2 (de) 2013-01-02

Family

ID=44149833

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11720977A Withdrawn EP2540089A2 (de) 2010-02-25 2011-02-25 Verfahren zur visualisierung von dreidimensionalen bildern auf einer 3d-anzeigevorrichtung und 3d-anzeigevorrichtung

Country Status (6)

Country Link
US (4) US9324181B2 (ja)
EP (1) EP2540089A2 (ja)
JP (3) JP6142985B2 (ja)
KR (2) KR101825759B1 (ja)
DE (1) DE102010009291A1 (ja)
WO (3) WO2011103866A2 (ja)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010009291A1 (de) * 2010-02-25 2011-08-25 Expert Treuhand GmbH, 20459 Verfahren und Vorrichtung für ein anatomie-adaptiertes pseudoholographisches Display
US9420268B2 (en) 2011-06-23 2016-08-16 Lg Electronics Inc. Apparatus and method for displaying 3-dimensional image
EP2645724A4 (en) * 2011-11-11 2014-08-06 Sony Corp SENDING DEVICE, TRANSMISSION PROCEDURE, RECEPTION DEVICE AND RECEPTION PROCEDURE
US20130293547A1 (en) * 2011-12-07 2013-11-07 Yangzhou Du Graphics rendering technique for autostereoscopic three dimensional display
KR101349784B1 (ko) * 2012-05-08 2014-01-16 엘지디스플레이 주식회사 기능성 패널 합착용 지지부재, 이를 구비한 표시소자 및 표시소자 제조방법
TWI637348B (zh) * 2013-04-11 2018-10-01 緯創資通股份有限公司 影像顯示裝置和影像顯示方法
ITTO20130784A1 (it) 2013-09-30 2015-03-31 Sisvel Technology Srl Method and device for edge shape enforcement for visual enhancement of depth image based rendering
CN103680325A (zh) * 2013-12-17 2014-03-26 京东方科技集团股份有限公司 显示基板、显示面板和立体显示装置
CN105787877B (zh) * 2016-02-18 2018-11-23 精实万维软件(北京)有限公司 一种动态多图排版方法和装置
US20200173619A1 (en) * 2018-12-04 2020-06-04 Sony Interactive Entertainment LLC Fluid display device
US11056081B2 (en) * 2019-08-09 2021-07-06 Wuhan China Star Optoelectronics Semiconductor Display Technology Co., Ltd. Display panel and display device

Family Cites Families (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0646461A (ja) * 1992-06-26 1994-02-18 Matsushita Electric Ind Co Ltd 平板型立体画像表示装置およびその製造方法
US5457574A (en) * 1993-05-06 1995-10-10 Dimension Technologies Inc. Autostereoscopic display with high power efficiency
JP2944850B2 (ja) * 1993-05-25 1999-09-06 Sharp Corp Three-dimensional display device
JPH08220477A (ja) * 1995-02-13 1996-08-30 Toppan Printing Co Ltd Method and apparatus for manufacturing lenticular displays
JP2000081918A (ja) * 1998-09-04 2000-03-21 Canon Inc Heating device
JP3593466B2 (ja) * 1999-01-21 2004-11-24 Nippon Telegraph & Telephone Corp Virtual viewpoint image generation method and apparatus
FR2798708B1 (fr) 1999-09-17 2001-11-16 Snfa Hybrid angular-contact ball bearing, and axial thrust bearing comprising it
WO2001056265A2 (de) 2000-01-25 2001-08-02 4D-Vision Gmbh Method and arrangement for spatial representation
US20020075566A1 (en) * 2000-12-18 2002-06-20 Tutt Lee W. 3D or multiview light emitting display
US7103234B2 (en) * 2001-03-30 2006-09-05 Nec Laboratories America, Inc. Method for blind cross-spectral image registration
GB0119176D0 (en) * 2001-08-06 2001-09-26 Ocuity Ltd Optical switching apparatus
JP2003339060A (ja) * 2002-05-20 2003-11-28 Denso Corp Vehicle rear and side image display device
JP2007072476A (ja) * 2002-07-29 2007-03-22 Sharp Corp Method for manufacturing a substrate with a parallax barrier layer
AU2002952874A0 (en) * 2002-11-25 2002-12-12 Dynamic Digital Depth Research Pty Ltd 3D image synthesis from depth encoded source view
JP4042615B2 (ja) * 2003-04-17 2008-02-06 Denso Corp Image processing method and image processing device
AU2004277273A1 (en) * 2003-09-22 2005-04-07 Gene Dolgoff Omnidirectional lenticular and barrier-grid image display
JP4839598B2 (ja) * 2003-10-30 2011-12-21 Brother Industries Ltd Image display device
JP4402578B2 (ja) * 2004-03-31 2010-01-20 Japan Science & Technology Agency Three-dimensional display
JP4440066B2 (ja) * 2004-10-14 2010-03-24 Canon Inc Stereoscopic image generation program, stereoscopic image generation system, and stereoscopic image generation method
JP4327758B2 (ja) * 2005-03-24 2009-09-09 Toshiba Corp Stereoscopic image display device
US7787702B2 (en) * 2005-05-20 2010-08-31 Samsung Electronics Co., Ltd. Multiprimary color subpixel rendering with metameric filtering
JP4532372B2 (ja) * 2005-09-02 2010-08-25 Toyota Motor Corp Road lane marking detection device
JP2007081635A (ja) * 2005-09-13 2007-03-29 Sanyo Epson Imaging Devices Corp Display device
JP2007110303A (ja) * 2005-10-12 2007-04-26 Matsushita Electric Ind Co Ltd Contour correction device
US8358330B2 (en) * 2005-10-21 2013-01-22 True Vision Systems, Inc. Stereoscopic electronic microscope workstation
JP4337823B2 (ja) * 2005-11-04 2009-09-30 Seiko Epson Corp Printer and printing method
TWI294750B (en) * 2005-11-21 2008-03-11 Whe Yi Chiang Three dimensional organic electroluminescent display
DE102006019169A1 (de) 2006-04-21 2007-10-25 Expert Treuhand Gmbh Autostereoscopic adapter disc with real-time image synthesis
JP5018778B2 (ja) * 2006-08-23 2012-09-05 Fujitsu Ltd Display element, and electronic paper and electronic terminal using the same
JP4933553B2 (ja) * 2006-09-07 2012-05-16 Sharp Corp Image display device, electronic apparatus, and parallax barrier element
WO2008050904A1 (fr) * 2006-10-25 2008-05-02 Tokyo Institute Of Technology Method for generating an image in a high-resolution virtual focal plane
JP2008112060A (ja) * 2006-10-31 2008-05-15 Seiko Epson Corp Display device
JP2008216971A (ja) * 2007-02-08 2008-09-18 Seiko Epson Corp Image display device
KR101058092B1 (ko) * 2007-02-13 2011-08-24 Samsung Electronics Co Ltd Subpixel layouts and subpixel rendering methods for directional display devices and display systems
GB0708676D0 (en) * 2007-05-04 2007-06-13 Imec Inter Uni Micro Electr A Method for real-time/on-line performing of multi view multimedia applications
JP5263751B2 (ja) * 2007-06-07 2013-08-14 Ritsumeikan Trust Single-screen display device
JP5115840B2 (ja) * 2007-08-23 2013-01-09 Tokyo University of Agriculture and Technology Stereoscopic display device
JP2009080144A (ja) * 2007-09-25 2009-04-16 Toshiba Corp Stereoscopic video display device and stereoscopic video display method
JP2009162620A (ja) * 2008-01-07 2009-07-23 Toshiba Corp Inspection device and method
US8072448B2 (en) * 2008-01-15 2011-12-06 Google Inc. Three-dimensional annotations for street view data
GB2457692A (en) * 2008-02-21 2009-08-26 Sharp Kk A display device with a plurality of viewing modes
JP4987767B2 (ja) * 2008-03-18 2012-07-25 Toshiba Corp Manufacturing apparatus and manufacturing method for a three-dimensional image display device
JP2010033447A (ja) * 2008-07-30 2010-02-12 Toshiba Corp Image processing device and image processing method
KR101476219B1 (ko) * 2008-08-01 2014-12-24 Samsung Display Co Ltd Method of manufacturing a display device and apparatus for manufacturing a display device using the same
KR20100033067A (ko) * 2008-09-19 2010-03-29 Samsung Electronics Co Ltd Image display device and method for both 2D and 3D use
JP4625517B2 (ja) * 2008-10-27 2011-02-02 Fujifilm Corp Three-dimensional display device, method, and program
BRPI0922046A2 (pt) * 2008-11-18 2019-09-24 Panasonic Corp Playback device, playback method, and program for stereoscopic playback
BRPI1005134B1 (pt) * 2009-01-20 2021-07-20 Koninklijke Philips N.V. Method of transferring three-dimensional [3D] image data, 3D generation device for transferring 3D image data to a 3D display device, and 3D display signal
KR20110129903A (ko) * 2009-02-18 2011-12-02 Koninklijke Philips Electronics N.V. Transfer of 3D viewer metadata
KR20120039712A (ko) * 2009-07-13 2012-04-25 Kenji Yoshida Parallax barrier for an autostereoscopic display, autostereoscopic display, and method for designing a parallax barrier for an autostereoscopic display
DE102010009291A1 (de) * 2010-02-25 2011-08-25 Expert Treuhand GmbH, 20459 Method and device for an anatomy-adapted pseudo-holographic display

Also Published As

Publication number Publication date
US9396579B2 (en) 2016-07-19
US20160205393A1 (en) 2016-07-14
JP6060329B2 (ja) 2017-01-18
WO2011103867A1 (de) 2011-09-01
WO2011103866A3 (de) 2012-03-01
US20130135720A1 (en) 2013-05-30
WO2011103865A2 (de) 2011-09-01
JP6142985B2 (ja) 2017-06-07
JP6278323B2 (ja) 2018-02-14
JP2013520890A (ja) 2013-06-06
JP2013527932A (ja) 2013-07-04
KR101825759B1 (ko) 2018-03-22
KR20130036198A (ko) 2013-04-11
US9324181B2 (en) 2016-04-26
WO2011103865A3 (de) 2012-02-09
US20160300517A1 (en) 2016-10-13
US10229528B2 (en) 2019-03-12
WO2011103866A2 (de) 2011-09-01
KR20130008555A (ko) 2013-01-22
KR101852209B1 (ko) 2018-04-25
JP2017078859A (ja) 2017-04-27
DE102010009291A1 (de) 2011-08-25
US10134180B2 (en) 2018-11-20
US20130147804A1 (en) 2013-06-13

Similar Documents

Publication Publication Date Title
EP2540089A2 (de) Method for visualizing three-dimensional images on a 3D display device, and 3D display device
EP2027728B1 (de) Method and device for pseudo-holographic image generation
DE102013113542B4 (de) Autostereoscopic multi-view display and method for controlling its optimal viewing distances
DE10145133C1 (de) Method for spatial representation
DE112012000563B4 (de) Method and device for transmitting/receiving a digital transmission signal
DE102014205519A1 (de) Method and device for adapting a display of an autostereoscopic display for a vehicle
DE102010028668B4 (de) Method for spatial representation
DE112013004718B4 (de) Image processing device and method, and program, printer, and display device
EP3170307B1 (de) Method for displaying a three-dimensional scene on an autostereoscopic monitor
AT513369A2 (de) Method for generating, transmitting, and receiving stereoscopic images, and associated devices
DE202006013777U1 (de) Autostereoscopic display device
DE102009013912A1 (de) Method and arrangement for spatial representation
WO2004023348A1 (de) Method for simulating optical components for the stereoscopic generation of spatial impressions
DE102010021550B4 (de) Image reproduction device and method for image reproduction
EP2478705A1 (de) Method and device for generating partial views and/or a stereoscopic image master from a 2D view for stereoscopic reproduction
DE112015006086T5 (de) Image processing device, image display device, and image processing method
EP2561682A1 (de) Simultaneous reproduction of a plurality of images by means of a two-dimensional image display matrix
DE102005036744B4 (de) Method and device for the autostereoscopic reproduction of 3D representations
DE102006012059B3 (de) Method for the autostereoscopic generation of three-dimensional image information from rastered subpixel extracts, with blanking of individual subpixels
DE112012002679T5 (de) Device and method for encoding/decoding multi-view images
WO2007085482A1 (de) Method for generating and displaying spatially perceptible images
WO2020141133A1 (de) Autostereoscopic display
EP2495978A1 (de) Image rendering method for an autostereoscopic display
DE102007060461A1 (de) Arrangement for spatial representation
DE102015000178A1 (de) Method for controlling a screen, display device, and add-on module for a display device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120921

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: EXPERT TREUHAND GMBH

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20140626

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150901