WO2016010246A1 - 3D image display device and method - Google Patents

3D image display device and method

Info

Publication number
WO2016010246A1
WO2016010246A1 (PCT/KR2015/004822, KR2015004822W)
Authority
WO
WIPO (PCT)
Prior art keywords
depth
image
irregular
pixel
pixels
Prior art date
Application number
PCT/KR2015/004822
Other languages
English (en)
Korean (ko)
Inventor
샤오후이지아오
밍차이조우
타오홍
웨이밍리
하이타오왕
남동경
씨잉왕
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201410338749.7A external-priority patent/CN105323573B/zh
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to US15/326,594 priority Critical patent/US10666933B2/en
Publication of WO2016010246A1 publication Critical patent/WO2016010246A1/fr

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers

Definitions

  • the present invention relates to a 3D image display apparatus and method, and more particularly, to a 3D image display apparatus and method for displaying a 3D image using irregular pixels.
  • The naked-eye 3D display system includes a display panel that displays a high-resolution 2D interlaced image and a beam direction modulator (e.g., a micro-lens array (MLA)) that refracts the interlaced image in different directions.
  • Adjacent pixels display image information for different viewing angles, so the individual images must be separated by refraction through the lens for a clear 3D image to be seen.
  • A so-called crosstalk phenomenon occurs in which light rays from adjacent pixels interfere with each other, causing the 3D images to overlap and the resolution to drop.
  • In addition, light emitted from a pixel may diffuse as it propagates and the width of the beam may change, causing adjacent pixels to interfere with each other and degrading the resolution quality.
  • Moreover, the DOF (depth of field) of the 3D image is limited by the propagation characteristics of physical light beams, so the resolution differs from one depth plane to another.
  • According to one aspect, a 3D image display apparatus may include: a depth layer divider configured to divide a first 3D image into a plurality of depth layers; a pixel determiner that determines an irregular pixel corresponding to each of the depth layers; a 3D image generator that generates a second 3D image corresponding to each of the depth layers using the determined irregular pixels; and a 3D image synthesizer configured to synthesize the second 3D images.
  • the depth layer dividing unit may divide the first 3D image into a plurality of depth layers by using a depth peeling algorithm.
  • the pixel determination unit may set a plurality of depth planes based on the optical characteristics of the microlens array, and determine irregular pixels respectively corresponding to the depth layers according to the depth planes to which the depth layers belong.
  • The 3D image display apparatus may further include a contour detail feature extractor, which extracts a contour detail feature from the first 3D image and determines a frequency direction and a frequency magnitude of the contour detail feature.
  • The pixel determiner may determine the irregular pixel corresponding to each of the depth layers based on at least one of the frequency direction of the contour detail feature corresponding to the depth layers, the frequency magnitude of the contour detail feature corresponding to the depth layers, and the depth plane to which each depth layer belongs.
  • The 3D image generator renders a plurality of multi-view images using the irregular pixels respectively corresponding to the determined depth layers, based on multi-view image information, and rearranges the pixels of the rendered multi-view images to generate the second 3D images respectively corresponding to the depth layers.
  • the multi-view image information may include at least one of position information of a viewpoint and field angle information of a gaze.
  • The 3D image synthesizer determines the front-to-back positional relationship in the depth direction for the different parts of the plurality of second 3D images respectively corresponding to the depth layers generated by the 3D image generator, and synthesizes the second 3D images in order, starting from the layer with the greatest depth, according to that relationship.
  • The pixel determiner may select at least one irregular pixel for each depth layer from among a plurality of preset irregular pixels, where an irregular pixel is a pixel block composed of a plurality of adjacent regular pixels or sub-pixels and may have a shape and size different from a regular pixel.
  • a 3D image display method includes dividing a first 3D image into a plurality of depth layers; Determining an irregular pixel corresponding to each of the depth layers; Generating second 3D images corresponding to the depth layers using irregular pixels respectively corresponding to the determined depth layers; And synthesizing the generated second 3D images.
  • the first 3D image may be divided into a plurality of depth layers by using a depth peeling algorithm.
  • The determining of the irregular pixels may include setting a plurality of depth planes based on the optical characteristics of the microlens array, and determining the irregular pixels respectively corresponding to the depth layers according to the depth planes to which the depth layers belong.
  • the method may further include extracting a contour detail feature from the first 3D image and determining a frequency direction and a frequency magnitude of the contour detail feature.
  • The determining of the irregular pixel may determine the irregular pixel corresponding to each of the depth layers based on at least one of the frequency direction of the contour detail feature corresponding to each depth layer, the frequency magnitude of that feature, and the depth plane to which each depth layer belongs.
  • The generating of the second 3D images may include rendering a plurality of multi-view images using the irregular pixels respectively corresponding to the determined depth layers, based on multi-view image information, and rearranging the pixels of the rendered multi-view images to generate the second 3D images respectively corresponding to the depth layers.
  • the multi-view image information may include at least one of position information of a viewpoint and field angle information of a gaze.
  • The synthesizing of the second 3D images may determine the front-to-back positional relationship in the depth direction for the different portions of the plurality of second 3D images corresponding to the depth layers, and form the final 3D image by synthesizing them in order, starting from the deepest layer, according to the determined relationship.
  • The determining of the irregular pixel may include selecting at least one irregular pixel for each depth layer from a plurality of preset irregular pixels, where an irregular pixel is a pixel block composed of a plurality of adjacent regular pixels or sub-pixels and may have a shape and size different from a regular pixel.
  • FIG. 1 illustrates an example of an MTF for estimating display resolution.
  • FIG. 2 is a block diagram illustrating a 3D image display apparatus according to an exemplary embodiment.
  • FIG. 3 shows exemplary results of depth peeling according to one embodiment.
  • FIG. 4 illustrates an example of an irregular pixel, according to an embodiment.
  • FIG. 5A is an example of a conventional 3D image.
  • FIG. 5B is an example of a 3D image generated by a 3D image display apparatus according to an embodiment.
  • FIG. 6 illustrates a principle of reducing crosstalk by displaying a 3D image using irregular pixels, according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating a 3D image display method according to an exemplary embodiment.
  • FIG. 8 is a graph illustrating an MTF value obtained through a display test using different irregular pixels in a 3D image display apparatus according to an exemplary embodiment.
  • MTF: Modulation Transfer Function
  • FIG. 1 is a graph showing MTF of a 3D image displayed using a sinusoidal grating.
  • the X-axis direction of the graph represents the depth value of the 3D image
  • the Y-axis direction represents the frequency
  • the Z-axis direction represents the MTF value.
  • Sinusoidal gratings of different frequencies are displayed at different depth planes.
  • The resolution may be evaluated relative to a central depth plane (CDP): resolution is highest on the CDP, and the farther a plane is from the CDP, the lower the resolution, with crosstalk errors and blurring appearing. In other words, the DOF of the 3D image displayed by a naked-eye 3D display system is limited by the propagation characteristics of physical rays.
  • CDP: central depth plane
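As context for the MTF curves described above, the MTF of a displayed sinusoidal grating can be estimated as its Michelson contrast. A minimal sketch; the helper name and the sample gratings are illustrative, not from the patent:

```python
import math

def mtf_from_grating(samples):
    """Michelson contrast of a displayed sinusoidal grating.

    MTF = (I_max - I_min) / (I_max + I_min): it approaches 1 for a
    perfectly reproduced grating and falls toward 0 as blur or
    crosstalk washes the pattern out (i.e., away from the CDP).
    """
    i_max, i_min = max(samples), min(samples)
    return (i_max - i_min) / (i_max + i_min)

# 8 full cycles sampled at 512 points (period of 64 samples)
sharp = [0.5 + 0.5 * math.sin(2 * math.pi * k / 64) for k in range(512)]    # full contrast at the CDP
blurred = [0.5 + 0.1 * math.sin(2 * math.pi * k / 64) for k in range(512)]  # contrast lost off the CDP
```

Here the sharp grating yields an MTF of 1.0 and the attenuated one an MTF of 0.2, mirroring how the graph of FIG. 1 falls off away from the CDP.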
  • The 3D image display apparatus 100 may include a depth layer divider 110, a pixel determiner 120, a 3D image generator 130, and a 3D image synthesizer 140.
  • The depth layer divider 110 may divide the first 3D image to be displayed into a plurality of depth layers. In an embodiment, the depth layer divider 110 may divide the first 3D image into a predetermined number of depth layers using a depth peeling algorithm.
  • Specifically, the depth layer divider 110 converts each pixel included in the first 3D image into a fragment: a pixel with horizontal coordinate x and vertical coordinate y becomes a fragment with horizontal coordinate x, vertical coordinate y, and depth coordinate z. The depth coordinate z of a fragment is the depth value of the corresponding pixel (hereinafter, the depth value of the fragment).
  • The depth layer divider 110 then sorts the fragments in the depth direction by their depth values and generates and outputs the preset number of depth layers from the sorted result. For example, if the maximum depth value of the 3D image is 4 and the minimum is -4, and the image is to be divided into four depth layers, fragments with depth values in [2, 4] are assigned to the first depth layer, [0, 2] to the second, [-2, 0] to the third, and [-4, -2] to the fourth.
  • The number of depth layers and the DOF of each depth layer are not limited to the above example and may be set as needed. For example, fragments with depth values in [3, 4] may form the first depth layer, [0, 3] the second, [-3, 0] the third, and [-4, -3] the fourth.
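The layer split in the examples above can be sketched as uniform binning of fragments by depth value. This is an illustrative stand-in, not the patent's actual depth peeling implementation, which it does not spell out:

```python
def split_into_depth_layers(fragments, num_layers, d_min, d_max):
    """Bin (x, y, z) fragments into num_layers layers by depth z.

    Layer 0 holds the largest depth values, mirroring the example:
    with d_min = -4, d_max = 4 and 4 layers, z in (2, 4] goes to
    layer 0, (0, 2] to layer 1, (-2, 0] to layer 2, [-4, -2] to layer 3.
    """
    span = (d_max - d_min) / num_layers
    layers = [[] for _ in range(num_layers)]
    for frag in fragments:
        z = frag[2]
        idx = int((d_max - z) / span)           # 0 = closest to d_max
        idx = min(max(idx, 0), num_layers - 1)  # clamp boundary values
        layers[idx].append(frag)
    return layers

frags = [(0, 0, 3.0), (1, 0, 1.0), (2, 0, -1.0), (3, 0, -3.5)]
layers = split_into_depth_layers(frags, 4, -4.0, 4.0)
```

Non-uniform layer boundaries, as in the second example, would replace the uniform `span` with an explicit list of depth intervals.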
  • FIG. 3 shows exemplary results of depth peeling according to one embodiment.
  • A plurality of depth layers, shown in FIG. 3(b), are formed; in each depth layer, the portion of the 3D image corresponding to that layer may appear.
  • The above description has the depth layer divider 110 divide the 3D image to be displayed into a plurality of depth layers using the depth peeling algorithm, but embodiments are not limited thereto; the depth layers may be divided by other methods. For example, a saliency map method may be used to segment depth.
  • the pixel determiner 120 may determine irregular pixels respectively corresponding to the depth layers divided by the depth layer divider 110.
  • Specifically, a plurality of depth planes (that is, a plurality of DOFs) may be set based on the optical characteristics of the light direction modulator of the 3D image display apparatus 100, i.e., its microlens array.
  • For example, the CDP shown in FIG. 1 is set as the first depth plane, and a plane at a depth other than the CDP but within the display DOF of the 3D image display apparatus 100 is set as the second depth plane.
  • the pixel determiner 120 may determine a depth plane to which the depth layer belongs based on the DOF of each depth layer, and determine an irregular pixel corresponding to each of the depth layers based on the depth plane to which each depth layer belongs.
  • As an example, consider an arbitrary depth layer L among the divided depth layers.
  • If the depth layer L belongs to the first depth plane (i.e., the CDP), the image displayed on that plane has the highest resolution, so the portion of the 3D image corresponding to this depth layer can be displayed using existing regular pixels.
  • If the depth layer L belongs to the second depth plane (i.e., a plane at a depth other than the CDP but within the display DOF of the 3D image display apparatus 100) or to the third depth plane, the pixel determiner 120 may determine an irregular pixel corresponding to the depth layer L.
  • the irregular pixel is a pixel block composed of a plurality of adjacent regular pixels or sub-pixels and has a shape and size different from that of the regular pixels.
  • the plurality of irregular pixels may be preset, and the pixel determiner 120 may select at least one irregular pixel for the depth layer L from the plurality of preset irregular pixels.
  • For example, the pixel determiner 120 may determine which irregular pixels to use by comparing the resolutions obtained when the 3D image portion corresponding to the depth layer L is displayed using different irregular pixels.
  • Alternatively, the pixel determiner 120 may compare the resolutions obtained when the 3D image portion corresponding to the depth layer L is displayed using combinations of two or more kinds of irregular pixels among the plurality of preset irregular pixels, and select which irregular pixels to use accordingly.
  • Alternatively, the pixel determiner 120 may combine adjacent regular pixels or sub-pixels to form a plurality of candidate irregular pixels of different shapes and sizes, check the resolution obtained when each candidate is used to display the 3D image portion corresponding to the depth layer L, and determine the candidate that yields the highest resolution as the irregular pixel finally used for the depth layer L.
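The candidate-selection step described above amounts to scoring each candidate irregular pixel and keeping the best. A sketch, where `measure_resolution` is a hypothetical stand-in for whatever display-resolution test (e.g., an MTF measurement) the pipeline provides:

```python
def choose_irregular_pixel(candidates, layer, measure_resolution):
    """Return the candidate pixel block that scores highest.

    candidates: list of pixel-block descriptors (shape/size of adjacent
    regular pixels or sub-pixels); measure_resolution: caller-supplied
    metric of the resolution shown when `layer` is displayed with a
    given candidate. Both are placeholders, not the patent's API.
    """
    return max(candidates, key=lambda c: measure_resolution(layer, c))

# Toy metric: prefer blocks whose width matches an assumed blur radius
candidates = [(1, 1), (2, 1), (2, 2)]       # (width, height) in sub-pixels
blur = 2
score = lambda layer, c: -abs(c[0] - blur)  # closer width -> higher score
```

With this toy metric, `choose_irregular_pixel(candidates, None, score)` picks the first 2-sub-pixel-wide candidate; a real system would score candidates by the measured display resolution instead.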
  • Above, an example of determining irregular pixels by the depth application method (that is, determining the irregular pixel corresponding to each depth layer based on the depth plane to which each depth layer belongs) has been described, but embodiments are not limited thereto.
  • The irregular pixels of each depth layer may also be determined using a frequency application method, that is, according to the frequency characteristics (frequency direction and magnitude) of the 3D image to be displayed.
  • Specifically, the 3D image display apparatus 100 may further include a contour detail feature extractor (not shown), which extracts the contour detail feature from the first 3D image and determines its direction (i.e., the frequency direction) and frequency magnitude.
  • The pixel determiner 120 may determine the irregular pixel corresponding to each depth layer based on at least one of the direction of the contour detail feature corresponding to that depth layer (that is, the contour detail feature in which the pixels of the depth layer appear) and its frequency magnitude, so that the irregular pixel has a different width in different frequency directions (e.g., pixel width inversely proportional to frequency magnitude).
  • the pixel determiner 120 may determine an irregular pixel corresponding to the depth layers by using both the depth application method and the frequency application method.
  • That is, the pixel determiner 120 may determine the irregular pixel corresponding to each depth layer based on at least one of the direction of the contour detail feature corresponding to that depth layer, its frequency magnitude, and the depth plane to which the depth layer belongs.
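One way to realize the "width inversely proportional to frequency magnitude" rule mentioned above is a simple reciprocal mapping. The constants here are illustrative assumptions, not values from the patent:

```python
def pixel_width_for_frequency(freq_magnitude, base_width=4, min_width=1):
    """Choose an irregular-pixel width from the local frequency magnitude.

    High-frequency contour detail gets narrow pixels (fine sampling);
    low-frequency regions tolerate wider pixel blocks. base_width and
    min_width are illustrative sub-pixel counts.
    """
    if freq_magnitude <= 0:
        return base_width            # no detail: use the widest block
    return max(min_width, round(base_width / freq_magnitude))
```

A direction-dependent variant would apply this mapping separately along the frequency direction of each contour detail feature, yielding different widths in different directions.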
  • The depth layer divider 110 may match the DOF of each divided depth layer to a set depth plane at the time the depth layers are divided.
  • The 3D image generator 130 may generate the second 3D images corresponding to the depth layers using the irregular pixels respectively corresponding to the depth layers determined by the pixel determiner 120.
  • Specifically, based on at least one of the multi-view image information of the 3D image display apparatus 100 (viewpoint position information and field-angle information of the viewpoint) and the hardware performance parameter settings of the 3D image display apparatus 100, the 3D image generator 130 renders a plurality of multi-view images (each corresponding to one viewpoint and viewing position) using the irregular pixels respectively corresponding to the determined depth layers, and generates the second 3D image corresponding to each depth layer by rearranging the pixels of the rendered multi-view images.
  • the method of rendering the multi-view images and rearranging pixels for the plurality of rendered multi-view images is a commonly used method in the art, and a detailed description thereof will be omitted.
  • the 3D image synthesizer 140 may synthesize a plurality of second 3D images corresponding to the plurality of depth layers generated by the 3D image generator 130 to form a final 3D image.
  • Specifically, the 3D image synthesizer 140 determines the front-to-back positional relationship in the depth direction for the different portions of the plurality of second 3D images respectively corresponding to the depth layers generated by the 3D image generator 130, and forms the final 3D image by synthesizing the second 3D images in order from the layer with the greatest depth (i.e., covering from back to front) according to the determined relationship.
  • That is, synthesis starts from the second 3D image of the depth layer with the greatest depth, and the image at each position in the final synthesized 3D image is determined by the corresponding position of the second 3D image with the minimum depth among the second 3D images used in the synthesis.
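The back-to-front synthesis described above is essentially the painter's algorithm. A sketch, assuming a larger depth value means farther from the viewer and representing each second 3D image as a sparse dict of opaque pixels (both assumptions for illustration):

```python
def composite_back_to_front(layer_images):
    """Overlay per-layer images from deepest to nearest.

    layer_images: list of (depth, image) pairs, where image maps
    (x, y) -> value and absent keys are transparent. Nearer layers
    overwrite ("cover") deeper ones, so each output position ends up
    showing the layer with minimum depth that is present there.
    """
    final = {}
    for depth, image in sorted(layer_images, key=lambda p: p[0], reverse=True):
        final.update(image)  # nearer layer covers what is behind it
    return final

far = {(0, 0): "far", (1, 0): "far"}
near = {(1, 0): "near"}
result = composite_back_to_front([(1.0, near), (3.0, far)])
```

Position (1, 0) ends up showing the near layer while (0, 0), covered by nothing nearer, keeps the far layer's value, matching the "cover from back to front" rule.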
  • FIG. 5A is an example of a conventional 3D image and FIG. 5B is an example of a 3D image generated by the 3D image display apparatus 100 according to an embodiment; they are shown for comparison.
  • FIG. 5A shows a screen displaying a 3D image with depth using conventional pixels, and FIG. 5B shows a screen displaying the same 3D image using the 3D image display apparatus 100 according to an exemplary embodiment.
  • Compared with the result of displaying the 3D image using conventional regular pixels, the 3D image generated using the 3D image display apparatus 100 according to an embodiment showed higher resolution and an improved display DOF.
  • FIG. 6 illustrates a principle of reducing crosstalk by displaying a 3D image using irregular pixels, according to an exemplary embodiment.
  • The left side of FIG. 6 shows a screen displaying a 3D image using regular pixels. In display devices that display 3D images using conventional regular pixels, hardware design constraints cause the viewer's eye to observe the luminance of two adjacent pixels; because the two pixels may display images of different viewing angles, crosstalk occurs.
  • The right side of FIG. 6 illustrates an example of displaying a 3D image using irregular pixels, according to an exemplary embodiment. Since the irregular pixel used on the right side of FIG. 6 is larger than a regular pixel, only one pixel's luminance is observed by the viewer's eye at the same position as in the left screen of FIG. 6, so crosstalk does not occur there. The crosstalk generated during 3D image display can therefore be reduced.
  • First, the depth layer divider 110 included in the 3D image display apparatus 100 divides the first 3D image to be displayed into a plurality of depth layers.
  • the dividing into depth layers may divide the 3D image into a plurality of depth layers using a depth peeling algorithm.
  • Specifically, each pixel included in the first 3D image is converted into a fragment; that is, a pixel with horizontal coordinate x and vertical coordinate y is converted into a fragment with horizontal coordinate x, vertical coordinate y, and depth coordinate z.
  • Next, the pixel determiner 120 included in the 3D image display apparatus 100 may determine the irregular pixels respectively corresponding to the depth layers divided by the depth layer divider 110.
  • the irregular pixel is a pixel block composed of a plurality of adjacent regular pixels or sub-pixels and may have a shape and size different from those of the regular pixels.
  • the determining of the irregular pixel may select at least one irregular pixel for each depth layer from a plurality of preset irregular pixels.
  • the setting of the depth plane and the method of determining the irregular pixels according to the depth plane to which the depth layer belongs are the same as those of FIG. 2 described above, and a detailed description thereof will be omitted.
  • The determining of the irregular pixels (730) may determine the irregular pixels of each depth layer according to the frequency characteristics (direction and magnitude of the frequency) of the first 3D image. Specifically, the determining of the irregular pixels (730) may further include extracting a contour detail feature from the first 3D image and determining the frequency direction and frequency magnitude of the contour detail feature. In this case, the irregular pixels respectively corresponding to the depth layers may be determined based on at least one of the direction of the contour detail feature corresponding to each depth layer, its frequency magnitude, and the depth plane to which each depth layer belongs.
  • Next, the 3D image generator 130 included in the 3D image display apparatus 100 may generate the second 3D images corresponding to the depth layers using the irregular pixels corresponding to the depth layers determined in step 730.
  • Specifically, a plurality of multi-view images are rendered using the irregular pixels respectively corresponding to the determined depth layers, based on multi-view image information, and the pixels of the rendered multi-view images may be rearranged to generate the second 3D images respectively corresponding to the depth layers.
  • the multi-view image information includes at least one of position information of a viewpoint and field angle information of a gaze.
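The pixel-rearrangement step above (interlacing the rendered multi-view images) can be sketched in its simplest form, cycling views across columns. Real systems map views per sub-pixel from the lens pitch and slant, which is omitted here; this generic sketch is not the patent's specific method:

```python
def interlace_views(views, width, height):
    """Build an interlaced image by cycling views across columns.

    views: list of view images, each a 2D list indexed [y][x].
    Column x of the output samples view x % len(views), the simplest
    vertical-lenticular assignment; each column is then refracted
    toward its own viewing direction by the lens array.
    """
    n = len(views)
    return [[views[x % n][y][x] for x in range(width)] for y in range(height)]

# Two 4x1 views: view 0 is all zeros, view 1 is all ones
v0 = [[0, 0, 0, 0]]
v1 = [[1, 1, 1, 1]]
interlaced = interlace_views([v0, v1], width=4, height=1)
```

The output alternates samples from the two views column by column, which is what lets the lens array send each view to a different direction.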
  • Finally, the 3D image synthesizer 140 included in the 3D image display apparatus 100 may synthesize the generated second 3D images to form the final 3D image.
  • Specifically, the front-to-back positional relationship in the depth direction is determined for the different parts of the plurality of second 3D images respectively corresponding to the depth layers, and the final 3D image may be formed by synthesizing the second 3D images in order from the deepest layer, covering from back to front, according to the determined relationship.
  • FIG. 8 is a graph illustrating MTF values obtained through a display test using different irregular pixels in the 3D image display apparatus according to an exemplary embodiment of the present invention.
  • In FIG. 8, one marked graph shows the MTF values obtained from a display using regular pixels, and the graphs marked with the other symbols show the MTF values obtained from displays using irregular pixels.
  • FIG. 9 shows an analysis result of comparing an MTF value obtained through a display test using a regular pixel with an MTF value obtained through a display test using an irregular pixel.
  • the test frequency is 0.097 cycles / mm.
  • the use of an irregular pixel according to an embodiment may further improve the resolution.
  • The 3D image display apparatus and method according to embodiments can resolve crosstalk errors, improve the resolution of the 3D display, increase the DOF of the displayed 3D image, and accelerate 3D image processing.
  • the manufacturing cost of the 3D image display device can be lowered.
  • The embodiments described above may be implemented as hardware components, software components, and/or combinations of hardware and software components.
  • The devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
  • The processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • other processing configurations are possible, such as parallel processors.
  • The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or command the processing device independently or collectively.
  • Software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, in order to be interpreted by the processing device or to provide instructions or data to it.
  • The software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer readable recording media.
  • the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and magneto-optical media such as floptical disks.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention relates to a 3D image display device and method. The 3D image display device divides a first 3D image into a plurality of depth layers, determines irregular pixels corresponding to the divided depth layers, generates second 3D images respectively corresponding to the depth layers using the corresponding irregular pixels, and synthesizes the generated images to obtain a final high-resolution 3D image.
PCT/KR2015/004822 2014-07-16 2015-05-14 3D image display device and method WO2016010246A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/326,594 US10666933B2 (en) 2014-07-16 2015-05-14 3D image display device and method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201410338749.7A CN105323573B (zh) 2014-07-16 3D image display device and method
CN201410338749.7 2014-07-16
KR10-2015-0061078 2015-04-30
KR1020150061078A KR102325296B1 (ko) 2014-07-16 2015-04-30 3D image display apparatus and method

Publications (1)

Publication Number Publication Date
WO2016010246A1 (fr) 2016-01-21

Family

ID=55078704

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/004822 WO2016010246A1 (fr) 2014-07-16 2015-05-14 3D image display device and method

Country Status (1)

Country Link
WO (1) WO2016010246A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112015357A (zh) * 2020-08-12 2020-12-01 浙江迅实科技有限公司 Method for producing a 3D stereoscopic picture, and product thereof
CN116095294A (zh) * 2023-04-10 2023-05-09 深圳臻像科技有限公司 Three-dimensional light-field image coding method and system rendering resolution according to depth value

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070171290A1 (en) * 2003-05-08 2007-07-26 Ronald Kroger Pixel patterns
KR20140004115A (ko) * 2011-01-07 2014-01-10 소니 컴퓨터 엔터테인먼트 아메리카 엘엘씨 Morphological anti-aliasing of a re-projection of a two-dimensional image
KR20140053721A (ko) * 2012-10-26 2014-05-08 한국과학기술원 Apparatus and method for adjusting the perceived depth of a stereo image
KR20140065894A (ko) * 2012-11-22 2014-05-30 삼성전자주식회사 Apparatus and method for processing a color image using a depth image
US20140192044A1 (en) * 2013-01-07 2014-07-10 Samsung Electronics Co., Ltd. Display apparatus and display method thereof


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112015357A (zh) * 2020-08-12 2020-12-01 浙江迅实科技有限公司 Method for producing a 3D stereoscopic picture, and product thereof
CN112015357B (zh) * 2020-08-12 2023-05-05 浙江迅实科技有限公司 Method for producing a 3D stereoscopic picture, and product thereof
CN116095294A (zh) * 2023-04-10 2023-05-09 深圳臻像科技有限公司 Three-dimensional light-field image coding method and system rendering resolution according to depth value

Similar Documents

Publication Publication Date Title
CN109561294B (zh) Method and device for rendering an image
RU2562759C2 (ru) Morphological anti-aliasing (MLAA) during re-projection of a two-dimensional image
KR102325296B1 (ko) 3D image display apparatus and method
KR102328128B1 (ko) Integral imaging display, manufacturing method thereof, and system including the same
WO2018004154A1 (fr) Mixed reality display device
WO2011112028A2 (fr) Method for generating a stereoscopic image, and device therefor
JP7180079B2 (ja) Circuit device and electronic apparatus
KR102401168B1 (ko) Method and apparatus for calibrating parameters of a three-dimensional display device
US9396579B2 (en) Method for visualizing three-dimensional images on a 3D display device and 3D display device
KR20170044953A (ko) Glasses-free 3D display device and control method thereof
US9886096B2 (en) Method and apparatus for processing three-dimensional (3D) object based on user interaction
US8619094B2 (en) Morphological anti-aliasing (MLAA) of a re-projection of a two-dimensional image
CN109725701A (zh) Display panel and device, image processing method and device, and virtual reality system
WO2016010246A1 (fr) 3D image display device and method
WO2019216528A1 (fr) Method for providing a virtual exhibition space using 2.5-dimensionalization
WO2013089369A1 (fr) Apparatus and method for measuring the perceived depth of a three-dimensional image
WO2011159085A2 (fr) Method and apparatus for ray tracing in a 3D image system
CN103620667A (zh) Method and device for generating an image using a color-field sequential display
WO2011087279A2 (fr) Stereoscopic image conversion method and stereoscopic image conversion device
KR20200039527A (ko) Display panel, and 3D display device and 3D HUD device using the same
KR20170031384A (ko) Optical layer, and display device and backlight unit including the same
WO2012173304A1 (fr) Graphics image processing device and method for converting a low-resolution graphics image into a high-resolution graphics image in real time
EP3467637A1 (fr) Method, apparatus and system for displaying an image
WO2016195167A1 (fr) Content conversion method, content conversion apparatus, and multilayer hologram generation program
KR101350641B1 (ko) Method and apparatus for evaluating a glasses-free three-dimensional image display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15821405

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 15326594

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 15821405

Country of ref document: EP

Kind code of ref document: A1