EP2761876A1 - Method and device for filtering a disparity map - Google Patents

Method and device for filtering a disparity map

Info

Publication number
EP2761876A1
Authority
EP
European Patent Office
Prior art keywords
disparity
pixels
value
values
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12775782.1A
Other languages
English (en)
French (fr)
Inventor
Cedric Thebault
Philippe Robert
Sylvain Thiebaud
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
InterDigital CE Patent Holdings SAS
Original Assignee
Thomson Licensing SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Publication of EP2761876A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details

Definitions

  • the present invention relates to a stereoscopic image and more particularly to a method and a device for filtering the disparity map associated with such an image.
  • 3D view synthesis, or 3D view interpolation, consists in creating a 3D rendering other than the one represented by the source stereoscopic image.
  • The 3D rendering can then be adapted to the chosen application or to the user.
  • the views needed to address multi-view screens are mostly recreated from a single stereoscopic image.
  • this synthesis of view is based on the source stereoscopic image and on a map of disparity.
  • the disparity map is generally associated with a right or left source image in the case of a stereoscopic image and provides for each pixel a datum representing the distance (in number of pixels) between the position of this image element in the right image and in the left image.
  • This view synthesis can also be based on a depth map because it is possible to return to a depth value from the disparity value, or vice versa, by introducing certain data, such as the size of the screen or the distance between the observer and the screen using an algorithm known to those skilled in the art.
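The conversion between disparity and depth mentioned above is not detailed in the text; a common similar-triangles relation used for stereoscopic displays can be sketched as follows (the function name, parameter names, and default values are illustrative assumptions, not taken from this document):

```python
def perceived_depth(disparity_px, screen_width_m, screen_width_px,
                    viewing_distance_m=3.0, interocular_m=0.065):
    """Convert an on-screen disparity (in pixels) to a perceived depth
    (in metres) via Z = D * e / (e - p): D is the viewing distance,
    e the interocular distance, p the disparity converted to metres.
    Zero disparity maps to the screen plane (Z == D)."""
    p = disparity_px * screen_width_m / screen_width_px
    return viewing_distance_m * interocular_m / (interocular_m - p)
```

With zero disparity the point is perceived on the screen plane, and positive (uncrossed) disparity pushes it behind the screen, which is why the screen size and the observer-to-screen distance must be known for the conversion.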
  • This disparity map is projected onto the view to be generated. After this projection, a modified disparity map corresponding to the desired view is obtained; each pixel of this modified disparity map indicates a disparity value which is then used to adapt the information of the right or left image in order to regenerate the view.
  • the disparity vector of each pixel to be interpolated must point to the same points of objects in the right and left images.
  • Document US 2010/0215251 describes a method and a device for processing a depth map.
  • the method consists in obtaining a depth map based on a compressed depth map including the depth information of a captured scene.
  • the scene has an object.
  • Occlusion information including information occluded by the object in the depth map is provided. At least a portion of the occlusion information is then used to reduce the distortions of the artifacts in the depth map.
  • This document therefore relates to an information processing method for the correction of a depth map related to an area obscured by an object.
  • Our invention aims to overcome a different problem, which concerns so-called transition pixels located on the edges of objects. If, for example, a scene contains two flat objects located at fixed and distinct depths, the disparity map associated with the elements of the right view or the left view should contain only two values: the disparity value of the first object and the distinct disparity value of the second object, since there are only two possibilities in the scene with two different disparities.
  • Since the disparity map contains only two disparity values, it is in this case easy to project it onto an intermediate view between the left view and the right view and thus to interpolate one of the two images.
  • The disparity value of the transition pixels, however, results from a mixture between the disparity value of the first object and that of the second object.
  • This problem of erroneous or outlier disparity values is present both in disparity maps obtained by a disparity map estimator and in disparity maps of CGI content.
  • A possible solution to the above-mentioned problem would be to use unfiltered disparity maps (that is, maps crenellated at the contours of the objects). Unfortunately, while this kind of map is conceivable for CGI content, disparity estimators will most often naturally generate intermediate values between different objects.
  • the present invention provides a method for overcoming the disadvantages mentioned.
  • the invention consists of a filtering method of a disparity map associated with one of the views of a stereoscopic image comprising at least one object partially covering an area obscured by this object.
  • the method comprises the steps of determining at least one transition zone around the object, identifying the pixels of this transition zone whose disparity value is considered as erroneous compared with the disparity values of the neighboring pixels.
  • This method therefore allows intelligent filtering that only filters out doubtful disparity values and thus preserves the majority of the disparity map.
  • The step of identifying pixels having an erroneous disparity value consists in identifying as erroneous any pixel whose disparity value is not between the disparity values of the neighboring pixels (x_{n-1}, x_{n+1}).
  • Alternatively, the step of identifying pixels having an erroneous disparity value consists in identifying as erroneous any pixel whose disparity value differs from the disparity values of the neighboring pixels (x_{n-1}, x_{n+1}) by more than a determined threshold value.
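The two identification criteria above can be sketched for one horizontal line of the disparity map (the function name and the threshold parameter are our own naming, not the patent's):

```python
def flag_erroneous(disp, threshold=2):
    """Flag pixels whose disparity is neither between its two direct
    horizontal neighbours nor within `threshold` of at least one of them.
    `disp` is one row of the disparity map; border pixels are never
    flagged because they have only one horizontal neighbour."""
    bad = [False] * len(disp)
    for n in range(1, len(disp) - 1):
        lo, hi = sorted((disp[n - 1], disp[n + 1]))
        not_between = not (lo <= disp[n] <= hi)
        far_from_both = (abs(disp[n] - disp[n - 1]) > threshold
                         and abs(disp[n] - disp[n + 1]) > threshold)
        bad[n] = not_between or far_from_both
    return bad
```

In the two-flat-objects example, a row such as [10, 10, 6, 2, 2] has its mixed transition value 6 flagged while both plateau values are preserved; vertical neighbours could be used in the same way.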
  • the step of correcting the erroneous disparity values of the pixels consists in replacing the disparity value of the pixels identified by a value calculated as a function of the disparities of the neighboring pixels.
  • the step of correcting the erroneous pixel disparity values consists in eliminating the disparity value of the identified pixels.
  • The step of correcting the disparity values of the pixels consists in determining to which of the disparity values of the neighboring pixels (x_{n-1}, x_{n+1}) the disparity value of the identified pixel is closest, then replacing the disparity value of the identified pixel by a replacement value corresponding to the value of the neighboring pixel whose disparity value is closest.
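This correction step can be sketched as follows, replacing each flagged pixel by the disparity of whichever direct neighbour is closest (the tie-break towards the left neighbour is an arbitrary choice of this sketch, not specified by the text):

```python
def snap_to_nearest(disp, bad):
    """Replace each flagged disparity with the value of the direct
    neighbour whose disparity is closest, so that transition pixels are
    reassigned to one of the two surrounding objects."""
    out = list(disp)
    for n in range(1, len(disp) - 1):
        if bad[n]:
            left_gap = abs(disp[n] - disp[n - 1])
            right_gap = abs(disp[n] - disp[n + 1])
            out[n] = disp[n - 1] if left_gap <= right_gap else disp[n + 1]
    return out
```

After this pass, a two-object scene again contains only the two object disparities, which is what makes the subsequent projection onto an intermediate view straightforward.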
  • the method comprises the additional step, after the
  • the invention also consists in a device for generating three-dimensional image display signals comprising at least one object partially covering a zone obscured by this object comprising a device for transmitting a first flow of data corresponding to one of the right or left images to be transmitted, and a second data stream corresponding to the disparity map associated with one of the right or left images to be transmitted and a device for receiving data streams for displaying three-dimensional images.
  • It comprises a device for filtering the disparity map comprising means for determining at least one transition zone around the object, and means for identifying the pixels (x_n) of this transition zone whose disparity value (d(x_n)) is considered erroneous by comparison with the disparity values of the neighboring pixels (x_{n-1}, x_{n+1}).
  • FIG. 1 represents a system for delivering 3D content and containing a device for filtering the elements of a disparity map according to the invention, either at the level of the signal generation device or at the level of the receiver.
  • FIG. 2 represents a block diagram explaining a filtering method according to the invention.
  • Figure 3 shows the disparity values of a disparity map of a system without a filtering device.
  • FIG. 4 represents the disparity values of a disparity map of a system with a filtering device according to the invention.
  • The system as represented in FIG. 1 is a device for generating display signals for at least one three-dimensional (3D) image. The transmission device comprises: a generation means 2 for two data streams representing the pixels, or picture elements, of the left and right images respectively; a means 3 for generating a data stream corresponding to the disparity information of the pixels of the right or left image, represented by a disparity map for each image; a selection means 4 for one of the data streams of the right or left image to be transmitted; and a transmission means 5 for the data stream corresponding to the selected right or left image and for the data stream corresponding to the disparity map.
  • A filtering device 6, associated with the means generating the data of the disparity map, filters the disparity values of the various elements of the disparity map, detecting erroneous values and replacing these values by other disparity values depending on the disparities of neighboring elements of the disparity map.
  • A data receiving device receives the two data streams, corresponding respectively to one of the right or left images and to the disparity map, and provides through an interface 7 the 3D image display signals to a screen or display device 8.
  • In a variant, the filtering device 6 of the disparity map is situated at the level of the signal receiver device and not at the level of the transmission device.
  • The method according to the invention consists in filtering the disparity map (filter 6) so as to detect, eliminate or replace the "dubious" disparity values.
  • The term "dubious" disparity value denotes a disparity value, among the pixels of the transition zones, that does not correspond to one of the disparity values of the neighboring pixels.
  • This method allows intelligent filtering that only filters out questionable disparity values and thus preserves the majority of the disparity map.
  • the disparity map represents sampled information. We must therefore solve the sampling problems. This is why the method according to the invention consists in filtering the disparity map to improve the contours of objects that suffer from this lack of sampling. This filtering method must first correctly identify the erroneous disparity values, and secondly correct these values.
  • The erroneous disparity values are intermediate values between actual disparity values, the actual values being those that correspond to depths where an object is located. But the disparity values can also be locally increasing or decreasing in the presence of a surface not parallel to the plane of the screen. The definition of the detection algorithm must take these different cases into account.
  • the method proposed here is associated with a study of pixels belonging to a horizontal line along the x-axis, but it would be possible to use vertical information as well.
  • The disparity values d(x_n) of pixels x_n are determined as erroneous if these values differ from the disparity values d(x_{n-1}) and d(x_{n+1}) of the two direct horizontal neighboring pixels x_{n-1} and x_{n+1} by a value greater than the difference between these two neighboring values.
  • Another criterion can be used to define what will be considered an erroneous disparity value.
  • One method consists in comparing the disparity value of a pixel with a threshold value determined as a function of the disparities of the neighboring pixels. Threshold values can thus be added. For example, it is possible to determine as erroneous any pixel whose disparity value (expressed in pixels) differs by more than 2 pixels from those of its two neighbors.
  • The determination of the erroneous disparity values is therefore done by comparing these values d(x_n) with the disparity values d(x_{n-2}), d(x_{n-1}), d(x_{n+1}) or d(x_{n-1}), d(x_{n+1}), d(x_{n+2}) of four other neighboring pixels x_{n-2}, x_{n-1}, x_{n+1}, x_{n+2}.
  • The first step 100 corresponds to identifying the pixels x_n having erroneous disparity values by comparison with the disparity values of the neighboring pixels x_{n-1} and x_{n+1}:
  • Steps 101 and 102 then make it possible to determine whether the disparity d(x_n) of the pixel x_n lies between the disparities d(x_{n-1}) and d(x_{n+1}) of the pixels x_{n-1} and x_{n+1}, by comparing, in step 101 and step 102, the disparities of the pixels: d(x_{n-1}) < d(x_n) < d(x_{n+1}) or d(x_{n-1}) > d(x_n) > d(x_{n+1}).
  • Step 103 corresponds to the case where the value of the disparity is not between those of the neighboring pixels. A correction of this value is necessary. This value can be replaced by the nearest disparity value or deleted.
  • In steps 104 and 105, the disparity gap between the pixels x_n and x_{n-1} is compared with that between x_n and x_{n+1}. These steps correspond to a comparison determining whether the disparity value of x_n is closer to that of x_{n-1} or to that of x_{n+1}. In the case of a pixel located on the border between two objects, this amounts to determining which object this pixel is closest to:
  • Step 106 makes it possible to take into account a variable (increasing or decreasing) disparity.
  • This step therefore makes it possible to determine a replacement disparity value for (x_n) corresponding to the maximum value of the sum of the disparity d(x_{n-1}) of the pixel x_{n-1} and the difference between the disparities d(x_{n-2}) and d(x_{n-1}) of the pixels x_{n-2} and x_{n-1}.
  • The replacement disparity values for (x_n) are determined taking into account a variable disparity for the objects corresponding to the adjacent pixels x_{n-2} and x_{n-1}, or x_{n+1} and x_{n+2}. Maximum values are determined if d(x_n) is closer to the larger of the two disparity values, and minimum values are determined if d(x_n) is closer to the smaller of the two disparity values.
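One possible reading of this min/max determination (our interpretation of step 106, not the patent's exact flowchart) linearly extrapolates the disparity slope of the neighbouring object to pixel x_n:

```python
def slope_extrapolated_bound(disp, n, from_left=True):
    """Linearly extrapolate the disparity slope of the neighbouring
    object to pixel x_n: d(x_{n-1}) + (d(x_{n-1}) - d(x_{n-2})) when
    coming from the left, and symmetrically d(x_{n+1}) + (d(x_{n+1}) -
    d(x_{n+2})) from the right.  For a surface not parallel to the
    screen this follows the local disparity gradient."""
    if from_left:
        return 2 * disp[n - 1] - disp[n - 2]
    return 2 * disp[n + 1] - disp[n + 2]
```

For a surface parallel to the screen the two neighbour disparities are equal and the bound reduces to the neighbour's own disparity, so the flat-object case is unchanged.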
  • If the erroneous disparity value of the pixel is not greater than the maximum value determined in step 106, the erroneous disparity value of the pixel x_n is replaced by the disparity value of the neighboring pixel.
  • Otherwise, the erroneous disparity value of the pixel x_n is replaced by the minimum or maximum value determined during the preceding step, or by the disparity value of the next pixel x_{n+1} or of the preceding pixel x_{n-1}.
  • The process described by this flowchart can be refined by taking into account large disparity changes or particular cases.
  • The last steps 110 to 116 therefore preferably assign a new value to the disparity value according to the preceding criteria. It is alternatively possible to replace this step by a deletion of the disparity value.
  • the filtering method proposed by the invention is symmetrical and therefore treats the right and left edges of the objects in the same way, although the problems related to these edges are different.
  • The examples concern the interpolation of the right edge of foreground objects from the disparity map associated with the left view, or the interpolation of the left edge of foreground objects from the disparity map associated with the right view, because the artifact problems there are the most significant.
  • the method provides processing on the right edge of the objects in the foreground for the interpolation from the disparity map associated with the left view. Similarly and according to another variant of the invention, the method provides processing on the left edge of the objects in the foreground for the interpolation from the disparity map associated with the right view.
  • Since this dissociation takes place in the first step (d(x_{n-1}) < d(x_n) < d(x_{n+1}) or d(x_{n-1}) > d(x_n) > d(x_{n+1})), it would suffice to retain only one of these conditions to pre-filter only one side.
  • FIG. 3 represents the disparity values of a disparity map of an image comprising two elements with different disparities, in a system without a filtering device. These two values are represented by a light gray and a dark gray.
  • FIG. 4 represents the disparity values of a disparity map of a system with a filtering device according to the invention.
  • the erroneous disparity values have been corrected and replaced by one of the two existing values according to the method according to the invention.
  • the disparity map has only two values.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
EP12775782.1A 2011-09-29 2012-09-27 Verfahren und vorrichtung zum filtern einer disparitätskarte Withdrawn EP2761876A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1158739 2011-09-29
PCT/FR2012/052195 WO2013045853A1 (fr) 2011-09-29 2012-09-27 Méthode et dispositif de filtrage d'une carte de disparité

Publications (1)

Publication Number Publication Date
EP2761876A1 true EP2761876A1 (de) 2014-08-06

Family

ID=47071367

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12775782.1A Withdrawn EP2761876A1 (de) 2011-09-29 2012-09-27 Verfahren und vorrichtung zum filtern einer disparitätskarte

Country Status (7)

Country Link
US (1) US9299154B2 (de)
EP (1) EP2761876A1 (de)
JP (1) JP2014534665A (de)
KR (1) KR20140069266A (de)
CN (1) CN103828355B (de)
BR (1) BR112014007263A2 (de)
WO (1) WO2013045853A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9818232B2 (en) * 2015-08-26 2017-11-14 Adobe Systems Incorporated Color-based depth smoothing of scanned 3D model to enhance geometry in 3D printing
JP6991700B2 (ja) 2016-04-28 2022-01-12 キヤノン株式会社 情報処理装置、情報処理方法、プログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010007285A1 (fr) * 2008-06-24 2010-01-21 France Telecom Procede et dispositif de remplissage des zones d'occultation d'une carte de profondeur ou de disparites estimee a partir d'au moins deux images
WO2010037512A1 (en) * 2008-10-02 2010-04-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Intermediate view synthesis and multi-view data signal extraction
US20110063420A1 (en) * 2009-09-11 2011-03-17 Tomonori Masuda Image processing apparatus

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3551467B2 (ja) * 1994-04-13 2004-08-04 松下電器産業株式会社 視差演算装置、視差演算方法及び画像合成装置
JP3769850B2 (ja) * 1996-12-26 2006-04-26 松下電器産業株式会社 中間視点画像生成方法および視差推定方法および画像伝送方法
US7015926B2 (en) 2004-06-28 2006-03-21 Microsoft Corporation System and process for generating a two-layer, 3D representation of a scene
KR101545008B1 (ko) 2007-06-26 2015-08-18 코닌클리케 필립스 엔.브이. 3d 비디오 신호를 인코딩하기 위한 방법 및 시스템, 동봉된 3d 비디오 신호, 3d 비디오 신호용 디코더에 대한 방법 및 시스템
CN101822068B (zh) * 2007-10-11 2012-05-30 皇家飞利浦电子股份有限公司 用于处理深度图的方法和设备
CA2704479C (en) 2007-11-09 2016-01-05 Thomson Licensing System and method for depth map extraction using region-based filtering
US8106924B2 (en) 2008-07-31 2012-01-31 Stmicroelectronics S.R.L. Method and system for video rendering, computer program product therefor
KR20110059790A (ko) 2008-09-25 2011-06-03 코닌클리케 필립스 일렉트로닉스 엔.브이. 3차원 이미지 데이터 처리
WO2010087751A1 (en) * 2009-01-27 2010-08-05 Telefonaktiebolaget Lm Ericsson (Publ) Depth and video co-processing
KR101590763B1 (ko) 2009-06-10 2016-02-02 삼성전자주식회사 Depth map 오브젝트의 영역 확장을 이용한 3d 영상 생성 장치 및 방법
US8933925B2 (en) 2009-06-15 2015-01-13 Microsoft Corporation Piecewise planar reconstruction of three-dimensional scenes
WO2011097306A1 (en) * 2010-02-04 2011-08-11 Sony Corporation 2d to 3d image conversion based on image content
US20110199469A1 (en) * 2010-02-15 2011-08-18 Gallagher Andrew C Detection and display of stereo images
US8428342B2 (en) * 2010-08-12 2013-04-23 At&T Intellectual Property I, L.P. Apparatus and method for providing three dimensional media content
US20140035909A1 (en) * 2011-01-20 2014-02-06 University Of Iowa Research Foundation Systems and methods for generating a three-dimensional shape from stereo color images
US8837816B2 (en) * 2011-09-27 2014-09-16 Mediatek Inc. Method and apparatus for generating final depth information related map that is reconstructed from coarse depth information related map through guided interpolation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010007285A1 (fr) * 2008-06-24 2010-01-21 France Telecom Procede et dispositif de remplissage des zones d'occultation d'une carte de profondeur ou de disparites estimee a partir d'au moins deux images
WO2010037512A1 (en) * 2008-10-02 2010-04-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Intermediate view synthesis and multi-view data signal extraction
US20110063420A1 (en) * 2009-09-11 2011-03-17 Tomonori Masuda Image processing apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LAI-MAN PO ET AL: "A new multidirectional extrapolation hole-filling method for Depth-Image-Based Rendering", IMAGE PROCESSING (ICIP), 2011 18TH IEEE INTERNATIONAL CONFERENCE ON, IEEE, 11 September 2011 (2011-09-11), pages 2589 - 2592, XP032080200, ISBN: 978-1-4577-1304-0, DOI: 10.1109/ICIP.2011.6116194 *
See also references of WO2013045853A1 *

Also Published As

Publication number Publication date
BR112014007263A2 (pt) 2017-03-28
WO2013045853A1 (fr) 2013-04-04
CN103828355B (zh) 2017-05-31
KR20140069266A (ko) 2014-06-09
US20140226899A1 (en) 2014-08-14
CN103828355A (zh) 2014-05-28
JP2014534665A (ja) 2014-12-18
US9299154B2 (en) 2016-03-29

Similar Documents

Publication Publication Date Title
US10948726B2 (en) IPD correction and reprojection for accurate mixed reality object placement
US9041709B2 (en) Saliency based disparity mapping
Chamaret et al. Adaptive 3D rendering based on region-of-interest
US8798160B2 (en) Method and apparatus for adjusting parallax in three-dimensional video
WO2010007285A1 (fr) Procede et dispositif de remplissage des zones d'occultation d'une carte de profondeur ou de disparites estimee a partir d'au moins deux images
EP2162794A1 (de) Verfahren und geräte zum herstellen und anzeigen von stereobildern mit farbfiltern
AU2011200146A1 (en) Method and apparatus for processing video games
EP3114831A1 (de) Optimierte videorauschunterdrückung für heterogenes multisensorsystem
FR3002104A1 (fr) Procede pour generer, transmettre et recevoir des images stereoscopiques, et dispositifs connexes
EP2469868B1 (de) Verfahren zur Korrektur der Hyperstereoskopie, und entsprechendes Visualisierungssystem mit Helm
WO2013045853A1 (fr) Méthode et dispositif de filtrage d'une carte de disparité
FR2873214A1 (fr) Procede et dispositif d'obtention d'un signal stereoscopique
EP2801075A1 (de) Bildverarbeitungsverfahren für eine bordkamera in einem fahrzeug und entsprechende verarbeitungsvorrichtung
FR2968108A1 (fr) Procede de reduction de la taille d’une image stereoscopique
US20130093754A1 (en) 3d image processing method with a reduced ghost effect and 3d display device thereof
US20130201186A1 (en) Film grain for stereoscopic or multi-view images
EP2629533B1 (de) Filmkorn für Stereoskopie- oder Mehrfachansichtsbilder
Guilluy Video stabilization: A synopsis of current challenges, methods and performance evaluation
FR2984665A1 (fr) Procede de reduction du crosstalk dans les images stereoscopiques.
WO2024194540A1 (fr) Procede de correction d'images
FR3028703A1 (fr) Procede de rendu stereoscopique d'une scene 3d sur un ecran, dispositif, systeme et programme d'ordinateur associes
WO2020260034A1 (fr) Procede et dispositif de traitement de donnees de video multi-vues
FR2888346A1 (fr) Procede et dispositif d'obtention d'une sequence d'images stereoscopiques a partir d'une sequence d'images monoscopiques
JP2010258886A (ja) 撮像装置、プログラム、撮像方法
WO2008142235A1 (fr) Procede de traitement d'image pour la synthese d'image autostereoscopique

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140424

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20160909

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 13/128 20140806ALI20181106BHEP

Ipc: H04N 13/00 20060101AFI20181106BHEP

Ipc: H04N 13/122 20140806ALI20181106BHEP

Ipc: H04N 13/111 20140806ALI20181106BHEP

INTG Intention to grant announced

Effective date: 20181206

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 13/122 20180101ALI20181106BHEP

Ipc: H04N 13/111 20180101ALI20181106BHEP

Ipc: H04N 13/128 20180101ALI20181106BHEP

Ipc: H04N 13/00 20180101AFI20181106BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 13/111 20180101ALI20181106BHEP

Ipc: H04N 13/122 20180101ALI20181106BHEP

Ipc: H04N 13/00 20180101AFI20181106BHEP

Ipc: H04N 13/128 20180101ALI20181106BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: INTERDIGITAL CE PATENT HOLDINGS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190417