WO2010018880A1 - Method and apparatus for real-time depth estimation from a single image - Google Patents

Method and apparatus for real-time depth estimation from a single image

Info

Publication number
WO2010018880A1
WO2010018880A1 (PCT/KR2008/004664)
Authority
WO
WIPO (PCT)
Prior art keywords
images
windows
local deviation
deviation
image
Prior art date
Application number
PCT/KR2008/004664
Other languages
English (en)
Inventor
Hong Jeong
Jihee Choi
Youngmin Ha
Original Assignee
Postech Academy-Industry Foundation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Postech Academy-Industry Foundation filed Critical Postech Academy-Industry Foundation
Priority to PCT/KR2008/004664 priority Critical patent/WO2010018880A1/fr
Publication of WO2010018880A1 publication Critical patent/WO2010018880A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/529Depth or shape recovery from texture

Definitions

  • the present invention relates to an apparatus and a method for estimating depths for pixels from a single two-dimensional image.
  • Camera lenses that produce an image with a shallow depth of field are used to estimate depths for respective pixels from a two-dimensional image. In such an image, focused portions are sharp while the remaining portions are blurred. Moreover, the farther a target object is from the focused object, the more blurred its image becomes. Accordingly, when the lens is focused on the object closest to the camera, how far each target object lies from that closest object, in terms of image depth, may be estimated by measuring the degree of blur in the images of the target objects. As a result, the depth from the center of the camera lens to a target object may be recognized for each pixel by adding the distance between the focused object and the camera lens to the distance value estimated from the blur degree of the target object's image.
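The additive relation described in this paragraph can be summarized in a one-line sketch. Everything here is illustrative: the patent does not define a `blur_to_offset` function, and a real lens would need its own calibration curve mapping a measured blur degree to a distance behind the focal plane.

```python
def estimate_pixel_depth(dist_lens_to_focus, blur_degree, blur_to_offset):
    """Depth from the lens centre to the target object at one pixel:
    the lens-to-focused-object distance plus an offset estimated from
    the measured blur degree. blur_to_offset is a hypothetical,
    lens-specific calibration function (an assumption of this sketch)."""
    return dist_lens_to_focus + blur_to_offset(blur_degree)

# Toy linear calibration, purely for illustration:
depth = estimate_pixel_depth(2.0, 0.5, lambda b: 4.0 * b)  # 2.0 + 4.0 * 0.5 = 4.0
```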
  • Such techniques are disclosed, for example, in prior-art Documents 1 to 3, in which depths for respective pixels are estimated from a single image with a shallow depth of field.
  • a method for estimating depths for pixels in a single two-dimensional image which includes: creating local deviation images by applying windows of different sizes to the two-dimensional image; creating equalized deviation images by applying windows of different sizes to the respective local deviation images and equalizing portions of the respective local deviation images that have different intensities; and creating a depth map using the equalized deviation images.
  • a depth estimation apparatus which includes: a local deviation image creation unit for creating local deviation images by applying windows of different sizes to a single two-dimensional image applied thereto; an intensity equalization unit for creating equalized deviation images by applying windows of different sizes to the local deviation images and equalizing portions of the local deviation images that have different intensities; and a depth map creation unit for creating a depth map using the equalized deviation images.
  • real-time depth estimation is possible by carrying out operations according to different sizes of windows in the process of estimating depths for respective pixels from a single two-dimensional image. Furthermore, no separate camera for production of a stereo image is necessary, and a real-time depth estimation may be easily employed in a general camera.
  • FIG. 1 is a block diagram schematically illustrating an apparatus for estimating depths for pixels in an image in accordance with an embodiment of the present invention.
  • FIG. 2 is a detailed block diagram illustrating the depth estimation apparatus illustrated in FIG. 1.

Best Mode for Carrying Out the Invention
  • the basic principle of depth estimation in the present invention is that deviations of pixel values are measured according to different sizes of the windows and then are compared with each other in the same image coordinates.
  • When the measured deviations indicate a sharply focused region, a target object at the corresponding image coordinate is considered to be close to a focused object.
  • When the measured deviations indicate a blurred region, a target object at the corresponding image coordinate is considered to be far away from a focused object.
  • the depth estimation apparatus of the present invention includes a local deviation image creation unit 12, an intensity equalization unit 14, and a depth map creation unit 16.
  • the local deviation image creation unit 12 obtains local deviation images according to different sizes of windows using a single two-dimensional image provided thereto.
  • the intensity equalization unit 14 equalizes the widths of the high-intensity portions in the respective deviation images provided from the local deviation image creation unit 12 to produce equalized deviation images.
  • the depth map creation unit 16 creates a depth map for determining depth values for respective pixels using the equalized images obtained by the intensity equalization unit.
  • All the components of the depth estimation apparatus of the present invention, such as the local deviation image creation unit 12, the intensity equalization unit 14, and the depth map creation unit 16, and all the processes carried out by these components, may be realized in hardware, in which a three-stage pipeline structure may be employed for parallel processing.
  • a single depth map is produced therefrom.
  • the size of the depth map produced by the depth estimation apparatus is the same as that of the single two-dimensional image, and each pixel value in the depth map reflects how far the target object containing that pixel is spaced apart from a focused object. If a pixel value in the depth map is small, the target object is near the focused object; on the contrary, if it is large, the target object is far away from the focused object.
  • the pixel values in the depth map are discrete and finite. Assuming the set of possible pixel values is 'A', the number of elements of the set 'A' can be adjusted; it is defined as the depth resolution and is denoted 'R'.
  • FIG. 2 shows a detailed block diagram of the depth estimation apparatus illustrated in FIG. 1.
  • the local deviation image creation unit 12 includes R-number of local deviation calculators 21, 23, ..., and 25.
  • the local deviation calculators 21, 23, ..., and 25 have different sizes of windows 22, 24, ..., and 26 allocated thereto and create local deviation images according to the sizes of windows from a single two-dimensional image, respectively.
  • the term 'local' indicates that a local deviation image is created not by obtaining the deviation on the intensities of the entire pixels in the two-dimensional image but by obtaining the deviation for pixels in a window applied to the two-dimensional image. The obtained deviation is applied to the respective pixels in the two-dimensional image, thereby producing a local deviation image.
  • the following rules are used to determine the sizes of windows used in the respective local deviation calculators. More particularly, the windows are sequentially allocated to the local deviation calculators from the one of the smallest size to the one of the largest size, respectively.
  • the size of a window for obtaining a local deviation image of i-th index is (i+2) by (i+2).
  • the range of the indices is {0, 1, 2, ..., (R-3), (R-2), (R-1)}.
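As a rough sketch of this step (an illustration, not the patented hardware implementation), a local deviation image can be computed as the per-pixel standard deviation over a sliding window, with window sizes following the (i+2)-by-(i+2) rule. `numpy` and edge padding at the image borders are assumptions of this sketch:

```python
import numpy as np

def local_deviation_image(img, win):
    """Standard deviation of pixel intensities inside a win x win window
    placed at each pixel, i.e. the deviation is taken over the window,
    not over the whole image. Borders are handled by edge padding."""
    img = np.asarray(img, dtype=float)
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = padded[y:y + win, x:x + win].std()
    return out

# Window-size rule from the text: the i-th local deviation calculator
# uses a window of (i+2) x (i+2), for i in {0, 1, ..., R-1}.
R = 4
windows = [i + 2 for i in range(R)]   # 2, 3, 4, 5
```

A constant image has zero local deviation everywhere, while intensity edges (sharp, in-focus detail) produce large local deviations, which is what makes the deviation a usable sharpness measure here.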
  • the local deviation images created by the local deviation image creating unit 12 are provided to the intensity equalization unit 14.
  • the intensity equalization unit 14 includes the same number of maximum filters 31, 33, ..., and 35 as the local deviation calculators 21, 23, ..., and 25.
  • the maximum filters 31, 33, ... and 35 have different sizes of windows 32, 34, ..., and 36 allocated thereto and equalize the widths with high intensities in the deviation images provided from the corresponding local deviation calculators 21, 23, ..., and 25, respectively.
  • the maximum filters 31, 33, ..., 35 are used to equalize the widths of the portions with high intensities in the local deviation images, respectively.
  • the maximum filters select the maximum values of the pixels in the windows applied to the local deviation images, so that portions with high intensities in the local deviation images are made thicker.
  • the degree by which a high-intensity portion of a local deviation image becomes thicker in width can be adjusted by making the sizes of the windows different from each other.
  • When the size of the window used in a maximum filter is small, a high-intensity portion of a local deviation image becomes thicker only to a very small degree.
  • When the size of the window used in a maximum filter is large, a high-intensity portion of a local deviation image becomes thicker to a greater degree.
  • a rule is applied when the sizes of windows used in maximum filters are determined.
  • the windows are applied sequentially from the largest size to the smallest size, in the opposite order to the allocation of window sizes in the deviation calculators.
  • the window size having an i-th index for obtaining an equalized deviation image from a maximum filter is (R-i) by (R-i).
  • the range of the indices is {0, 1, 2, ..., (R-3), (R-2), (R-1)}.
  • For example, when R is 10, the local deviation image to which a 2 by 2 window is applied is equalized by a maximum filter having a window size of 10 by 10. Accordingly, the equalized images are obtained by applying local deviations and maximum filtering to a single two-dimensional image.
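The maximum filtering described above can be sketched as a plain sliding-window maximum (again only an illustration; the patent does not prescribe this implementation). With R = 10, the window-pairing rule reproduces the 2-by-2 / 10-by-10 example from the text:

```python
import numpy as np

def maximum_filter(img, win):
    """Replace each pixel by the maximum over a win x win neighbourhood,
    which thickens the high-intensity portions of a local deviation image."""
    img = np.asarray(img, dtype=float)
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = padded[y:y + win, x:x + win].max()
    return out

# Pairing rule: deviation window (i+2) x (i+2) <-> max-filter window (R-i) x (R-i)
R = 10
deviation_windows = [i + 2 for i in range(R)]   # 2, 3, ..., 11
filter_windows = [R - i for i in range(R)]      # 10, 9, ..., 1

# A single bright pixel grows into a 3 x 3 patch under a 3 x 3 maximum filter:
spike = np.zeros((5, 5))
spike[2, 2] = 1.0
thick = maximum_filter(spike, 3)
```

The i = 0 pair (2-by-2 deviation window, 10-by-10 maximum filter) corresponds to the example in the text: the smallest deviation window is thickened the most, so that all equalized images have comparably wide high-intensity regions.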
  • the equalized deviation images created by the maximum filters are provided to the depth map creating unit 16.
  • the depth map creating unit 16 creates a depth map by using the equalized deviation images provided from the intensity equalization unit 14.
  • the intensities of pixels represent absolute distances between points on an object and a focal plane, respectively.
  • the depth map is obtained by comparing the intensities of pixels in the equalized deviation images at the same image coordinates.
  • the depth map creation unit 16 compares the intensities of the pixels, selects the largest value among them, and determines the index at which that largest value occurs.
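This per-pixel selection amounts to an argmax across the stack of equalized deviation images; a minimal `numpy` sketch (illustrative only):

```python
import numpy as np

def depth_map_argmax(equalized):
    """equalized: list of R equalized deviation images of equal shape.
    For each image coordinate, the depth label is the index of the
    image with the largest intensity at that coordinate."""
    stack = np.stack(equalized, axis=0)   # shape (R, H, W)
    return np.argmax(stack, axis=0)

# Three tiny 2 x 2 "equalized images": each pixel's label is the
# index of the image that is brightest at that pixel.
a = np.array([[1.0, 0.0], [0.0, 0.0]])
b = np.array([[0.0, 2.0], [0.0, 0.0]])
c = np.array([[0.0, 0.0], [3.0, 0.0]])
labels = depth_map_argmax([a, b, c])
```

Ties resolve to the lowest index with `np.argmax`, an arbitrary choice of this sketch.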
  • the embodiment of the present invention employs a belief propagation algorithm in the depth map creation unit 16 to address the disadvantages of such simple per-pixel selection.
  • the belief propagation algorithm enables acquisition of a depth map by using the plurality of equalized images obtained by the intensity equalization unit 14.
  • the process of obtaining a depth map in the depth map creation unit 16 is the same as that described above; that is, the depth map is likewise obtained by comparing the intensities of the equalized images at the same image coordinates.
  • the belief propagation algorithm allows adjacent pixels to have similar depth map values to thereby obtain a more accurate depth map.
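One conventional way to realize this smoothing, sketched here as min-sum loopy belief propagation on the 4-connected pixel grid, is shown below. The specific costs (negated equalized intensity as the data cost, an absolute label difference weighted by `lam` as the pairwise cost) and the message schedule are assumptions of this sketch, not terms taken from the patent:

```python
import numpy as np

def bp_depth(equalized, iters=10, lam=1.0):
    """Min-sum loopy belief propagation over depth labels 0..R-1.
    Data cost: negated equalized intensity (large intensity = preferred).
    Pairwise cost: lam * |label_p - label_q| between 4-neighbours."""
    cost = -np.stack([np.asarray(e, dtype=float) for e in equalized], axis=-1)
    H, W, R = cost.shape
    labels = np.arange(R)
    pair = lam * np.abs(labels[:, None] - labels[None, :])  # (R, R)
    # msgs[d] = message each pixel currently receives from direction d:
    # 0 = from above, 1 = from below, 2 = from left, 3 = from right
    msgs = np.zeros((4, H, W, R))
    for _ in range(iters):
        belief = cost + msgs.sum(axis=0)
        def outgoing(exclude):
            # message a pixel sends, excluding what it heard from the target
            b = belief - msgs[exclude]                 # (H, W, R)
            m = (b[..., :, None] + pair).min(axis=-2)  # minimize over sender label
            return m - m.min(axis=-1, keepdims=True)   # normalize for stability
        new = np.zeros_like(msgs)
        new[0][1:, :] = outgoing(1)[:-1, :]    # sent downward  -> "from above"
        new[1][:-1, :] = outgoing(0)[1:, :]    # sent upward    -> "from below"
        new[2][:, 1:] = outgoing(3)[:, :-1]    # sent rightward -> "from left"
        new[3][:, :-1] = outgoing(2)[:, 1:]    # sent leftward  -> "from right"
        msgs = new
    return np.argmin(cost + msgs.sum(axis=0), axis=-1)

# Toy check: the centre pixel's raw argmax disagrees with all its neighbours.
e0 = np.full((3, 3), 2.0)
e0[1, 1] = 0.0                 # label 0 favoured everywhere but the centre
e1 = np.zeros((3, 3))
e1[1, 1] = 2.0                 # label 1 favoured only at the centre
depth = bp_depth([e0, e1], iters=10, lam=1.0)
```

In the toy check, a plain per-pixel argmax would assign label 1 to the centre pixel, but the messages from its four neighbours pull it to label 0, illustrating how belief propagation drives adjacent pixels toward similar depth values.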
  • depths for respective pixels can be estimated from a single two-dimensional image, and depth information can be obtained by processing depth estimating operations in parallel.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A method for estimating depths for pixels in a single two-dimensional image comprises creating local deviation images by applying windows of different sizes to the two-dimensional image, and creating equalized images by applying windows of different sizes to the respective local deviation images and equalizing the portions of the respective local deviation images that have different intensities. A depth map is then created using the equalized deviation images.
PCT/KR2008/004664 2008-08-11 2008-08-11 Method and apparatus for real-time depth estimation from a single image WO2010018880A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2008/004664 WO2010018880A1 (fr) 2008-08-11 2008-08-11 Method and apparatus for real-time depth estimation from a single image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2008/004664 WO2010018880A1 (fr) 2008-08-11 2008-08-11 Method and apparatus for real-time depth estimation from a single image

Publications (1)

Publication Number Publication Date
WO2010018880A1 (fr) 2010-02-18

Family

ID=41669015

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/004664 WO2010018880A1 (fr) 2008-08-11 2008-08-11 Method and apparatus for real-time depth estimation from a single image

Country Status (1)

Country Link
WO (1) WO2010018880A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020110273A1 (en) * 1997-07-29 2002-08-15 U.S. Philips Corporation Method of reconstruction of tridimensional scenes and corresponding reconstruction device and decoding system
US20070019883A1 (en) * 2005-07-19 2007-01-25 Wong Earl Q Method for creating a depth map for auto focus using an all-in-focus picture and two-dimensional scale space matching
US20070024614A1 (en) * 2005-07-26 2007-02-01 Tam Wa J Generating a depth map from a two-dimensional source image for stereoscopic and multiview imaging
WO2008016882A2 * 2006-08-01 2008-02-07 Qualcomm Incorporated Real-time capturing and generation of stereo images and videos with a monoscopic low-power mobile device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8743180B2 (en) 2011-06-28 2014-06-03 Cyberlink Corp. Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data
US9077963B2 (en) 2011-06-28 2015-07-07 Cyberlink Corp. Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data
EP2747028A1 2012-12-18 2014-06-25 Universitat Pompeu Fabra Method for recovering a relative depth map from a single image or a sequence of still images

Similar Documents

Publication Publication Date Title
Mishiba Fast depth estimation for light field cameras
Pertuz et al. Generation of all-in-focus images by noise-robust selective fusion of limited depth-of-field images
EP3582488B1 (fr) Mise au point automatique de caméra stéréoscopique
CN103003665B Stereoscopic distance measurement device
CN112819772A High-precision and fast pattern detection and recognition method
KR100911814B1 Apparatus for removing stereo image matching errors and removal method using the same
CN116309757B Binocular stereo matching method based on machine vision
EP2926558B1 Method and system for computing extended depth of field for microscopic images
US20110128282A1 (en) Method for Generating the Depth of a Stereo Image
Mutahira et al. Focus measurement in color space for shape from focus systems
Jang et al. Optimizing image focus for shape from focus through locally weighted non-parametric regression
Jang et al. Removal of non-gaussian jitter noise for shape from focus through improved maximum correntropy criterion kalman filter
CN111179333B Defocus blur kernel estimation method based on binocular stereo vision
CN115631223A Multi-view stereo reconstruction method based on adaptive learning and aggregation
EP3963546A1 Learnable cost volume for determining pixel correspondences
CN105243673A Block-matching-based motion estimation method, motion estimation system, and applications thereof
WO2010018880A1 Method and apparatus for real-time depth estimation from a single image
Hao et al. Improving the performances of autofocus based on adaptive retina-like sampling model
CN104754316A 3D imaging method, device and imaging system
Tung et al. Multiple depth layers and all-in-focus image generations by blurring and deblurring operations
CN113344988B Stereo matching method, terminal and storage medium
Tung et al. Depth extraction from a single image and its application
JP2018133064A Image processing device, imaging device, image processing method, and image processing program
JP2018081378A Image processing device, imaging device, image processing method, and image processing program
CN109360161B Multispectral image deblurring method based on gradient-domain priors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08793178

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08793178

Country of ref document: EP

Kind code of ref document: A1