CN111179333A - Defocus blur kernel estimation method based on binocular stereo vision - Google Patents

Defocus blur kernel estimation method based on binocular stereo vision

Info

Publication number
CN111179333A
CN111179333A (application CN201911247805.5A)
Authority
CN
China
Prior art keywords
fuzzy
image
kernel
binocular
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911247805.5A
Other languages
Chinese (zh)
Other versions
CN111179333B (en)
Inventor
陈则津
葛宝臻
陈雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201911247805.5A priority Critical patent/CN111179333B/en
Publication of CN111179333A publication Critical patent/CN111179333A/en
Application granted granted Critical
Publication of CN111179333B publication Critical patent/CN111179333B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20228Disparity calculation for image-based rendering

Abstract

The invention relates to a defocus blur kernel estimation method based on binocular stereo vision. The method comprises the following steps: calculating an initial blur kernel; obtaining equivalent blurred images; stereo matching calculation: combining the gray-level information and the blur kernel information as the data term of the energy function of a global matching method, and performing stereo matching on the equivalent blurred images by optimizing the energy function to obtain a dense disparity map; disparity post-processing: applying weighted mean filtering to the dense disparity map; and calculating the final blur kernel: taking the post-processed disparity map as the initial disparity map and performing the blur kernel calculation again to obtain the final blur kernel. The method fully exploits the relationship between disparity and the blur kernel in binocular stereo vision, improves the accuracy of blur kernel calculation by using the binocular image pair, and is suitable for the problem of defocus blur kernel estimation based on binocular stereo vision.

Description

Defocus blur kernel estimation method based on binocular stereo vision
Technical Field
The invention relates to a defocus blur kernel estimation method, and in particular to defocus blur kernel estimation based on binocular stereo vision.
Background
During image acquisition, images degrade to varying degrees because of imperfections of the imaging system and interference from the external environment. Defocus blur is a common form of degradation and refers to image blur caused by limited depth of field or focusing errors of the imaging system. In long-distance imaging, atmospheric turbulence also introduces defocus blur. For these reasons, sharp images are difficult to acquire, the two views of a binocular image pair suffer defocus blur of different degrees, and the accuracy of three-dimensional reconstruction is reduced. Whether a deblurring algorithm or a stereo matching algorithm for blurred images is adopted, estimating the blur kernel remains a key problem in the field of blurred image processing.
Most existing blur kernel estimation methods work on a single image or on an image sequence captured from a single viewpoint: they extract edge texture information from the image, compute the blur kernel for the pixels in edge regions, and then propagate these estimates to obtain a blur kernel map of the whole image. On this basis, Tang et al. proposed a method that computes the blur kernel from spectral contrast: since defocus blur attenuates the spectral amplitude of object edge regions, a blur kernel map can be computed from the relationship between the blur degree of a re-blurred image and its spectrum[2]. Such methods are only suitable for defocus blur kernel estimation on monocular images and cannot effectively exploit the additional information provided by a binocular image pair.
Disclosure of Invention
To overcome the defects of the prior art, the invention provides a defocus blur kernel estimation method based on binocular stereo vision, aimed at the blur kernel estimation problem of binocular degraded images.
The technical scheme of the invention is described in detail as follows:
the defocus blur kernel estimation method based on binocular stereo vision comprises the following five steps:
Step one, calculating an initial blur kernel: performing stereo matching on the binocular images to obtain an initial disparity map, and calculating the initial blur kernels of the binocular images from the disparity values of the edge pixels;
Step two, obtaining the equivalent blurred images: calculating a relative blur kernel from the initial blur kernels of the binocular images, and performing adaptive Gaussian filtering on the binocular images to obtain the equivalent blurred images;
Step three, stereo matching calculation: combining the gray-level information and the blur kernel information as the data term of the energy function of a global matching method, and performing stereo matching on the equivalent blurred images by optimizing the energy function to obtain a dense disparity map;
Step four, disparity post-processing: applying weighted mean filtering to the dense disparity map obtained in step three;
Step five, calculating the final blur kernel: taking the post-processed disparity map as the initial disparity map and performing the blur kernel calculation again to obtain the final blur kernel.
The initial blur kernel calculation step specifically comprises:
Calculating, from the initial disparity map of the binocular images, the average disparity value d_k of the edge pixels with a K-means clustering algorithm, where k = 1, 2 denotes the left and right images respectively, and then calculating the initial blur kernel σ_k corresponding to each pixel of the binocular images from the camera calibration parameters and the average disparity d_k.
The step of obtaining the equivalent blurred images specifically comprises:
Calculating the relative blur kernel from the initial blur kernel of each pixel of the binocular images:
Δσ(p) = √( |σ_1(p)² − σ_2(q)²| )
where σ_1(p) and σ_2(q) denote the initial blur kernels of the left and right images at a pair of corresponding pixels p and q. According to the relative blur kernel Δσ(p), Gaussian filtering is applied to whichever of the two corresponding regions of the binocular images has the smaller blur kernel, with the relative blur kernel Δσ(p) used as the standard-deviation parameter of the Gaussian filtering, so that equivalent blurred binocular images whose corresponding regions have similar degrees of blur are obtained.
The global matching method specifically comprises:
The global matching method is a basic method in stereo matching that performs matching by optimizing an energy function. The energy function consists of a data term and a smoothness term: the weighted sum of the relative blur kernel information and the pixel gray-level information of the binocular images is used as the data term, and the pixel gray-level information is used as the smoothness term. A graph model is constructed from the equivalent blurred binocular images and the energy function, the disparity values are treated as labels, and an optimal label is selected for each pixel by minimizing the global energy function, thereby obtaining a dense disparity map.
The weighted mean filtering specifically comprises:
A support window is selected in the disparity map, the difference between the blur kernel of each pixel in the window and that of the central pixel is used as a weight, and the weighted average of the disparities of all pixels in the window is taken as the disparity value of the central pixel, yielding the final disparity map.
The invention has the following beneficial effects:
1. The method fully considers the relationship between the blur kernel and the disparity, and improves the accuracy of blur kernel calculation through the disparity constraint of binocular stereo vision. The method has a clear principle, produces more accurate and reliable results, and provides a new approach to computing the defocus blur kernel in binocular stereo vision.
2. In the blur kernel calculation, the method uses the K-means algorithm and a weighted mean filtering post-processing step, which select more reliable supporting points, suppress the interference of irrelevant points, and improve the accuracy of the blur kernel calculation.
Drawings
FIG. 1 is a flow chart of a defocus blur kernel estimation method based on binocular stereo vision according to the present invention;
FIG. 2 is a defocused binocular image used in an example of the present invention, wherein (a) is a left view and (b) is a right view;
FIG. 3 shows the output results of the present invention, wherein (a) and (b) are the left and right views of the equivalent blurred images obtained in step 2, and (c) is the final blur kernel map obtained by the present invention.
Detailed Description
In order to make the technical scheme of the present invention clearer, the following will further describe a specific implementation method of the present invention with reference to the accompanying drawings.
The following explains the specific processing procedure of the defocus blur kernel estimation method based on binocular stereo vision, taking the defocused binocular images shown in FIG. 2(a) and FIG. 2(b) as an example:
1. Initial blur kernel calculation:
The defocused binocular images shown in FIG. 2(a) and FIG. 2(b) are denoted I_1 and I_2, respectively. A disparity map P_0 of the binocular images is computed with an SAD-based local matching algorithm, edge pixels of the binocular images are extracted with the Canny edge detection algorithm, and the disparity values of the edge pixels are collected into a set Ψ_k, where k = 1 and 2 denote the left and right images respectively. The average disparity value d_k of the edge-pixel set Ψ_k is computed with a K-means clustering algorithm, and the initial blur kernel corresponding to each pixel of the binocular images is then computed from the entrance pupil diameter D_k of the imaging system, the system baseline B, the disparity value d of the pixel in the initial disparity map and the edge-pixel disparity d_k, forming the initial blur kernel map. According to the calibration information of the images, the parameters are set to D_1 = D_2 = 5.6 and B = 3080.117.
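The following Python/OpenCV sketch illustrates this step under stated assumptions: the block-matching and Canny parameters are arbitrary choices, the left disparity map is reused for both views for brevity, and, because the explicit blur-kernel formula is not reproduced in this text, the mapping from disparity to σ_k is a hypothetical helper sigma_from_disparity parameterized by D_k, B and d_k.

```python
import cv2
import numpy as np

def initial_blur_kernels(left_gray, right_gray, D=(5.6, 5.6), B=3080.117,
                         disparity=None, sigma_from_disparity=None):
    """Step 1 sketch: initial disparity map and per-pixel blur-kernel maps sigma_k."""
    if disparity is None:
        # SAD-based local (block) matching for the initial disparity map P0
        matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    if sigma_from_disparity is None:
        # Hypothetical stand-in for the patent's formula: blur radius grows with |d - d_k|
        sigma_from_disparity = lambda d, d_k, D_k, b: D_k * np.abs(d - d_k) / (2.0 * b)

    sigmas = []
    for k, img in enumerate((left_gray, right_gray)):
        # Canny edges; collect the disparity values of the edge pixels (set Psi_k)
        edges = cv2.Canny(img, 50, 150)
        edge_disp = disparity[(edges > 0) & (disparity > 0)].reshape(-1, 1)

        # K-means clustering of the edge-pixel disparities; the dominant cluster
        # centre is used as the average edge disparity d_k (rejects outliers)
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 1e-3)
        _, labels, centers = cv2.kmeans(edge_disp, 2, None, criteria, 5,
                                        cv2.KMEANS_PP_CENTERS)
        d_k = float(centers[np.bincount(labels.ravel()).argmax()])

        # Per-pixel initial blur kernel map for view k
        sigmas.append(sigma_from_disparity(disparity, d_k, D[k], B))
    return disparity, sigmas[0], sigmas[1]
```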
2. Obtaining the equivalent blurred images:
The relative blur kernel is calculated from the initial blur kernel of each pixel obtained in the previous step:
Δσ(p) = √( |σ_1(p)² − σ_2(q)²| )
where σ_1(p) and σ_2(q) denote the blur kernels of the corresponding pixels in the left and right images, respectively. According to the relative blur kernel, Gaussian filtering G_Δσ is applied to whichever of the two corresponding regions of the binocular images has the smaller blur kernel, giving the equivalent blurred images:
Ĩ_1(p) = I_1(p) ⊗ G_Δσ, if σ_1(p) < σ_2(q)
Ĩ_2(q) = I_2(q) ⊗ G_Δσ, if σ_2(q) < σ_1(p)
where I_1(p) and I_2(p) denote the corresponding pixels of the original left and right images, Ĩ_1 and Ĩ_2 denote the corresponding pixels of the equivalent blurred images, G_Δσ denotes a Gaussian point spread function with standard deviation Δσ, and ⊗ denotes the convolution operation. Adaptive blur compensation is applied to the binocular images I_1 and I_2 in this way, yielding the equivalent blurred images Ĩ_1 and Ĩ_2 shown in FIG. 3(a) and FIG. 3(b).
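A sketch of the blur-equalization step follows. It assumes Gaussian defocus kernels, so that Δσ = √(|σ_1² − σ_2²|), approximates the per-pixel Gaussian filtering with a small bank of uniformly blurred copies, and, for brevity, compares σ_1 and σ_2 at the same pixel coordinates rather than at disparity-shifted correspondences.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def equalize_blur(left, right, sigma_l, sigma_r, n_levels=16):
    """Step 2 sketch: equalize the blur of corresponding regions of the two views.

    The sharper view at each pixel is filtered with a Gaussian whose standard
    deviation is the relative blur kernel delta = sqrt(|sigma_l^2 - sigma_r^2|).
    """
    delta = np.sqrt(np.abs(sigma_l ** 2 - sigma_r ** 2))

    def adaptive_blur(img, per_pixel_sigma, needs_blur):
        out = img.astype(np.float32).copy()
        levels = np.linspace(per_pixel_sigma.min(), per_pixel_sigma.max(), n_levels)
        half_bin = (levels[1] - levels[0]) / 2.0 + 1e-6
        for s in levels:
            if s <= 0:
                continue
            blurred = gaussian_filter(img.astype(np.float32), sigma=float(s))
            # assign the uniformly blurred copy to pixels whose required sigma is
            # closest to this quantization level and that need extra blur
            sel = needs_blur & (np.abs(per_pixel_sigma - s) <= half_bin)
            out[sel] = blurred[sel]
        return out

    eq_left = adaptive_blur(left, delta, sigma_l < sigma_r)    # left is the sharper view here
    eq_right = adaptive_blur(right, delta, sigma_r < sigma_l)  # right is the sharper view here
    return eq_left, eq_right
```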
3. Stereo matching calculation:
A global matching method is used to perform stereo matching on the equivalent blurred images Ĩ_1 and Ĩ_2 by optimizing an energy function, obtaining the dense disparity map P_1.
In the global matching algorithm, a graph model is constructed from the images and the energy function, and the disparity values are treated as labels. The network graph G consists of a node set V and an edge set E; a cut partitions G into a set S containing the source node s_0 and a set T containing the sink node t_0, and the sum of the capacities of the edges that are cut is called the cut capacity. The minimum cut is obtained by computing the maximum flow of the network, which optimizes the global energy function and selects an optimal label for each pixel, yielding the dense disparity map.
The energy function of the global matching algorithm consists of a data term and a smoothness term: the blur kernel information and the pixel gray-level information of the binocular images are combined into the data term E_data(f), the pixel gray-level information is used as the smoothness term E_smoothness(f), and the disparity map P_1 is obtained by calculating the minimum of the energy function.
The data term E_data(f) is defined as E_data(f) = Σ D(a), where the per-pixel cost D(a) combines the gray-level difference and the relative blur-kernel difference of candidate correspondences, α is the scaling factor that adjusts the weight between the two cues, and T is the threshold applied to the data cost.
The smoothness term E_smoothness(f) penalizes disparity differences between neighbouring pixels, with the penalty modulated by the gray-level difference between them; λ is the gray-level threshold parameter.
In this embodiment, the algorithm parameters are set to α = 0.15, T = 30, and λ = 5.
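As an illustration of how the gray-level and blur-kernel cues can be combined in the data term, the sketch below builds a truncated cost volume over disparity labels; the specific cost expression and the winner-take-all step used here are assumptions, standing in for the patent's graph-cut (max-flow) minimization of the full energy.

```python
import numpy as np

def matching_cost_volume(eq_left, eq_right, sigma_l, sigma_r,
                         max_disp=64, alpha=0.15, T=30.0):
    """Step 3 sketch (data term only): cost volume mixing gray and blur-kernel cues.

    D(p, d) = min(alpha*|I1(p) - I2(p-d)| + (1-alpha)*|s1(p) - s2(p-d)|, T)
    is an assumed form of the per-pixel data cost with scaling factor alpha
    and threshold T.
    """
    eq_left = eq_left.astype(np.float32)
    eq_right = eq_right.astype(np.float32)
    h, w = eq_left.shape
    cost = np.full((h, w, max_disp), T, dtype=np.float32)
    for d in range(max_disp):
        if d == 0:
            gray_diff = np.abs(eq_left - eq_right)
            blur_diff = np.abs(sigma_l - sigma_r)
        else:
            gray_diff = np.abs(eq_left[:, d:] - eq_right[:, :-d])
            blur_diff = np.abs(sigma_l[:, d:] - sigma_r[:, :-d])
        cost[:, d:, d] = np.minimum(alpha * gray_diff + (1.0 - alpha) * blur_diff, T)
    return cost

def wta_disparity(cost):
    """Winner-take-all labels: a crude local substitute for global graph-cut optimization."""
    return cost.argmin(axis=2).astype(np.float32)
```

In the patented method the full energy, including the smoothness term with gray-level threshold λ, is minimized globally over the graph model by the max-flow/min-cut computation described above, rather than by this per-pixel winner-take-all.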
4. Disparity post-processing:
The disparity map P_1 is optimized with a weighted mean filtering post-processing method. A support window is selected in the disparity map P_1, the difference between the blur kernel of each pixel in the window and that of the central pixel is used as the weight ω_pq = exp[−(σ_p − σ_q)²], and the weighted average of the disparities of all pixels in the window,
d(p) = Σ_{q∈N(p)} ω_pq · d_q / Σ_{q∈N(p)} ω_pq,
is taken as the disparity value of the central pixel, yielding the disparity map P_2. Here σ_p and σ_q denote the blur kernels of the central pixel and of a supporting pixel, respectively, N(p) denotes the support window of the central pixel p, and d_q denotes the disparity value of pixel q within the support window.
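A direct, unoptimized sketch of this blur-kernel-guided weighted mean filter is given below; the square window radius is an arbitrary choice.

```python
import numpy as np

def weighted_mean_filter(disparity, sigma, radius=5):
    """Step 4 sketch: weighted mean filtering of the disparity map.

    Each pixel's disparity is replaced by the weighted average of the disparities in
    its support window, with weights w_pq = exp(-(sigma_p - sigma_q)^2) based on the
    blur-kernel difference to the central pixel. Plain Python loops; illustrative only.
    """
    h, w = disparity.shape
    out = disparity.astype(np.float32).copy()
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            win_d = disparity[y0:y1, x0:x1]
            win_s = sigma[y0:y1, x0:x1]
            weights = np.exp(-(sigma[y, x] - win_s) ** 2)
            out[y, x] = np.sum(weights * win_d) / np.sum(weights)
    return out
```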
5. Final blur kernel calculation:
The post-processed disparity map P_2 is taken as the initial disparity map, the blur kernel calculation of step 1 is repeated, and the recalculated blur kernel is taken as the final blur kernel; the obtained blur kernel map is shown in FIG. 3(c).
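Putting the sketches above together, the five steps could be driven as follows; every function name and parameter refers to the illustrative sketches given in the preceding steps and is an assumption, not the patented implementation.

```python
import cv2

def estimate_defocus_blur_kernels(left_path, right_path):
    """End-to-end sketch of the five steps, using the helper sketches defined above."""
    left = cv2.imread(left_path, cv2.IMREAD_GRAYSCALE)
    right = cv2.imread(right_path, cv2.IMREAD_GRAYSCALE)

    # Step 1: initial disparity map P0 and initial blur kernels
    _, sig_l, sig_r = initial_blur_kernels(left, right)
    # Step 2: equivalent blurred images
    eq_l, eq_r = equalize_blur(left, right, sig_l, sig_r)
    # Step 3: matching on the equivalent blurred images (WTA stand-in for graph cuts)
    disp1 = wta_disparity(matching_cost_volume(eq_l, eq_r, sig_l, sig_r))
    # Step 4: blur-kernel-guided weighted mean filtering -> P2
    disp2 = weighted_mean_filter(disp1, sig_l)
    # Step 5: recompute the blur kernels from the refined disparity map P2
    _, final_sig_l, final_sig_r = initial_blur_kernels(left, right, disparity=disp2)
    return disp2, final_sig_l, final_sig_r
```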
The present invention has been described with reference to the above example, which is provided as a preferred embodiment to illustrate the steps of the invention and not to limit its scope. All equivalent changes made within the principles of the invention are intended to be included within its scope.

Claims (1)

1. A defocus blur kernel estimation method based on binocular stereo vision is characterized by comprising the following five steps:
step one, calculating an initial blur kernel: performing stereo matching on the binocular images to obtain an initial disparity map, and calculating the initial blur kernels of the binocular images from the disparity values of the edge pixels;
step two, obtaining the equivalent blurred images: calculating a relative blur kernel from the initial blur kernels of the binocular images, and performing adaptive Gaussian filtering on the binocular images to obtain the equivalent blurred images;
step three, stereo matching calculation: combining the gray-level information and the blur kernel information as the data term of the energy function of a global matching method, and performing stereo matching on the equivalent blurred images by optimizing the energy function to obtain a dense disparity map;
step four, disparity post-processing: applying weighted mean filtering to the dense disparity map obtained in step three;
step five, calculating the final blur kernel: taking the post-processed disparity map as the initial disparity map and performing the blur kernel calculation again to obtain the final blur kernel;
the calculating of the initial blur kernel of the binocular image specifically comprises the following steps:
calculating the average parallax value d of edge pixels in the binocular images by using a K-means clustering algorithm according to the initial parallax map of the binocular imageskWherein k is 1,2 represents the corresponding information of the left and right images respectively, according to the camera calibration parameter and the average parallax dkCalculating initial fuzzy kernel sigma corresponding to each pixel of binocular imagek
The step of obtaining the equivalent blurred images specifically comprises:
calculating the relative blur kernel from the initial blur kernel of each pixel of the binocular images:
Δσ(p) = √( |σ_1(p)² − σ_2(q)²| )
where σ_1(p) and σ_2(q) denote the initial blur kernels of the left and right images at a pair of corresponding pixels; according to the relative blur kernel Δσ(p), Gaussian filtering is applied to whichever of the two corresponding regions of the binocular images has the smaller blur kernel, with the relative blur kernel Δσ(p) used as the standard-deviation parameter of the Gaussian filtering, so as to obtain equivalent blurred binocular images whose corresponding regions have approximately the same degree of blur;
the global matching method specifically comprises the following steps:
performing stereo matching calculation by optimizing an energy function, wherein the energy function of the global matching method consists of a data item and a smooth item, the weighted sum of relative fuzzy kernel information and pixel gray scale information of a binocular image is used as the data item of the energy function, the pixel gray scale information is used as the smooth item of the energy function, a corresponding image model is constructed according to the equivalent fuzzy binocular image and the energy function, a parallax value is used as a label for processing, and an optimal label is selected for each pixel by calculating the minimum value of the global energy function, so that a dense parallax image is obtained;
the weighted mean filtering process specifically includes:
and selecting a support window from the disparity map, taking the difference of the fuzzy cores corresponding to each pixel and the central pixel in the window as a weight, and taking the weighted average of the disparities of all pixels in the window as the disparity value of the central pixel to obtain the final disparity map.
CN201911247805.5A 2019-12-09 2019-12-09 Defocus blur kernel estimation method based on binocular stereo vision Active CN111179333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911247805.5A CN111179333B (en) 2019-12-09 2019-12-09 Defocus blur kernel estimation method based on binocular stereo vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911247805.5A CN111179333B (en) 2019-12-09 2019-12-09 Defocus blur kernel estimation method based on binocular stereo vision

Publications (2)

Publication Number Publication Date
CN111179333A true CN111179333A (en) 2020-05-19
CN111179333B CN111179333B (en) 2024-04-26

Family

ID=70657245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911247805.5A Active CN111179333B (en) 2019-12-09 2019-12-09 Defocus blur kernel estimation method based on binocular stereo vision

Country Status (1)

Country Link
CN (1) CN111179333B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112258635A (en) * 2020-10-26 2021-01-22 北京石油化工学院 Three-dimensional reconstruction method and device based on improved binocular matching SAD algorithm
CN112950698A (en) * 2021-03-18 2021-06-11 北京拙河科技有限公司 Depth estimation method, device, medium, and apparatus based on binocular defocused image
CN113965742A (en) * 2021-02-28 2022-01-21 北京中科慧眼科技有限公司 Dense disparity map extraction method and system based on multi-sensor fusion and intelligent terminal

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201798A1 (en) * 2009-02-06 2010-08-12 Robert Bosch Gmbh depth-aware blur kernel estimation method for iris deblurring
KR20110024966A (en) * 2009-09-03 2011-03-09 포항공과대학교 산학협력단 Stereo matching apparatus and its method
US20130071028A1 * 2011-09-16 2013-03-21 Stephen N. Schiller System and Method for Estimating Spatially Varying Defocus Blur in a Digital Image
CN105809185A (en) * 2015-12-31 2016-07-27 天津大学 High-spectrum image nonlinear demixing method based on neural network and differential search
CN105975912A (en) * 2016-04-27 2016-09-28 天津大学 Hyperspectral image nonlinearity solution blending method based on neural network
CN106225774A (en) * 2016-08-26 2016-12-14 武汉大学 A kind of unmanned agriculture tractor road measurement apparatus based on computer vision and method
CN106683174A (en) * 2016-12-23 2017-05-17 上海斐讯数据通信技术有限公司 3D reconstruction method, apparatus of binocular visual system, and binocular visual system
CN107749051A (en) * 2017-10-17 2018-03-02 长沙全度影像科技有限公司 A kind of unzoned lens space-variant blur core smoothing method based on mean filter
CN108389171A (en) * 2018-03-08 2018-08-10 深圳市唯特视科技有限公司 A kind of light field deblurring and depth estimation method based on Combined estimator fuzzy variable
CN108510529A (en) * 2018-03-14 2018-09-07 昆明理工大学 A kind of figure based on adaptive weight cuts solid matching method
CN110223377A (en) * 2019-05-28 2019-09-10 上海工程技术大学 One kind being based on stereo visual system high accuracy three-dimensional method for reconstructing

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201798A1 (en) * 2009-02-06 2010-08-12 Robert Bosch Gmbh depth-aware blur kernel estimation method for iris deblurring
KR20110024966A (en) * 2009-09-03 2011-03-09 포항공과대학교 산학협력단 Stereo matching apparatus and its method
US20130071028A1 * 2011-09-16 2013-03-21 Stephen N. Schiller System and Method for Estimating Spatially Varying Defocus Blur in a Digital Image
CN105809185A (en) * 2015-12-31 2016-07-27 天津大学 High-spectrum image nonlinear demixing method based on neural network and differential search
CN105975912A (en) * 2016-04-27 2016-09-28 天津大学 Hyperspectral image nonlinearity solution blending method based on neural network
CN106225774A (en) * 2016-08-26 2016-12-14 武汉大学 A kind of unmanned agriculture tractor road measurement apparatus based on computer vision and method
CN106683174A (en) * 2016-12-23 2017-05-17 上海斐讯数据通信技术有限公司 3D reconstruction method, apparatus of binocular visual system, and binocular visual system
CN107749051A (en) * 2017-10-17 2018-03-02 长沙全度影像科技有限公司 A kind of unzoned lens space-variant blur core smoothing method based on mean filter
CN108389171A (en) * 2018-03-08 2018-08-10 深圳市唯特视科技有限公司 A kind of light field deblurring and depth estimation method based on Combined estimator fuzzy variable
CN108510529A (en) * 2018-03-14 2018-09-07 昆明理工大学 A kind of figure based on adaptive weight cuts solid matching method
CN110223377A (en) * 2019-05-28 2019-09-10 上海工程技术大学 One kind being based on stereo visual system high accuracy three-dimensional method for reconstructing

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
孙士洁;赵怀慈;郝明国;李波: "噪声模糊图像盲目反降晰的模糊核准确估计" [Accurate blur kernel estimation for blind restoration of noisy blurred images], 计算机辅助设计与图形学学报 [Journal of Computer-Aided Design & Computer Graphics], vol. 28, no. 5, 31 December 2016 (2016-12-31) *
洪汉玉;范艳;时愈;华夏;张天序: "立体视觉双目图像MAP的优化复原方法" [Optimized MAP restoration method for binocular stereo vision images], 武汉科技大学学报 [Journal of Wuhan University of Science and Technology], no. 001, 31 December 2016 (2016-12-31) *
湛西羊;李良群;张富有: "融入局部信息的直觉模糊核聚类图像分割算法" [Intuitionistic fuzzy kernel clustering image segmentation incorporating local information], 信号处理 [Journal of Signal Processing], vol. 33, no. 3, 31 December 2017 (2017-12-31) *
耿冬冬;罗娜: "一种基于多邻域非线性扩散的动态规划全局立体匹配算法" [A dynamic programming global stereo matching algorithm based on multi-neighborhood nonlinear diffusion], 华东理工大学学报(自然科学版) [Journal of East China University of Science and Technology (Natural Science Edition)], no. 005, 31 December 2017 (2017-12-31) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112258635A (en) * 2020-10-26 2021-01-22 北京石油化工学院 Three-dimensional reconstruction method and device based on improved binocular matching SAD algorithm
CN112258635B (en) * 2020-10-26 2023-07-21 北京石油化工学院 Three-dimensional reconstruction method and device based on improved binocular matching SAD algorithm
CN113965742A (en) * 2021-02-28 2022-01-21 北京中科慧眼科技有限公司 Dense disparity map extraction method and system based on multi-sensor fusion and intelligent terminal
CN112950698A (en) * 2021-03-18 2021-06-11 北京拙河科技有限公司 Depth estimation method, device, medium, and apparatus based on binocular defocused image
CN112950698B (en) * 2021-03-18 2024-03-26 北京拙河科技有限公司 Depth estimation method, device, medium and equipment based on binocular defocused image

Also Published As

Publication number Publication date
CN111179333B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
EP3673461B1 (en) Systems and methods for hybrid depth regularization
CN107767413B (en) Image depth estimation method based on convolutional neural network
US8385630B2 (en) System and method of processing stereo images
CN111179333B (en) Defocus blur kernel estimation method based on binocular stereo vision
CN107170008B (en) Depth map creating method and system and image blurring method and system
CN110223377A (en) One kind being based on stereo visual system high accuracy three-dimensional method for reconstructing
Raihan A et al. Review of underwater image restoration algorithms
CN109887021B (en) Cross-scale-based random walk stereo matching method
CN108564620B (en) Scene depth estimation method for light field array camera
CN111160229B (en) SSD network-based video target detection method and device
CN111223059B (en) Robust depth map structure reconstruction and denoising method based on guide filter
Ali et al. Robust focus volume regularization in shape from focus
CN109671031B (en) Multispectral image inversion method based on residual learning convolutional neural network
Zhu et al. Image guided depth enhancement via deep fusion and local linear regularization
CN107220945B (en) Restoration method of multiple degraded extremely blurred image
CN108805841B (en) Depth map recovery and viewpoint synthesis optimization method based on color map guide
CN112991504A (en) Improved method for filling holes based on TOF camera three-dimensional reconstruction
KR20180014149A (en) Apparatus and method for generating depth information
CN109816710B (en) Parallax calculation method for binocular vision system with high precision and no smear
CN110490877B (en) Target segmentation method for binocular stereo image based on Graph Cuts
Guan et al. DiffWater: Underwater Image Enhancement Based on Conditional Denoising Diffusion Probabilistic Model
Sahay et al. Shape extraction of low‐textured objects in video microscopy
Mahmood et al. Measuring focus quality in vector valued images for shape from focus
Liu et al. Bokeh rendering based on adaptive depth calibration network
Shuang et al. Algorithms for improving the quality of underwater optical images: A comprehensive review

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant