WO2018161270A1 - Space-time combined speckle three-dimensional imaging method and device (时空结合的散斑三维成像方法及装置) - Google Patents

Space-time combined speckle three-dimensional imaging method and device Download PDF

Info

Publication number
WO2018161270A1
WO2018161270A1 (PCT application PCT/CN2017/075942, CN2017075942W)
Authority
WO
WIPO (PCT)
Prior art keywords
speckle image
pixel
corresponding point
value
sub-pixel
Prior art date
Application number
PCT/CN2017/075942
Other languages
English (en)
French (fr)
Inventor
刘晓利
赵恢和
汤其剑
彭翔
蔡泽伟
Original Assignee
深圳大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳大学 filed Critical 深圳大学
Priority to PCT/CN2017/075942 priority Critical patent/WO2018161270A1/zh
Publication of WO2018161270A1 publication Critical patent/WO2018161270A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • The invention belongs to the field of image processing, and in particular relates to a space-time combined speckle three-dimensional imaging method and device.
  • Three-dimensional imaging based on speckle structured-light illumination is a non-contact optical method for three-dimensional digital imaging and measurement. It is widely used in three-dimensional deformation and strain measurement of objects, and it allows the material behaviour of the measured object to be better understood and analysed. With the rapid development of 3D imaging and measurement technology, shortening the measurement time and improving the measurement accuracy have become the main research directions; since the accuracy of 3D reconstruction affects the measurement accuracy, improving the accuracy of 3D reconstruction is particularly important.
  • In the prior art, three-dimensional reconstruction from random speckle images mainly uses the spatial correlation method. The spatial correlation method can realize three-dimensional reconstruction from only a single image, but it is built on the gray-level variation of the matched region.
  • Because of the different placement of the imaging devices and the non-uniform variation of the surface gradient of the measured object, the three-dimensional reconstruction results of the spatial correlation method are of low accuracy.
  • The invention provides a space-time combined speckle three-dimensional imaging method and device, aiming at solving the problem that the three-dimensional reconstruction accuracy of the existing spatial correlation method is low.
  • The present invention provides a space-time combined speckle three-dimensional imaging method, comprising:
  • selecting a time node from a preset time series, and acquiring, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
  • performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence;
  • performing a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
  • calculating the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
  • performing three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
  • The present invention also provides a space-time combined speckle three-dimensional imaging apparatus, comprising:
  • an acquisition module configured to select a time node from a preset time series, and to acquire, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
  • a corresponding point search module configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence; to perform a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points; and to calculate, by a time-average operation on the sub-pixel corresponding points, the corresponding point to be three-dimensionally reconstructed;
  • and a three-dimensional reconstruction module configured to perform three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
  • With the space-time combined speckle three-dimensional imaging method and device, a time node is selected from a preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices, containing the same number of images, are acquired starting from the selected time node. A time correlation operation is performed on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence; according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search is performed on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points, and three-dimensional reconstruction is performed at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can operate on multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are obtained, and the accuracy of the three-dimensional reconstruction is thereby improved.
  • FIG. 1 is a schematic flow chart showing the implementation of a space-time combined speckle three-dimensional imaging method according to a first embodiment of the present invention
  • FIG. 2 is a schematic diagram showing the position of a projection device and an imaging device according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a three-dimensional digital model of a blade reconstructed by the conventional spatial correlation method;
  • FIG. 4 is a schematic diagram of a three-dimensional digital model of the same blade reconstructed by the space-time combined speckle three-dimensional imaging method according to an embodiment of the present invention;
  • FIG. 5 is a schematic structural diagram of a space-time combined speckle three-dimensional imaging apparatus according to a second embodiment of the present invention.
  • FIG. 1 is a schematic flowchart of the implementation of the space-time combined speckle three-dimensional imaging method provided by the first embodiment of the present invention. The method can be applied to an optical three-dimensional scanning system. The space-time combined speckle three-dimensional imaging method shown in FIG. 1 mainly includes the following steps.
  • In step S101, a time node is selected from the preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices are acquired starting from the selected time node. Two acquisition conditions need to be satisfied: first, the left speckle image sequence and the right speckle image sequence contain the same number of images; second, the time node at which each speckle image in the left speckle image sequence is acquired is consistent with the time node at which the corresponding speckle image in the right speckle image sequence is acquired.
  • Acquiring the left and right speckle image sequences starting from the selected time node does not restrict the order in which the speckle images are acquired; the two acquisition conditions above merely need to be satisfied. For example, if the time series is t0, t1, ..., t5 and acquisition starts from the time node t3, the images may be acquired in reverse order, in forward order, or partly in forward order and partly in reverse order.
  • Provided that the above two acquisition conditions are satisfied, the image sequences can be expressed in several ways. Taking a left speckle image sequence as an example, the sequence can be written as t_i (i = 0, 1, 2, ..., n): if acquisition starts at the time node t0, the left speckle image sequence output by the left imaging device is t0, t1, ..., tn; similarly, if acquisition starts at the time node t1, the left speckle image sequence output by the left imaging device is t1, t2, ..., tn+1.
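  • For illustration only (this sketch is not part of the original disclosure), the two acquisition conditions above might be met by code of the following form, where grab_left, grab_right, time_series, start_index and k are hypothetical names for the frame-capture callbacks and acquisition parameters:

```python
# Minimal sketch (assumptions: grab_left/grab_right return the frame captured
# at a given time node; start_index and k are chosen by the caller).
def acquire_sequences(grab_left, grab_right, time_series, start_index, k, reverse=False):
    """Collect k synchronized left/right speckle frames starting at time_series[start_index]."""
    step = -1 if reverse else 1
    indices = [start_index + step * i for i in range(k)]
    left_seq, right_seq = [], []
    for idx in indices:
        t = time_series[idx]
        left_seq.append(grab_left(t))    # frame from the left imaging device at time t
        right_seq.append(grab_right(t))  # frame from the right imaging device at time t
    # Both conditions hold by construction: equal length, identical time nodes.
    assert len(left_seq) == len(right_seq) == k
    return left_seq, right_seq
```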
  • Further, before step S101 the method also includes: projecting a random digital speckle pattern onto the surface of the object by means of a projection device, and capturing left and right speckle images of the object with the left and right imaging devices placed on the two sides of the projection device.
  • FIG. 2 is a schematic view of the positions of the projection device and the imaging devices.
  • As can be seen from FIG. 2, the two imaging devices, such as cameras, are located on either side of the projection device. For ease of description, in all embodiments of the invention the imaging device located on the left side of the projection device is referred to as the left imaging device, and the imaging device located on the right side is referred to as the right imaging device; the set of images output by the left imaging device is the left speckle image sequence, and the set of images output by the right imaging device is the right speckle image sequence.
  • The projection device and the two imaging devices form a conventional binocular stereoscopic vision setup.
  • The speckle region of each image in the left and right speckle image sequences is the photographed object.
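  • The embodiments do not prescribe how the random digital speckle pattern is generated; purely as an assumption for illustration, a common choice is a random binary dot pattern of a given density:

```python
import numpy as np

# Sketch only: the patent does not specify the pattern; a random binary dot
# pattern with a given dot density is one common choice (assumption).
def random_speckle_pattern(height=768, width=1024, density=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pattern = (rng.random((height, width)) < density).astype(np.uint8) * 255
    return pattern  # 8-bit image to be sent to the projection device
```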
  • Time correlation is also known as time domain correlation.
  • A time correlation operation is performed on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence.
  • In the formula, X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, and X'_{i',j',t} is the gray value of the corresponding point (i', j') in the image plane of the right imaging device in the t-th right speckle image; the two mean terms are respectively the gray-level average of point (i, j) over the k images of the left speckle image sequence and the gray-level average of the corresponding point (i', j') over the k images of the right speckle image sequence.
  • Among the result values computed by the time correlation formula, the corresponding point with the maximum value is selected as the integer-pixel corresponding point.
  • k is greater than or equal to 2.
  • k indicates that there are k images in the left speckle image sequence and k images in the right speckle image sequence.
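  • The time correlation formula itself appears in the source only as an image; based on the definitions above it is consistent with a zero-mean normalized cross-correlation computed along the time axis over the k frames, which (as an assumed form, not a verbatim reproduction of the patent's formula) can be written as:

```latex
% Assumed form of the temporal correlation coefficient (the patent's own
% formula image is not reproduced here): zero-mean normalized cross-correlation
% over the k frames of the left and right speckle image sequences.
\[
C(i,j;i',j') =
\frac{\sum_{t=1}^{k}\bigl(X_{i,j,t}-\bar{X}_{i,j}\bigr)\bigl(X'_{i',j',t}-\bar{X}'_{i',j'}\bigr)}
     {\sqrt{\sum_{t=1}^{k}\bigl(X_{i,j,t}-\bar{X}_{i,j}\bigr)^{2}}\,
      \sqrt{\sum_{t=1}^{k}\bigl(X'_{i',j',t}-\bar{X}'_{i',j'}\bigr)^{2}}}
\]
```
  • The integer-pixel corresponding point of (i, j) would then be the candidate (i', j') that maximizes this correlation value, in line with the maximum-value selection described above.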
  • The spatial correlation function is also known as the spatial-domain correlation function.
  • The sub-pixel corresponding point search performed on each right speckle image of the right speckle image sequence, according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, is specifically as follows:
  • a reference sub-window of size (2w_m+1)×(2w_m+1) is created in each left speckle image of the left speckle image sequence;
  • the nonlinear spatial correlation function w(s) under the second-order disparity model is taken as the function to be optimized by the N-R (Newton-Raphson) iteration;
  • in w(s), the two mean terms are the gray-level averages of the pixels within the reference sub-windows on the left speckle image and on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
  • the iteration is carried out for a preset number of iteration steps according to the N-R iteration formula, and the correlation function value s_N computed by the last iteration is taken as the result value, where N is an integer greater than or equal to 1 (in the initial state N = 1 and s_0 is the preset initial estimate of the iteration), ∇w(s_{N-1}) is the gradient of the correlation function at s_{N-1}, ∇∇w(s_{N-1}) is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
  • the sub-pixel corresponding point is then calculated from the result value and the second-order disparity model.
  • The preset number of iteration steps is a preset value, and it may be one step or several steps.
  • The sub-pixel corresponding points calculated here are a plurality of sub-pixel corresponding points, i.e. a sequence of sub-pixel corresponding points.
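  • The spatial correlation function w(s), the second-order disparity model and the N-R iteration formula are given in the source only as formula images. A standard digital image correlation formulation that is consistent with the definitions above would take the following assumed forms (not verbatim reproductions of the patent's formulas):

```latex
% Assumed standard forms (the patent's formula images are not reproduced here).
% Zero-mean normalized spatial correlation over the (2w_m+1)x(2w_m+1) reference
% sub-window, with \bar{p}_R and \bar{p}_G the sub-window gray-level averages:
\[
w(\mathbf{s}) =
\frac{\sum\bigl(p_R(u_R,v_R)-\bar{p}_R\bigr)\bigl(p_G(u_G,v_G)-\bar{p}_G\bigr)}
     {\sqrt{\sum\bigl(p_R(u_R,v_R)-\bar{p}_R\bigr)^{2}}\,
      \sqrt{\sum\bigl(p_G(u_G,v_G)-\bar{p}_G\bigr)^{2}}}
\]
% Second-order disparity (shape) model mapping a pixel of the reference
% sub-window, offset (\Delta u, \Delta v) from its centre, into the right image;
% under this assumption the parameter vector s has M = 12 components:
\[
u_G = u_R + u + u_x\Delta u + u_y\Delta v
      + \tfrac{1}{2}u_{xx}\Delta u^{2} + u_{xy}\Delta u\,\Delta v + \tfrac{1}{2}u_{yy}\Delta v^{2},
\]
\[
v_G = v_R + v + v_x\Delta u + v_y\Delta v
      + \tfrac{1}{2}v_{xx}\Delta u^{2} + v_{xy}\Delta u\,\Delta v + \tfrac{1}{2}v_{yy}\Delta v^{2}.
\]
% Newton-Raphson update used to refine s:
\[
\mathbf{s}_N = \mathbf{s}_{N-1}
  - \bigl[\nabla\nabla w(\mathbf{s}_{N-1})\bigr]^{-1}\,\nabla w(\mathbf{s}_{N-1}).
\]
```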
  • Alternatively (in parallel with the above), the sub-pixel corresponding point search can be performed as follows: a reference sub-window of size (2w_m+1)×(2w_m+1) is created in each left speckle image; the nonlinear spatial correlation function w(s) under the second-order disparity model is taken as the function to be optimized by the N-R iteration, with the same definitions as above (the gray-level averages of the pixels within the reference sub-windows of the left and right speckle images, p_R(u_R, v_R) the gray value of pixel p_R within the reference sub-window of the left speckle image, p_G(u_G, v_G) the gray value of the corresponding point p_G on the right speckle image to be matched, N an integer greater than or equal to 1 with N = 1 and s_0 the preset initial estimate in the initial state, ∇w(s_{N-1}) the gradient and ∇∇w(s_{N-1}) the second-order partial derivative of the correlation function at s_{N-1}, and M the number of parameters in s).
  • The correlation function value s_N is computed by the N-R iteration, the coordinate value corresponding to each s_N is calculated from the second-order disparity model, and the difference between the coordinate values of two adjacent iterations is computed; if the difference is smaller than a preset threshold, the iteration is stopped and the coordinate value corresponding to the correlation function value s_N computed in the last iteration is taken as the sub-pixel corresponding point.
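  • A minimal sketch of this convergence-threshold variant of the N-R refinement is given below; gradient, hessian and coords_from_model are hypothetical helpers standing in for the correlation gradient, its second-order partial derivatives and the second-order disparity model, and are not taken from the patent:

```python
import numpy as np

# Sketch of the convergence-stopping variant (assumption: gradient(), hessian()
# and coords_from_model() implement the correlation gradient, Hessian and the
# second-order disparity model; they are hypothetical helpers, not patent code).
def refine_subpixel(s0, gradient, hessian, coords_from_model, tol=1e-4, max_iter=50):
    s = np.asarray(s0, dtype=float)        # preset initial estimate of the iteration
    prev_xy = coords_from_model(s)
    for _ in range(max_iter):
        step = np.linalg.solve(hessian(s), gradient(s))  # Newton-Raphson step
        s = s - step
        xy = coords_from_model(s)
        if np.linalg.norm(xy - prev_xy) < tol:           # preset threshold
            break
        prev_xy = xy
    return xy  # sub-pixel corresponding point in the right speckle image
```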
  • By the time-average operation on the sub-pixel corresponding points, an accurate corresponding point, i.e. the corresponding point to be three-dimensionally reconstructed, is calculated.
  • Calculating the corresponding point to be three-dimensionally reconstructed by the time-average operation on the sub-pixel corresponding points is specifically as follows: if the left and right speckle image sequences each contain k images, a point is selected in the t-th left speckle image of the sequence, its sub-pixel corresponding point on the t-th right speckle image is P_t^G(i′, j′), and the corresponding point to be three-dimensionally reconstructed is obtained by averaging these sub-pixel corresponding points over the k images.
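  • The averaging formula itself is given only as an image; consistent with claims 5 and 10, where the sub-pixel corresponding point on the t-th right speckle image is denoted P_t^G(i′, j′), it is presumably the arithmetic mean over the k frames (assumed form):

```latex
% Assumed form of the time-average operation over the k sub-pixel corresponding
% points P_t^G(i', j'), t = 1, ..., k (not a verbatim copy of the formula image):
\[
\bar{P}^{G}(i',j') = \frac{1}{k}\sum_{t=1}^{k} P_{t}^{G}(i',j')
\]
```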
  • The three-dimensional reconstruction performed at the time node using the corresponding points to be three-dimensionally reconstructed is computed using the principle of stereoscopic vision, which is prior art and is not described further herein.
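  • As an illustration of this stereo-vision step (which the patent treats as prior art and does not specify), a standard triangulation with calibrated projection matrices could look like the sketch below; P_left and P_right are assumed to come from a prior calibration of the binocular setup:

```python
import numpy as np
import cv2

# Sketch only: the patent relies on standard stereo-vision triangulation and
# does not prescribe an implementation. P_left / P_right are the 3x4 projection
# matrices of the calibrated left/right imaging devices (assumed available).
def reconstruct_points(P_left, P_right, pts_left, pts_right):
    """pts_left, pts_right: (N, 2) arrays of matched corresponding points."""
    pts_l = np.asarray(pts_left, dtype=float).T   # 2xN, as expected by OpenCV
    pts_r = np.asarray(pts_right, dtype=float).T
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # 4xN homogeneous
    X = (X_h[:3] / X_h[3]).T                      # Nx3 Euclidean 3D points
    return X
```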
  • It should be noted that the time series and the speckle image sequences are kept consistent: for example, if the time node selected in the time series is t, then the t-th speckle image is used in the left speckle image sequence and the t-th speckle image is also used in the right speckle image sequence.
  • All images described in the embodiments of the present invention are speckle images.
  • FIGS. 3 and 4 are schematic diagrams of the three-dimensional digital model of a blade reconstructed with the existing spatial correlation method and with the method described in the embodiment of the present invention, respectively. Comparing FIG. 3 and FIG. 4, it can be seen that the method provided by the embodiment of the present invention achieves higher accuracy than the existing spatial correlation method.
  • In the embodiment of the present invention, a time node is selected from a preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices are acquired starting from the selected time node. A time correlation operation is performed on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence; according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search is performed on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points, and three-dimensional reconstruction is performed at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can be carried out over multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are obtained, the reconstruction is performed on the basis of these high-accuracy corresponding points, and the accuracy of the three-dimensional reconstruction is thereby improved.
  • FIG. 5 is a schematic structural diagram of the space-time combined speckle three-dimensional imaging apparatus provided by the second embodiment of the present invention; for convenience of description, only the parts related to the embodiment of the present invention are shown.
  • The space-time combined speckle three-dimensional imaging apparatus illustrated in FIG. 5 may be the execution subject of the space-time combined speckle three-dimensional imaging method provided by the embodiment shown in FIG. 1.
  • The apparatus illustrated in FIG. 5 mainly includes an acquisition module 501, a corresponding point search module 502 and a three-dimensional reconstruction module 503, which are described in detail as follows:
  • The acquisition module 501 is configured to select a time node from the preset time series and to acquire, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices.
  • Two acquisition conditions need to be satisfied: first, the left and right speckle image sequences contain the same number of images; second, the time node at which each speckle image in the left speckle image sequence is acquired is consistent with the time node at which the corresponding speckle image in the right speckle image sequence is acquired.
  • Acquiring the left and right speckle image sequences starting from the selected time node does not restrict the order in which the speckle images are acquired; the two acquisition conditions above merely need to be satisfied. For example, if the time series is t0, t1, ..., t5 and acquisition starts from the time node t3, the images may be acquired in reverse order, in forward order, or partly in forward order and partly in reverse order.
  • Provided that the above two acquisition conditions are satisfied, the image sequences can be expressed in several ways. Taking a left speckle image sequence as an example, the sequence can be written as t_i (i = 0, 1, 2, ..., n): if acquisition starts at the time node t0, the left speckle image sequence output by the left imaging device is t0, t1, ..., tn; similarly, if acquisition starts at the time node t1, it is t1, t2, ..., tn+1.
  • The corresponding point search module 502 is configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence.
  • To that end, the corresponding point search module 502 is further configured to perform the following steps:
  • perform the time correlation operation on the left and right speckle image sequences according to the time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence, where X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, X'_{i',j',t} is the gray value of the corresponding point (i', j') in the image plane of the right imaging device in the t-th right speckle image, and the two mean terms are the gray-level averages of point (i, j) over the k images of the left speckle image sequence and of the corresponding point (i', j') over the k images of the right speckle image sequence, with k greater than or equal to 2;
  • select, among the result values computed by the time correlation formula, the corresponding point with the maximum value as the integer-pixel corresponding point.
  • The corresponding point search module 502 is further configured to perform, according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search on each right speckle image of the right speckle image sequence, to obtain sub-pixel corresponding points.
  • To that end, the corresponding point search module 502 is further configured to perform the following steps:
  • create a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
  • take the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration, where the two mean terms are the gray-level averages of the pixels within the reference sub-windows of the left and right speckle images, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
  • carry out the iteration for the preset number of iteration steps according to the N-R iteration formula and take the correlation function value s_N computed by the last iteration as the result value, where N is an integer greater than or equal to 1 (in the initial state N = 1 and s_0 is the preset initial estimate of the iteration), ∇w(s_{N-1}) is the gradient and ∇∇w(s_{N-1}) the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
  • calculate the sub-pixel corresponding point from the result value and the second-order disparity model.
  • Optionally, the corresponding point search module 502 is further configured to perform the following steps:
  • create a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image, and take the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration, with the same definitions as above (the gray-level averages of the pixels within the reference sub-windows of the left and right speckle images, p_R(u_R, v_R) the gray value of pixel p_R within the reference sub-window of the left speckle image, p_G(u_G, v_G) the gray value of the corresponding point p_G on the right speckle image to be matched, N an integer greater than or equal to 1 with N = 1 and s_0 the preset initial estimate in the initial state, ∇w(s_{N-1}) the gradient and ∇∇w(s_{N-1}) the second-order partial derivative of the correlation function at s_{N-1}, and M the number of parameters in s);
  • compute the correlation function value s_N by the N-R iteration, calculate the coordinate value from the computed s_N and the second-order disparity model, and compute the difference between the coordinate values obtained in two adjacent iterations;
  • if the difference is smaller than the preset threshold, stop the iteration and take the coordinate value corresponding to the correlation function value s_N computed in the last iteration as the sub-pixel corresponding point.
  • The corresponding point search module 502 is further configured to calculate the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points; that is, it performs the time-average operation on the sub-pixel corresponding points to obtain the corresponding point to be three-dimensionally reconstructed.
  • The three-dimensional reconstruction module 503 is configured to perform three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
  • Further, the apparatus also includes a capture module (not shown) for projecting a random digital speckle pattern onto the surface of the object by means of the projection device, and for capturing, with the left and right imaging devices placed on the two sides of the projection device, the left and right speckle images of the object.
  • As can be seen from FIG. 2, the two imaging devices, such as cameras, are located on either side of the projection device.
  • For ease of description, in all embodiments of the invention the imaging device located on the left side of the projection device is referred to as the left imaging device, and the imaging device located on the right side is referred to as the right imaging device.
  • The set of images output by the left imaging device is the left speckle image sequence, and the set of images output by the right imaging device is the right speckle image sequence.
  • The projection device and the two imaging devices form a conventional binocular stereoscopic vision setup.
  • The speckle region of each image in the left and right speckle image sequences is the photographed object.
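  • Purely as an illustration of the module structure described above (not code from the patent), a minimal skeleton might look as follows; all class and method names are hypothetical:

```python
# Hypothetical skeleton mirroring modules 501-503; method bodies are placeholders.
class AcquisitionModule:                     # module 501
    def acquire(self, time_series, start_index, k):
        """Return synchronized left/right speckle image sequences of equal length."""
        raise NotImplementedError

class CorrespondingPointSearchModule:        # module 502
    def integer_pixel_points(self, left_seq, right_seq):
        """Temporal correlation: integer-pixel corresponding points."""
        raise NotImplementedError
    def subpixel_points(self, left_seq, right_seq, integer_points):
        """Spatial correlation + N-R iteration: sub-pixel corresponding points."""
        raise NotImplementedError
    def time_average(self, subpixel_points):
        """Average the sub-pixel points over time to get the points to reconstruct."""
        raise NotImplementedError

class ReconstructionModule:                  # module 503
    def reconstruct(self, points_left, points_right, P_left, P_right):
        """Triangulate the averaged corresponding points into 3D."""
        raise NotImplementedError
```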
  • In the embodiment of the present invention, the acquisition module 501 selects a time node from the preset time series and acquires, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices; the corresponding point search module 502 performs a time correlation operation on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence, performs, according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points, and calculates the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points; the three-dimensional reconstruction module 503 then performs three-dimensional reconstruction at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can operate on multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are found, and the accuracy of the three-dimensional reconstruction is thereby improved.
  • the disclosed systems, devices, and methods may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • The division of the modules is only a division by logical function; in actual implementation there may be other division manners. For example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • The mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in other forms.
  • the modules described as separate components may or may not be physically separated.
  • the components displayed as modules may or may not be physical modules, that is, may be located in one place, or may be distributed to multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional module in each embodiment of the present invention may be integrated into one processing module, or each module may exist physically separately, or two or more modules may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer readable storage medium.
  • The technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • The software product includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A space-time combined speckle three-dimensional imaging method and device. The method comprises: selecting a time node from a preset time series, and acquiring, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices (S101); performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence (S102); performing a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points (S103); calculating the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points (S104); and performing three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed (S105).

Description

Space-time combined speckle three-dimensional imaging method and device
Technical Field
The present invention belongs to the field of image processing, and in particular relates to a space-time combined speckle three-dimensional imaging method and device.
Background Art
Three-dimensional imaging based on speckle structured-light illumination is a non-contact optical method for three-dimensional digital imaging and measurement. It is widely used in the three-dimensional deformation and strain measurement of objects, and it allows the material behaviour of the measured object to be better understood and analysed. With the rapid development of 3D imaging and measurement technology, shortening the measurement time and improving the measurement accuracy have become the main research directions; since the accuracy of 3D reconstruction affects the measurement accuracy, improving the accuracy of 3D reconstruction is particularly important.
In the prior art, three-dimensional reconstruction based on random speckle images mainly uses the spatial correlation method. The spatial correlation method can realize three-dimensional reconstruction from only a single image, but it is built on the gray-level variation of the matched region; because of the different placement of the imaging devices and the non-uniform variation of the surface gradient of the measured object, the three-dimensional reconstruction results of the spatial correlation method are of low accuracy.
Summary of the Invention
The present invention provides a space-time combined speckle three-dimensional imaging method and device, aiming at solving the problem that the three-dimensional reconstruction accuracy of the existing spatial correlation method is low.
The space-time combined speckle three-dimensional imaging method provided by the present invention comprises:
selecting a time node from a preset time series, and acquiring, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence;
performing a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
calculating the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
performing three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
The space-time combined speckle three-dimensional imaging device provided by the present invention comprises:
an acquisition module, configured to select a time node from a preset time series and to acquire, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
a corresponding point search module, configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence;
and to perform a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
and to calculate the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
a three-dimensional reconstruction module, configured to perform three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
With the space-time combined speckle three-dimensional imaging method and device provided by the present invention, a time node is selected from a preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices, containing the same number of images, are acquired starting from the selected time node; a time correlation operation is performed on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence; according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search is performed on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points, and three-dimensional reconstruction is performed at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can operate on multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are found, the three-dimensional reconstruction is performed on the basis of these high-accuracy corresponding points, and the accuracy of the three-dimensional reconstruction is thereby improved.
Brief Description of the Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below; obviously, the drawings described below illustrate only some embodiments of the present invention.
FIG. 1 is a schematic flowchart of the implementation of the space-time combined speckle three-dimensional imaging method provided by the first embodiment of the present invention;
FIG. 2 is a schematic diagram of the positions of the projection device and the imaging devices provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a three-dimensional digital model of a blade reconstructed by the existing spatial correlation method;
FIG. 4 is a schematic diagram of a three-dimensional digital model of a blade reconstructed by the space-time combined speckle three-dimensional imaging method provided by an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of the space-time combined speckle three-dimensional imaging device provided by the second embodiment of the present invention.
Detailed Description of the Embodiments
In order to make the object, features and advantages of the present invention more obvious and easier to understand, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments; obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative work fall within the scope of protection of the present invention.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of the implementation of the space-time combined speckle three-dimensional imaging method provided by the first embodiment of the present invention, which can be applied to an optical three-dimensional scanning system. The space-time combined speckle three-dimensional imaging method shown in FIG. 1 mainly includes the following steps.
S101: a time node is selected from a preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices are acquired starting from the selected time node.
Two acquisition conditions need to be satisfied: first, the left speckle image sequence and the right speckle image sequence contain the same number of images; second, the time node at which each speckle image in the left speckle image sequence is acquired is consistent with the time node at which the corresponding speckle image in the right speckle image sequence is acquired. An example follows.
Suppose acquisition starts at the time node t0 and the acquisition order is t0, t1, ..., tn. At t0, one left speckle image and one right speckle image captured at time t0 are acquired from the left and right imaging devices respectively; at the next time node t1, one left speckle image and one right speckle image captured at time t1 are again acquired from the left and right imaging devices respectively; and so on.
In addition, acquiring the left and right speckle image sequences starting from the selected time node does not restrict the order in which the speckle images are acquired; the two acquisition conditions above merely need to be satisfied. For example, if the time series is t0, t1, ..., t5 and acquisition starts from the time node t3, the images may be acquired in reverse order, in forward order, or partly in forward order and partly in reverse order.
Provided that the above two acquisition conditions are satisfied, the image sequences can be expressed in several ways. Taking a left speckle image sequence as an example, the sequence can be written as t_i (i = 0, 1, 2, ..., n): if acquisition starts at the time node t0, the left speckle image sequence output by the left imaging device is t0, t1, ..., tn; similarly, if acquisition starts at the time node t1, the left speckle image sequence output by the left imaging device is t1, t2, ..., tn+1.
Further, before step S101 the method also includes: projecting a random digital speckle pattern onto the surface of the object by means of a projection device, and capturing left and right speckle images of the object with the left and right imaging devices placed on the two sides of the projection device.
As shown in FIG. 2, FIG. 2 is a schematic diagram of the positions of the projection device and the imaging devices. As can be seen from FIG. 2, the two imaging devices, such as cameras, are located on the two sides of the projection device. It should be noted that, for ease of description, in all embodiments of the present invention the imaging device located on the left side of the projection device is referred to as the left imaging device and the imaging device located on the right side is referred to as the right imaging device; the set of images output by the left imaging device is the left speckle image sequence, and the set of images output by the right imaging device is the right speckle image sequence. The projection device and the two imaging devices form a conventional binocular stereoscopic vision setup. The speckle region of each image in the left and right speckle image sequences is the photographed object.
S102: a time correlation operation is performed on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence.
Time correlation is also known as time-domain correlation.
Further, performing the time correlation operation on the left and right speckle image sequences to determine integer-pixel corresponding points in the right speckle image sequence is specifically:
performing the time correlation operation on the left and right speckle image sequences according to a time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence, where the time correlation formula is:
[Formula image: PCTCN2017075942-appb-000001]
where X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, and X′_{i′,j′,t} is the gray value of the corresponding point (i′, j′) in the image plane of the right imaging device in the t-th right speckle image,
[Formula image: PCTCN2017075942-appb-000002] and [Formula image: PCTCN2017075942-appb-000003]
are respectively the gray-level average of point (i, j) in the image plane of the left imaging device over the k images of the left speckle image sequence and the gray-level average of the corresponding point (i′, j′) in the image plane of the right imaging device over the k images of the right speckle image sequence;
among the result values computed by the time correlation formula, the corresponding point with the maximum value is selected as the integer-pixel corresponding point.
Here k is greater than or equal to 2; k indicates that there are k images in the left speckle image sequence and k images in the right speckle image sequence.
S103: according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search is performed on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points.
The spatial correlation function is also known as the spatial-domain correlation function.
Further, performing the sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points, is specifically:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
[Formula image: PCTCN2017075942-appb-000004]
where [Formula image: PCTCN2017075942-appb-000005] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-000006] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
carrying out the iteration for the preset number of iteration steps according to the N-R iteration formula
[Formula image: PCTCN2017075942-appb-000007]
and taking the correlation function value s_N computed by the last iteration as the result value, where
[Formula images: PCTCN2017075942-appb-000008, -000009, -000010, -000011]
N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
[Formula image: PCTCN2017075942-appb-000012] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-000013] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
calculating the sub-pixel corresponding point from the result value and the second-order disparity model.
The iteration starts from N = 1, followed by N = 2, 3, and so on. The preset number of iteration steps is a preset value, and it may be one step or several steps.
It should be noted that [Formula image: PCTCN2017075942-appb-000014] is obtained by rearranging the above formula for w(s), so w(s) = w(s_{N-1}).
The sub-pixel corresponding points calculated here are a plurality of sub-pixel corresponding points, i.e. a sequence of sub-pixel corresponding points.
In parallel with the above method, further, performing the sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points, may alternatively be specifically:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
[Formula image: PCTCN2017075942-appb-000015]
where [Formula image: PCTCN2017075942-appb-000016] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-000017] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
performing the iteration according to the N-R iteration formula
[Formula image: PCTCN2017075942-appb-000018]
and computing the correlation function value s_N, where
[Formula images: PCTCN2017075942-appb-000019, -000020, -000021, -000022]
N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
[Formula image: PCTCN2017075942-appb-000023] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-000024] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
calculating the coordinate value from the computed correlation function value s_N and the second-order disparity model, and computing the difference between the coordinate values corresponding to the correlation function values s_N of two adjacent iterations;
if the difference is smaller than a preset threshold, stopping the iteration and taking the coordinate value corresponding to the correlation function value s_N computed in the last iteration as the sub-pixel corresponding point.
S104: the corresponding point to be three-dimensionally reconstructed is calculated by a time-average operation on the sub-pixel corresponding points.
By the time-average operation on the sub-pixel corresponding points, an accurate corresponding point, i.e. the corresponding point to be three-dimensionally reconstructed, is calculated.
Further, calculating the corresponding point to be three-dimensionally reconstructed by the time-average operation on the sub-pixel corresponding points is specifically: performing the time-average operation on the sub-pixel corresponding points [Formula image: PCTCN2017075942-appb-000025] to calculate the corresponding point to be three-dimensionally reconstructed [Formula image: PCTCN2017075942-appb-000026]. If the left speckle image sequence and the right speckle image sequence each contain k images, a point [Formula image: PCTCN2017075942-appb-000027] is selected in the t-th left speckle image of the speckle image sequence; the sub-pixel corresponding point of this point [Formula image: PCTCN2017075942-appb-000028] on the t-th right speckle image is [Formula image: PCTCN2017075942-appb-000029]; the corresponding point to be three-dimensionally reconstructed is then [Formula image: PCTCN2017075942-appb-000030].
S105: three-dimensional reconstruction is performed at the time node using the corresponding point to be three-dimensionally reconstructed.
The process of performing three-dimensional reconstruction at the time node using the corresponding points to be three-dimensionally reconstructed is computed using the principle of stereoscopic vision; this principle is prior art and is not described further here.
It should be noted that the time series and the speckle image sequences are kept consistent: for example, if the time node selected in the time series is t, then the t-th speckle image is used in the left speckle image sequence and the t-th speckle image is also used in the right speckle image sequence. All images described in the embodiments of the present invention are speckle images.
As shown in FIG. 3 and FIG. 4, FIG. 3 and FIG. 4 are schematic diagrams of the three-dimensional digital model of a blade reconstructed with the existing spatial correlation method and with the method described in the embodiment of the present invention, respectively. Comparing FIG. 3 and FIG. 4, it can be seen that the accuracy of the method provided by the embodiment of the present invention is higher than that of the existing spatial correlation method.
In the embodiment of the present invention, a time node is selected from a preset time series, and a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices are acquired starting from the selected time node; a time correlation operation is performed on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence; according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search is performed on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points, and three-dimensional reconstruction is performed at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can operate on multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are found, the three-dimensional reconstruction is performed on the basis of these high-accuracy corresponding points, and the accuracy of the three-dimensional reconstruction is thereby improved.
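To make the flow of steps S101–S105 concrete, a schematic end-to-end driver is sketched below. It is an illustration only: temporal_match and refine_subpixel_in_frame are hypothetical stand-ins for the temporal correlation and the N-R sub-pixel refinement, acquire_sequences and reconstruct_points refer to the earlier sketches, and none of this code is taken from the patent.

```python
# Hypothetical end-to-end sketch of steps S101-S105 (all helper names are stand-ins).
def spatiotemporal_speckle_imaging(grab_left, grab_right, time_series,
                                   start_index, k, P_left, P_right):
    # S101: synchronized left/right speckle sequences starting at the chosen node
    left_seq, right_seq = acquire_sequences(grab_left, grab_right,
                                            time_series, start_index, k)
    # S102: temporal correlation -> integer-pixel corresponding points
    integer_matches = temporal_match(left_seq, right_seq)
    pts_left, pts_right = [], []
    for (i, j), (ip, jp) in integer_matches:
        # S103: spatial correlation + N-R iteration -> sub-pixel point per frame
        subpix = [refine_subpixel_in_frame(left_seq[t], right_seq[t], (i, j), (ip, jp))
                  for t in range(k)]
        # S104: time average of the k sub-pixel corresponding points
        pts_left.append((i, j))
        pts_right.append(tuple(sum(c) / k for c in zip(*subpix)))
    # S105: stereo triangulation at the selected time node
    return reconstruct_points(P_left, P_right, pts_left, pts_right)
```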
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of the space-time combined speckle three-dimensional imaging device provided by the second embodiment of the present invention; for ease of description, only the parts related to the embodiment of the present invention are shown. The space-time combined speckle three-dimensional imaging device illustrated in FIG. 5 may be the execution subject of the space-time combined speckle three-dimensional imaging method provided by the embodiment shown in FIG. 1. The device illustrated in FIG. 5 mainly includes an acquisition module 501, a corresponding point search module 502 and a three-dimensional reconstruction module 503, which are described in detail as follows.
The acquisition module 501 is configured to select a time node from the preset time series and to acquire, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices.
Two acquisition conditions need to be satisfied: first, the left speckle image sequence and the right speckle image sequence contain the same number of images; second, the time node at which each speckle image in the left speckle image sequence is acquired is consistent with the time node at which the corresponding speckle image in the right speckle image sequence is acquired. An example follows.
Suppose acquisition starts at the time node t0 and the acquisition order is t0, t1, ..., tn. At t0, one left speckle image and one right speckle image captured at time t0 are acquired from the left and right imaging devices respectively; at the next time node t1, one left speckle image and one right speckle image captured at time t1 are again acquired from the left and right imaging devices respectively; and so on.
In addition, acquiring the left and right speckle image sequences starting from the selected time node does not restrict the order in which the speckle images are acquired; the two acquisition conditions above merely need to be satisfied. For example, if the time series is t0, t1, ..., t5 and acquisition starts from the time node t3, the images may be acquired in reverse order, in forward order, or partly in forward order and partly in reverse order.
Provided that the above two acquisition conditions are satisfied, the image sequences can be expressed in several ways. Taking a left speckle image sequence as an example, the sequence can be written as t_i (i = 0, 1, 2, ..., n): if acquisition starts at the time node t0, the left speckle image sequence output by the left imaging device is t0, t1, ..., tn; similarly, if acquisition starts at the time node t1, it is t1, t2, ..., tn+1.
The corresponding point search module 502 is configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence.
Further, the corresponding point search module 502 is also configured to perform the following steps:
performing the time correlation operation on the left and right speckle image sequences according to the time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence, the time correlation formula being:
[Formula image: PCTCN2017075942-appb-000031]
where X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, and X′_{i′,j′,t} is the gray value of the corresponding point (i′, j′) in the image plane of the right imaging device in the t-th right speckle image,
[Formula image: PCTCN2017075942-appb-000032] and [Formula image: PCTCN2017075942-appb-000033]
are respectively the gray-level average of point (i, j) in the image plane of the left imaging device over the k images of the left speckle image sequence and the gray-level average of the corresponding point (i′, j′) over the k images of the right speckle image sequence, where k is greater than or equal to 2;
selecting, among the result values computed by the time correlation formula, the corresponding point with the maximum value as the integer-pixel corresponding point.
The corresponding point search module 502 is further configured to perform, according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence, to obtain sub-pixel corresponding points.
Further, the corresponding point search module 502 is also configured to perform the following steps:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
[Formula image: PCTCN2017075942-appb-000034]
where [Formula image: PCTCN2017075942-appb-000035] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-000036] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
carrying out the iteration for the preset number of iteration steps according to the N-R iteration formula
[Formula image: PCTCN2017075942-appb-000037]
and taking the correlation function value s_N computed by the last iteration as the result value, where
[Formula images: PCTCN2017075942-appb-000038, -000039, -000040, -000041]
N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
[Formula image: PCTCN2017075942-appb-000042] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-000043] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
calculating the sub-pixel corresponding point from the result value and the second-order disparity model.
It should be noted that [Formula image: PCTCN2017075942-appb-000044] is obtained by rearranging the above formula for w(s), so w(s) = w(s_{N-1}). Optionally, the corresponding point search module 502 is further configured to perform the following steps:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
[Formula image: PCTCN2017075942-appb-000045]
where [Formula image: PCTCN2017075942-appb-000046] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-000047] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
performing the iteration according to the N-R iteration formula
[Formula image: PCTCN2017075942-appb-000048]
and computing the correlation function value s_N, where
[Formula images: PCTCN2017075942-appb-000049, -000050, -000051, -000052]
N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
[Formula image: PCTCN2017075942-appb-000053] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-000054] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
calculating the coordinate value from the computed correlation function value s_N and the second-order disparity model, and computing the difference between the coordinate values corresponding to the correlation function values s_N of two adjacent iterations;
if the difference is smaller than a preset threshold, stopping the iteration and taking the coordinate value corresponding to the correlation function value s_N computed in the last iteration as the sub-pixel corresponding point.
The corresponding point search module 502 is further configured to calculate the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points.
Further, the corresponding point search module 502 is also configured to perform the time-average operation on the sub-pixel corresponding points [Formula image: PCTCN2017075942-appb-000055] to calculate the corresponding point to be three-dimensionally reconstructed [Formula image: PCTCN2017075942-appb-000056].
The three-dimensional reconstruction module 503 is configured to perform three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
Further, the device also includes a capture module (not shown in the figure), configured to project a random digital speckle pattern onto the surface of the object by means of the projection device, and to capture, with the left and right imaging devices placed on the two sides of the projection device, the left and right speckle images of the object.
As can be seen from FIG. 2, the two imaging devices, such as cameras, are located on the two sides of the projection device. It should be noted that, for ease of description, in all embodiments of the present invention the imaging device located on the left side of the projection device is referred to as the left imaging device and the imaging device located on the right side is referred to as the right imaging device; the set of images output by the left imaging device is the left speckle image sequence, and the set of images output by the right imaging device is the right speckle image sequence. The projection device and the two imaging devices form a conventional binocular stereoscopic vision setup. The speckle region of each image in the left and right speckle image sequences is the photographed object.
For details not covered in this embodiment, refer to the description of the embodiment shown in FIG. 1 above; they are not repeated here.
In the embodiment of the present invention, the acquisition module 501 selects a time node from the preset time series and acquires, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by the left and right imaging devices; the corresponding point search module 502 performs a time correlation operation on the two sequences to determine integer-pixel corresponding points in the right speckle image sequence, performs, according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence to obtain sub-pixel corresponding points, and calculates the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points; the three-dimensional reconstruction module 503 then performs three-dimensional reconstruction at the time node using those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding point search can operate on multiple images starting from any time node, high-accuracy corresponding points for three-dimensional reconstruction are found, the three-dimensional reconstruction is performed on the basis of these high-accuracy corresponding points, and the accuracy of the three-dimensional reconstruction is thereby improved.
In the several embodiments provided in the present application, it should be understood that the disclosed system, device and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division of the modules is only a division by logical function, and in actual implementation there may be other division manners; for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be electrical, mechanical or in other forms.
The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, i.e. they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of the present invention may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as combinations of a series of actions; however, a person skilled in the art should know that the present invention is not limited by the described order of actions, since according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, a person skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily all required by the present invention.
In the above embodiments, the descriptions of the respective embodiments each have their own emphasis; for a part not described in detail in a certain embodiment, reference may be made to the relevant descriptions of other embodiments.
The above is a description of the space-time combined speckle three-dimensional imaging method and device provided by the present invention. For a person skilled in the art, there will be changes in the specific implementation and the scope of application according to the idea of the embodiments of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

  1. A space-time combined speckle three-dimensional imaging method, characterized by comprising:
    selecting a time node from a preset time series, and acquiring, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
    performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence;
    performing a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
    calculating the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
    performing three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
  2. The method according to claim 1, characterized in that performing the time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence comprises:
    performing the time correlation operation on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence, wherein the time correlation formula is:
    [Formula image: PCTCN2017075942-appb-100001]
    wherein X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, and X′_{i′,j′,t} is the gray value of the corresponding point (i′, j′) in the image plane of the right imaging device in the t-th right speckle image,
    [Formula image: PCTCN2017075942-appb-100002] and [Formula image: PCTCN2017075942-appb-100003]
    are respectively the gray-level average of point (i, j) in the image plane of the left imaging device over the k images of the left speckle image sequence and the gray-level average of the corresponding point (i′, j′) over the k images of the right speckle image sequence, wherein k is greater than or equal to 2;
    selecting, among the result values computed by the time correlation formula, the corresponding point corresponding to the maximum value as the integer-pixel corresponding point.
  3. The method according to claim 2, characterized in that performing the sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points, comprises:
    creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
    taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
    [Formula image: PCTCN2017075942-appb-100004]
    wherein [Formula image: PCTCN2017075942-appb-100005] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-100006] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
    carrying out the iteration for a preset number of iteration steps according to the N-R iteration formula
    [Formula image: PCTCN2017075942-appb-100007]
    and taking the correlation function value s_N computed by the last iteration as the result value, wherein
    [Formula images: PCTCN2017075942-appb-100008, -100009, -100010, -100011]
    N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
    [Formula image: PCTCN2017075942-appb-100012] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-100013] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
    calculating the sub-pixel corresponding point from the result value and the second-order disparity model.
  4. The method according to claim 2, characterized in that performing the sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points, comprises:
    creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
    taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
    [Formula image: PCTCN2017075942-appb-100014]
    wherein [Formula image: PCTCN2017075942-appb-100015] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-100016] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
    performing the iteration according to the N-R iteration formula
    [Formula image: PCTCN2017075942-appb-100017]
    and computing the correlation function value s_N, wherein
    [Formula images: PCTCN2017075942-appb-100018, -100019, -100020, -100021]
    N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
    [Formula image: PCTCN2017075942-appb-100022] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-100023] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
    calculating the coordinate value from the computed correlation function value s_N and the second-order disparity model, and computing the difference between the coordinate values corresponding to the correlation function values s_N of two adjacent iterations;
    if the difference is smaller than a preset threshold, stopping the iteration and taking the coordinate value corresponding to the correlation function value s_N computed in the last iteration as the sub-pixel corresponding point.
  5. The method according to claim 3 or 4, characterized in that calculating the corresponding point to be three-dimensionally reconstructed by the time-average operation on the sub-pixel corresponding points comprises:
    performing the time-average operation on the sub-pixel corresponding points P_t^G(i′, j′) to calculate the corresponding point to be three-dimensionally reconstructed
    [Formula image: PCTCN2017075942-appb-100024]
  6. A space-time combined speckle three-dimensional imaging device, characterized in that the device comprises:
    an acquisition module, configured to select a time node from a preset time series and to acquire, starting from the selected time node, a set of left speckle image sequences and a set of right speckle image sequences output respectively by left and right imaging devices, wherein the left speckle image sequence and the right speckle image sequence contain the same number of images;
    a corresponding point search module, configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel corresponding points in the right speckle image sequence;
    and to perform a sub-pixel corresponding point search on each right speckle image in the right speckle image sequence according to the integer-pixel corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
    and to calculate the corresponding point to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
    a three-dimensional reconstruction module, configured to perform three-dimensional reconstruction at the time node using the corresponding point to be three-dimensionally reconstructed.
  7. The device according to claim 6, characterized in that the corresponding point search module is further configured to perform the following steps:
    performing the time correlation operation on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine, in the right speckle image sequence, the corresponding point of each pixel of the left speckle image sequence, the time correlation formula being:
    [Formula image: PCTCN2017075942-appb-100025]
    wherein X_{i,j,t} is the gray value of image-plane point (i, j) of the left imaging device in the t-th left speckle image, and X′_{i′,j′,t} is the gray value of the corresponding point (i′, j′) in the image plane of the right imaging device in the t-th right speckle image,
    [Formula image: PCTCN2017075942-appb-100026] and [Formula image: PCTCN2017075942-appb-100027]
    are respectively the gray-level average of point (i, j) in the image plane of the left imaging device over the k images of the left speckle image sequence and the gray-level average of the corresponding point (i′, j′) over the k images of the right speckle image sequence, wherein k is greater than or equal to 2;
    selecting, among the result values computed by the time correlation formula, the corresponding point corresponding to the maximum value as the integer-pixel corresponding point.
  8. The device according to claim 7, characterized in that the corresponding point search module is further configured to perform the following steps:
    creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
    taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
    [Formula image: PCTCN2017075942-appb-100028]
    wherein [Formula image: PCTCN2017075942-appb-100029] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-100030] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
    carrying out the iteration for a preset number of iteration steps according to the N-R iteration formula
    [Formula image: PCTCN2017075942-appb-100031]
    and taking the correlation function value s_N computed by the last iteration as the result value, wherein
    [Formula images: PCTCN2017075942-appb-100032, -100033, -100034, -100035]
    N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
    [Formula image: PCTCN2017075942-appb-100036] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-100037] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
    calculating the sub-pixel corresponding point from the result value and the second-order disparity model.
  9. The device according to claim 7, characterized in that the corresponding point search module is further configured to perform the following steps:
    creating a reference sub-window of size (2w_m+1)×(2w_m+1) in each left speckle image of the left speckle image sequence;
    taking the nonlinear spatial correlation function w(s) under the second-order disparity model as the function to be optimized by the N-R iteration;
    [Formula image: PCTCN2017075942-appb-100038]
    wherein [Formula image: PCTCN2017075942-appb-100039] is the gray-level average of the pixels within the reference sub-window on the left speckle image, [Formula image: PCTCN2017075942-appb-100040] is the gray-level average of the pixels within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of pixel p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G on the right speckle image to be matched;
    performing the iteration according to the N-R iteration formula
    [Formula image: PCTCN2017075942-appb-100041]
    and computing the correlation function value s_N, wherein
    [Formula images: PCTCN2017075942-appb-100042, -100043, -100044, -100045]
    N is an integer greater than or equal to 1; in the initial state N takes the value 1, and s_0 is the preset initial estimate of the iteration;
    [Formula image: PCTCN2017075942-appb-100046] is the gradient of the correlation function at s_{N-1}, [Formula image: PCTCN2017075942-appb-100047] is the second-order partial derivative of the correlation function at s_{N-1}, and M is the number of parameters in s;
    calculating the coordinate value from the computed correlation function value s_N and the second-order disparity model, and computing the difference between the coordinate values corresponding to the correlation function values s_N of two adjacent iterations;
    if the difference is smaller than a preset threshold, stopping the iteration and taking the coordinate value corresponding to the correlation function value s_N computed in the last iteration as the sub-pixel corresponding point.
  10. The device according to claim 8 or 9, characterized in that
    the corresponding point search module is further configured to perform the time-average operation on the sub-pixel corresponding points P_t^G(i′, j′) to calculate the corresponding point to be three-dimensionally reconstructed
    [Formula image: PCTCN2017075942-appb-100048]
PCT/CN2017/075942 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device WO2018161270A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/075942 WO2018161270A1 (zh) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/075942 WO2018161270A1 (zh) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Publications (1)

Publication Number Publication Date
WO2018161270A1 true WO2018161270A1 (zh) 2018-09-13

Family

ID=63447072

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/075942 WO2018161270A1 (zh) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Country Status (1)

Country Link
WO (1) WO2018161270A1 (zh)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050271300A1 (en) * 2004-06-02 2005-12-08 Pina Robert K Image registration system and method
CN103310482A (zh) * 2012-03-12 2013-09-18 中兴通讯股份有限公司 一种三维重建方法及系统
CN103279982A (zh) * 2013-05-24 2013-09-04 中国科学院自动化研究所 鲁棒的快速高深度分辨率的散斑三维重建方法
CN104331897A (zh) * 2014-11-21 2015-02-04 天津工业大学 基于极线校正的亚像素级相位立体匹配方法
CN104596439A (zh) * 2015-01-07 2015-05-06 东南大学 一种基于相位信息辅助的散斑匹配三维测量方法

Similar Documents

Publication Publication Date Title
US10346997B2 (en) Depth estimation method based on light-field data distribution
JP7403528B2 (ja) シーンの色及び深度の情報を再構成するための方法及びシステム
Čech et al. Scene flow estimation by growing correspondence seeds
Ferstl et al. Image guided depth upsampling using anisotropic total generalized variation
JP5887267B2 (ja) 3次元画像補間装置、3次元撮像装置および3次元画像補間方法
CN112862877B (zh) 用于训练图像处理网络和图像处理的方法和装置
WO2018171008A1 (zh) 一种基于光场图像的高光区域修复方法
US9615081B2 (en) Method and multi-camera portable device for producing stereo images
KR100681320B1 (ko) 헬름홀츠 교환조건으로부터 유도되는 편미분 방정식의레벨셋 풀이 방법을 이용한 물체의 3차원 형상 모델링 방법
EP2887310B1 (en) Method and apparatus for processing light-field image
JP2016024052A (ja) 3次元計測システム、3次元計測方法及びプログラム
JP2017527042A (ja) 深度マップの改善
CN106910246B (zh) 时空结合的散斑三维成像方法及装置
JP7312026B2 (ja) 画像処理装置、画像処理方法およびプログラム
WO2018133027A1 (zh) 基于灰度约束的三维数字散斑的整像素搜索方法及装置
Satapathy et al. Robust depth map inpainting using superpixels and non-local Gauss–Markov random field prior
JP6359985B2 (ja) デプス推定モデル生成装置及びデプス推定装置
CN108062765A (zh) 双目图像处理方法、成像装置及电子设备
KR101852085B1 (ko) 깊이 정보 획득 장치 및 깊이 정보 획득 방법
WO2016001920A1 (en) A method of perceiving 3d structure from a pair of images
JP2019120590A (ja) 視差値算出装置、視差値算出方法及びプログラム
WO2018161270A1 (zh) Space-time combined speckle three-dimensional imaging method and device
EP2866446B1 (en) Method and multi-camera portable device for producing stereo images
Fan et al. A multi-view super-resolution method with joint-optimization of image fusion and blind deblurring
Hoeltgen et al. Optimised photometric stereo via non-convex variational minimisation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900018

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900018

Country of ref document: EP

Kind code of ref document: A1