CN106910246B - Space-time combined speckle three-dimensional imaging method and device

Info

Publication number
CN106910246B
Authority
CN
China
Prior art keywords
speckle image
value
pixel
corresponding point
sub-pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710133724.7A
Other languages
Chinese (zh)
Other versions
CN106910246A (en)
Inventor
刘晓利
赵恢和
汤其剑
彭翔
蔡泽伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Original Assignee
Shenzhen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201710133724.7A
Publication of CN106910246A
Application granted
Publication of CN106910246B
Active legal status
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

The invention discloses a space-time combined speckle three-dimensional imaging method and device. The method comprises: performing a temporal correlation operation on a left speckle image sequence and a right speckle image sequence to determine integer-pixel-level corresponding points in the right speckle image sequence; performing a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points; calculating the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points; and performing three-dimensional reconstruction at the time node from those corresponding points. By combining the spatial correlation operation with the temporal correlation operation, the corresponding-point search can be carried out on a group of images starting at any time node, high-precision corresponding points to be three-dimensionally reconstructed are found, and the precision of the three-dimensional reconstruction is thereby improved.

Description

Space-time combined speckle three-dimensional imaging method and device
Technical Field
The invention belongs to the field of image processing, and particularly relates to a space-time combined speckle three-dimensional imaging method and device.
Background
Three-dimensional imaging based on speckle structured-light illumination is a non-contact optical method for three-dimensional digital imaging and measurement. It is widely used to measure the three-dimensional deformation and strain of objects, and the behaviour of the measured object's material can be better understood and analysed through such imaging. With the rapid development of three-dimensional imaging and measurement technology, shortening the measurement time and improving the measurement accuracy have become the main research directions; since the measurement accuracy is affected by the three-dimensional reconstruction accuracy, improving the reconstruction accuracy is especially important.
In the prior art, three-dimensional reconstruction from random speckle images mainly uses a spatial correlation method. Spatial correlation can achieve three-dimensional reconstruction from a single image, but it is built on the gray-level variation of the matching area, and factors such as the different placement positions of the imaging devices and the uneven surface-gradient variation of the measured object make the accuracy of its reconstruction results low.
Disclosure of Invention
The invention provides a space-time combined speckle three-dimensional imaging method and device, aiming to solve the problem of the low three-dimensional reconstruction precision of the existing spatial correlation method.
The invention provides a space-time combined speckle three-dimensional imaging method, which comprises the following steps:
selecting a time node from a preset time sequence, and acquiring a group of left speckle image sequences and a group of right speckle image sequences which are respectively output by a left imaging device and a right imaging device from the selected time node, wherein the number of images contained in the left speckle image sequences is the same as that of images contained in the right speckle image sequences;
performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel-level corresponding points in the right speckle image sequence;
performing a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
calculating the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
and performing three-dimensional reconstruction at the time node from the corresponding points to be three-dimensionally reconstructed.
The invention provides a space-time combined speckle three-dimensional imaging device, which comprises:
an acquisition module, configured to select a time node from a preset time sequence and to acquire, from the selected time node, a group of left speckle image sequences and a group of right speckle image sequences respectively output by a left imaging device and a right imaging device, wherein the number of images contained in the left speckle image sequence is the same as the number of images contained in the right speckle image sequence;
a corresponding-point searching module, configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel-level corresponding points in the right speckle image sequence;
to perform a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, so as to obtain sub-pixel corresponding points;
and to calculate the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points;
and a three-dimensional reconstruction module, configured to perform three-dimensional reconstruction at the time node from the corresponding points to be three-dimensionally reconstructed.
The invention provides a space-time combined speckle three-dimensional imaging method and device. A time node is selected from a preset time sequence, and a group of left speckle image sequences and a group of right speckle image sequences respectively output by a left imaging device and a right imaging device are acquired from the selected time node, the two sequences containing the same number of images. A time correlation operation is performed on the left and right speckle image sequences to determine integer-pixel-level corresponding points in the right speckle image sequence; a sub-pixel corresponding-point search operation is performed on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points; and three-dimensional reconstruction is performed at the time node from those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding-point search can be carried out on a group of images starting at any time node, high-precision corresponding points to be three-dimensionally reconstructed are found, three-dimensional reconstruction is performed from them, and the precision of the three-dimensional reconstruction is thereby improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention.
FIG. 1 is a schematic flow chart of an implementation of a space-time combined speckle three-dimensional imaging method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of the positions of a projection device and an imaging device provided by an embodiment of the invention;
FIG. 3 is a schematic diagram of a three-dimensional digital model reconstructed from a fan blade by a conventional spatial correlation method;
FIG. 4 is a schematic diagram of a three-dimensional digital model reconstructed from a fan blade by the space-time speckle three-dimensional imaging method according to the embodiment of the invention;
FIG. 5 is a schematic structural diagram of a space-time combined speckle three-dimensional imaging device according to the second embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of an implementation of a space-time combined speckle three-dimensional imaging method according to a first embodiment of the present invention, which can be applied to an optical three-dimensional scanning system, and the space-time combined speckle three-dimensional imaging method shown in fig. 1 mainly includes the following steps:
s101, selecting a time node from a preset time sequence, and acquiring a group of left speckle image sequences and a group of right speckle image sequences which are respectively output by a left imaging device and a right imaging device from the selected time node.
Wherein two acquisition conditions need to be satisfied: firstly, the number of images contained in the left speckle image sequence is the same as that of images contained in the right speckle image sequence; and secondly, the time node for acquiring each speckle image in the left speckle image sequence is consistent with the time node for acquiring each speckle image in the right speckle image sequence. Examples are as follows:
if acquisition starts from time node t0 and the acquisition order is t0, t1, ..., tn, then at t0 one frame of left speckle image and one frame of right speckle image shot at that moment are acquired from the left imaging device and the right imaging device respectively; at the next time node t1, one frame of left speckle image and one frame of right speckle image shot at t1 are again acquired from the two imaging devices respectively, and so on.
In addition, acquiring the group of left speckle image sequences and the group of right speckle image sequences respectively output by the left and right imaging devices from the selected time node does not restrict the order in which the speckle images are acquired, as long as the two acquisition conditions above are met. For example, let the time sequence be t0, t1, ..., t5; if acquisition starts from time node t3, the images may be acquired in reverse order, in forward order, or partly in forward order and partly in reverse order.
Under the two acquisition conditions above, the image sequences can be acquired in various ways. Taking a group of left speckle image sequences as an example, the sequence can be written as t_i (i = 0, 1, 2, ..., n): if acquisition starts at time node t0, the left speckle image sequence output by the left imaging device is t0, t1, ..., tn; likewise, if acquisition starts at time node t1, the left speckle image sequence output by the left imaging device is t1, t2, ..., tn+1.
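To make the two acquisition conditions concrete, a minimal sketch follows; the function name, the frame containers and the reverse-order option are illustrative assumptions, not something the patent prescribes.

```python
def acquire_sequences(left_frames, right_frames, start, k, reverse=False):
    """Pick k synchronized frames from each camera, starting at time node
    `start`, so that both sequences use exactly the same time nodes (the
    two acquisition conditions described above)."""
    step = -1 if reverse else 1
    nodes = [start + step * n for n in range(k)]
    nodes = [t for t in nodes if 0 <= t < min(len(left_frames), len(right_frames))]
    left_seq = [left_frames[t] for t in nodes]
    right_seq = [right_frames[t] for t in nodes]
    return nodes, left_seq, right_seq
```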
Further, step S101 is preceded by: a random digital speckle pattern is projected onto the surface of the object by a projection device, and left and right speckle images of the object are collected by the left and right imaging devices arranged on the two sides of the projection device, respectively.
Fig. 2 is a schematic diagram of the positions of the projection device and the imaging devices. As can be seen from fig. 2, the two imaging devices, such as cameras, are located on the two sides of the projection device. It should be noted that, for convenience of description, in all embodiments of the present invention the imaging device located on the left side of the projection device is referred to as the left imaging device, the imaging device located on the right side of the projection device is referred to as the right imaging device, the group of images output by the left imaging device is taken as the left speckle image sequence, and the group of images output by the right imaging device is taken as the right speckle image sequence. The projection device and the two imaging devices form a conventional binocular stereo vision setup, and the speckle regions of the images in the left and right speckle image sequences correspond to the photographed object.
And S102, performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine the integer-pixel-level corresponding points in the right speckle image sequence.
The temporal correlation is also called time-domain correlation.
Further, performing the time correlation operation on the left speckle image sequence and the right speckle image sequence to determine the integer-pixel-level corresponding points in the right speckle image sequence is specifically:
performing the time correlation operation on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine the points in the right speckle image sequence corresponding to the pixel points in the left speckle image sequence, wherein the time correlation formula is:
[time correlation formula, shown only as an image in the original patent]
wherein X_{i,j,t} denotes the gray value of the left imaging device image-plane point (i, j) in the t-th left speckle image, X'_{i',j',t} denotes the gray value of the corresponding point (i', j') in the right imaging device image plane in the t-th right speckle image, and the two remaining quantities in the formula are the gray-level averages of the point (i, j) and of the corresponding point (i', j') over the k left speckle images and the k right speckle images, respectively;
and selecting, among the result values computed by the time correlation formula, the corresponding point associated with the maximum value as the integer-pixel-level corresponding point.
Here k is greater than or equal to 2; k denotes that there are k images in the left speckle image sequence and k images in the right speckle image sequence.
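The correlation formula above is reproduced only as an image; assuming it is the usual zero-mean normalized temporal cross-correlation consistent with the variable definitions (gray values X_{i,j,t} and their k-frame means), a minimal sketch of this step could look as follows, where the brute-force square search window is an assumption for readability rather than part of the method.

```python
import numpy as np

def temporal_correlation(left_stack, right_stack, i, j, i2, j2):
    """Zero-mean normalized correlation over the k frames between left-image
    pixel (i, j) and right-image candidate (i2, j2); stacks have shape (k, H, W)."""
    a = left_stack[:, i, j].astype(np.float64)
    b = right_stack[:, i2, j2].astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def integer_pixel_corresponding_point(left_stack, right_stack, i, j, search=20):
    """Scan a search window in the right images and keep the candidate with
    the maximum temporal correlation as the integer-pixel corresponding point."""
    _, h, w = right_stack.shape
    best_val, best_pt = -2.0, None
    for i2 in range(max(0, i - search), min(h, i + search + 1)):
        for j2 in range(max(0, j - search), min(w, j + search + 1)):
            c = temporal_correlation(left_stack, right_stack, i, j, i2, j2)
            if c > best_val:
                best_val, best_pt = c, (i2, j2)
    return best_pt, best_val
```

In a rectified stereo arrangement the candidate scan could be restricted to the corresponding epipolar line; the square window above is only for readability.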
S103, performing a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain the sub-pixel corresponding points.
The spatial correlation function is also called a space-domain correlation function.
Further, performing the sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain the sub-pixel corresponding points, is specifically:
a reference sub-window of size (2w_m+1)×(2w_m+1) is created within each left speckle image in the left speckle image sequence;
the nonlinear spatial correlation function w(s) under a second-order parallax model is taken as the function to be optimized by the N-R (Newton-Raphson) iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
the iterative operation is performed according to a preset number of iteration steps and the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent, built from the gradient and the second-order partial derivatives of the correlation function]
and the value s_N computed by the last iterative operation is taken as the result value, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
and the sub-pixel corresponding point is calculated from the result value and the second-order parallax model.
The iterative operation starts with N = 1, followed by N = 2, 3, .... The preset number of iteration steps is a preset value, which may be one step or several steps.
It should be noted that w(s) evaluated in the iteration formula is obtained from the expression for w(s) above, taken at the previous estimate, i.e. w(s) = w(s_{N-1}).
The sub-pixel corresponding points calculated here are a plurality of sub-pixel corresponding points, i.e. a group of sub-pixel corresponding point sequences.
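The expressions for w(s), its gradient and its second partial derivatives appear only as images above, so the following sketch illustrates the general shape of this step under explicit assumptions: a 12-parameter second-order shape (parallax) function, a zero-mean normalized correlation standing in for w(s), and Newton-Raphson updates driven by finite-difference derivatives. The function names and parameter layout are illustrative, not the patent's.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def second_order_warp(i0, j0, di, dj, s):
    """Assumed 12-parameter second-order disparity model mapping reference
    sub-window offsets (di, dj) around (i0, j0) into the right image."""
    u, v, ux, uy, vx, vy, uxx, uxy, uyy, vxx, vxy, vyy = s
    ii = i0 + di + u + ux * di + uy * dj + 0.5 * uxx * di * di + uxy * di * dj + 0.5 * uyy * dj * dj
    jj = j0 + dj + v + vx * di + vy * dj + 0.5 * vxx * di * di + vxy * di * dj + 0.5 * vyy * dj * dj
    return ii, jj

def correlation_w(left, right, i0, j0, s, wm):
    """Zero-mean normalized correlation between the (2*wm+1)^2 reference
    sub-window in the left image (assumed to lie inside the image) and its
    warped counterpart in the right image."""
    di, dj = np.meshgrid(np.arange(-wm, wm + 1), np.arange(-wm, wm + 1), indexing="ij")
    f = left[i0 + di, j0 + dj].astype(np.float64)
    ii, jj = second_order_warp(i0, j0, di, dj, s)
    g = map_coordinates(right.astype(np.float64), [ii.ravel(), jj.ravel()], order=1).reshape(f.shape)
    f = f - f.mean()
    g = g - g.mean()
    denom = np.sqrt((f * f).sum() * (g * g).sum())
    return float((f * g).sum() / denom) if denom > 0 else 0.0

def newton_raphson_refine(left, right, i0, j0, s0, wm=10, steps=5, eps=1e-3):
    """A fixed number of Newton-Raphson steps on -w(s), with gradient and
    Hessian estimated by central finite differences; s0 encodes the
    integer-pixel correspondence (u, v) with all higher-order terms at 0."""
    s = np.asarray(s0, dtype=np.float64)
    m = s.size
    cost = lambda p: -correlation_w(left, right, i0, j0, p, wm)
    for _ in range(steps):
        grad = np.zeros(m)
        hess = np.zeros((m, m))
        for a in range(m):
            ea = np.zeros(m); ea[a] = eps
            grad[a] = (cost(s + ea) - cost(s - ea)) / (2 * eps)
            for b in range(a, m):
                eb = np.zeros(m); eb[b] = eps
                hess[a, b] = hess[b, a] = (
                    cost(s + ea + eb) - cost(s + ea - eb)
                    - cost(s - ea + eb) + cost(s - ea - eb)) / (4 * eps * eps)
        try:
            s = s - np.linalg.solve(hess, grad)      # one N-R update of the parameter vector
        except np.linalg.LinAlgError:
            s = s - np.linalg.lstsq(hess, grad, rcond=None)[0]
    return s  # refined parameters; the sub-pixel point follows from the warp at (di, dj) = (0, 0)
```

In practice the closed-form gradient and Hessian of the correlation function would be used, as the iteration formula implies; the numerical derivatives here only keep the sketch compact.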
In parallel with the above approach, further, performing the sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain the sub-pixel corresponding points, may also be specifically:
a reference sub-window of size (2w_m+1)×(2w_m+1) is created within each left speckle image in the left speckle image sequence;
the nonlinear spatial correlation function w(s) under a second-order parallax model is taken as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
the iterative operation is performed according to the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
to compute the value s_N, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
coordinate values are calculated from the computed value s_N and the second-order parallax model, and for two adjacent iterative operations the difference between the coordinate values corresponding to their respective values s_N is calculated;
if the difference is smaller than a preset threshold, the iterative operation is stopped, and the coordinate value corresponding to the value s_N computed by the last iterative operation is taken as the sub-pixel corresponding point.
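Read this way, the variant replaces the fixed step count with a convergence test on the predicted coordinates. In the sketch below, `one_nr_step` and `coords_from` are hypothetical stand-ins for a single N-R update and for the second-order parallax mapping from the previous example.

```python
def refine_until_converged(one_nr_step, coords_from, s0, tol=0.01, max_iter=50):
    """Iterate N-R updates until the coordinate implied by two successive
    parameter vectors changes by less than a preset threshold `tol` (pixels)."""
    s_prev = s0
    xy_prev = coords_from(s_prev)
    for _ in range(max_iter):
        s_next = one_nr_step(s_prev)
        xy_next = coords_from(s_next)
        if abs(xy_next[0] - xy_prev[0]) < tol and abs(xy_next[1] - xy_prev[1]) < tol:
            return s_next, xy_next   # coordinates of the sub-pixel corresponding point
        s_prev, xy_prev = s_next, xy_next
    return s_prev, xy_prev
```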
And S104, calculating a corresponding point to be three-dimensionally reconstructed according to the time average operation of the sub-pixel corresponding point.
An accurate corresponding point, namely the corresponding point to be three-dimensionally reconstructed, is calculated by a time-average operation on the sub-pixel corresponding points.
Further, calculating the corresponding point to be three-dimensionally reconstructed from the time-average operation on the sub-pixel corresponding points is specifically:
a time-average operation is performed on the sub-pixel corresponding points P_t^G(i', j') to calculate the corresponding point to be three-dimensionally reconstructed. If the left speckle image sequence and the right speckle image sequence each contain k images, a point P_t^R(i, j) is selected in the t-th left speckle image of the sequence, and the sub-pixel corresponding point of P_t^R(i, j) on the t-th right speckle image is P_t^G(i', j'); the corresponding point to be three-dimensionally reconstructed is then the average of the points P_t^G(i', j') over the k frames (the explicit formula is shown only as an image in the original patent).
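Since the averaging formula is reproduced only as an image, a minimal sketch of this step, averaging the k per-frame sub-pixel points into the single point used for reconstruction, is given below; the function name is illustrative.

```python
import numpy as np

def time_averaged_corresponding_point(subpixel_points):
    """Average the k per-frame sub-pixel corresponding points (i'_t, j'_t)
    into the corresponding point to be three-dimensionally reconstructed."""
    pts = np.asarray(subpixel_points, dtype=np.float64)  # shape (k, 2)
    return pts.mean(axis=0)
```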
And S105, performing three-dimensional reconstruction on the time node through the corresponding point to be three-dimensionally reconstructed.
The process of performing three-dimensional reconstruction on the time node through the corresponding point to be three-dimensionally reconstructed utilizes a stereoscopic vision principle, which is the prior art and is not described herein again.
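For completeness, the stereo-vision step referred to here is ordinary triangulation from calibrated cameras; a standard linear (DLT) sketch is given below, assuming the two 3x4 projection matrices come from a prior stereo calibration (the patent does not detail this step).

```python
import numpy as np

def triangulate_point(P_left, P_right, x_left, x_right):
    """Linear (DLT) triangulation of one left/right correspondence into a
    3-D point, given the two 3x4 camera projection matrices."""
    u1, v1 = x_left
    u2, v2 = x_right
    A = np.array([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # inhomogeneous 3-D coordinates
```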
It should be noted that the time sequence and the speckle image sequence are kept consistent, for example, if the time node selected in the time sequence is t, the t-th speckle image is in the left speckle image sequence and the t-th speckle image is also in the right speckle image sequence. The images described in the embodiments of the present invention are all speckle images.
As shown in fig. 3 and 4, fig. 3 and 4 are schematic diagrams of a three-dimensional digital model reconstructed from a fan blade by using a conventional spatial correlation method and a method described in the embodiment of the present invention, respectively. Comparing fig. 3 and fig. 4, it can be seen that the accuracy of the method provided by the embodiment of the present invention is higher than that of the existing spatial correlation method.
In the embodiment of the invention, a time node is selected from a preset time sequence, and a group of left speckle image sequences and a group of right speckle image sequences respectively output by the left and right imaging devices are acquired from the selected time node. A time correlation operation is performed on the left and right speckle image sequences to determine integer-pixel-level corresponding points in the right speckle image sequence; a sub-pixel corresponding-point search operation is performed on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points; the corresponding points to be three-dimensionally reconstructed are calculated by a time-average operation on the sub-pixel corresponding points; and three-dimensional reconstruction is performed at the time node from those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding-point search can be carried out on a group of images starting at any time node, high-precision corresponding points to be three-dimensionally reconstructed are found, three-dimensional reconstruction is performed from them, and the precision of the three-dimensional reconstruction is thereby improved.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a space-time combined speckle three-dimensional imaging device provided by the second embodiment of the invention; for convenience of explanation, only the parts related to the embodiment of the invention are shown. The space-time combined speckle three-dimensional imaging device illustrated in fig. 5 can be the execution subject of the space-time combined speckle three-dimensional imaging method provided by the embodiment shown in fig. 1. The device mainly includes: an acquisition module 501, a corresponding point searching module 502 and a three-dimensional reconstruction module 503. The functional modules are described in detail as follows:
an obtaining module 501, configured to select a time node from a preset time sequence, and obtain a group of left speckle image sequences and a group of right speckle image sequences respectively output by the left and right imaging devices from the selected time node.
Wherein two acquisition conditions need to be satisfied: firstly, the number of images contained in the left speckle image sequence is the same as that of images contained in the right speckle image sequence; and secondly, the time node for acquiring each speckle image in the left speckle image sequence is consistent with the time node for acquiring each speckle image in the right speckle image sequence. Examples are as follows:
if set from time node t0Acquisition is started and the order of acquisition is t0,t1,...tnThen t is0Then, t is acquired from the left and right imaging devices, respectively0One frame of left speckle image and one frame of right speckle image shot at the moment, and the next time node t1Again, t is acquired from the left and right imaging devices, respectively1And a frame of left speckle image and a frame of right speckle image which are shot at the moment are repeated in a similar way.
In addition, the acquisition of the group of left speckle image sequences and the group of right speckle image sequences respectively output by the left imaging device and the right imaging device from the selected time node does not limit the sequence of acquiring the speckle images, and only the two acquisition conditions are required to be met. For example, let the time sequence be t0,t1,...t5And if from time node t3When the acquisition is started, the images can be acquired in a reverse order, in a forward order, or in a forward order and a reverse order respectively.
Under the two above-mentioned acquisition conditions, there are various ways to acquire the image sequence, for example, taking a group of left speckle image sequences as an example, the left speckle image sequence can be expressed as: t is ti(i ═ 0,1,2,. n), set at time node t0When the acquisition is started to be n, the left speckle image sequence output by the left imaging device ist0,t1,...tnAt time node t for the same reason1When the acquisition is started to be n +1, the left speckle image sequence output by the left imaging device is t1,t2,...tn+1
And a corresponding point searching module 502, configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence, so as to determine an integer-pixel-level corresponding point in the right speckle image sequence.
Further, the corresponding point searching module 502 is further configured to perform the following steps:
performing the time correlation operation on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine the points in the right speckle image sequence corresponding to the pixel points in the left speckle image sequence, wherein the time correlation formula is:
[time correlation formula, shown only as an image in the original patent]
wherein X_{i,j,t} denotes the gray value of the left imaging device image-plane point (i, j) in the t-th left speckle image, X'_{i',j',t} denotes the gray value of the corresponding point (i', j') in the right imaging device image plane in the t-th right speckle image, and the two remaining quantities in the formula are the gray-level averages of the point (i, j) and of the corresponding point (i', j') over the k left speckle images and the k right speckle images respectively, with k greater than or equal to 2;
and selecting, among the result values computed by the time correlation formula, the corresponding point associated with the maximum value as the integer-pixel-level corresponding point.
The corresponding point searching module 502 is further configured to perform a sub-pixel corresponding point searching operation on each frame of the right speckle image in the right speckle image sequence according to the integer pixel level corresponding point, the spatial correlation function, and the pixel point coordinates of each left speckle image in the left speckle image sequence, so as to obtain a sub-pixel corresponding point.
Further, the corresponding point searching module 502 is further configured to perform the following steps:
a reference sub-window of size (2w_m+1)×(2w_m+1) is created within each left speckle image in the left speckle image sequence;
the nonlinear spatial correlation function w(s) under a second-order parallax model is taken as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
the iterative operation is performed according to a preset number of iteration steps and the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
and the value s_N computed by the last iterative operation is taken as the result value, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
and the sub-pixel corresponding point is calculated from the result value and the second-order parallax model.
It should be noted that w(s) evaluated in the iteration formula is obtained from the expression for w(s) above, taken at the previous estimate, i.e. w(s) = w(s_{N-1}). Optionally, the corresponding point searching module 502 is further configured to perform the following steps:
a reference sub-window of size (2w_m+1)×(2w_m+1) is created within each left speckle image in the left speckle image sequence;
the nonlinear spatial correlation function w(s) under a second-order parallax model is taken as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
the iterative operation is performed according to the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
to compute the value s_N, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
coordinate values are calculated from the computed value s_N and the second-order parallax model, and for two adjacent iterative operations the difference between the coordinate values corresponding to their respective values s_N is calculated;
if the difference is smaller than a preset threshold, the iterative operation is stopped, and the coordinate value corresponding to the value s_N computed by the last iterative operation is taken as the sub-pixel corresponding point.
The corresponding point searching module 502 is further configured to calculate a corresponding point to be three-dimensionally reconstructed according to a time average operation of the sub-pixel corresponding point.
Further, the corresponding point searching module 502 is also configured to perform a time-average operation on the sub-pixel corresponding points P_t^G(i', j') to calculate the corresponding point to be three-dimensionally reconstructed (the averaging formula is shown only as an image in the original patent).
And a three-dimensional reconstruction module 503, configured to perform three-dimensional reconstruction on the time node through the corresponding point to be three-dimensionally reconstructed.
Further, the device also includes a collecting module (not shown in the figure) for projecting a random digital speckle pattern onto the surface of the object through the projecting device, and collecting the left and right speckle images with the object through the left and right imaging devices respectively placed on two sides of the projecting device.
As can be seen from fig. 2, the two imaging devices, such as cameras, are located on the two sides of the projection device. It should be noted that, for convenience of description, in all embodiments of the present invention the imaging device located on the left side of the projection device is referred to as the left imaging device, the imaging device located on the right side of the projection device is referred to as the right imaging device, the group of images output by the left imaging device is taken as the left speckle image sequence, and the group of images output by the right imaging device is taken as the right speckle image sequence. The projection device and the two imaging devices form a conventional binocular stereo vision setup, and the speckle regions of the images in the left and right speckle image sequences correspond to the photographed object.
For details that are not described in the present embodiment, please refer to the description of the embodiment shown in fig. 1, which is not described herein again.
In the embodiment of the present invention, the acquisition module 501 selects a time node from a preset time sequence and acquires, from the selected time node, a group of left speckle image sequences and a group of right speckle image sequences respectively output by the left and right imaging devices. The corresponding point searching module 502 performs a time correlation operation on the left and right speckle image sequences to determine integer-pixel-level corresponding points in the right speckle image sequence, performs a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence to obtain sub-pixel corresponding points, and calculates the corresponding points to be three-dimensionally reconstructed by a time-average operation on the sub-pixel corresponding points. The three-dimensional reconstruction module 503 performs three-dimensional reconstruction at the time node from those corresponding points. By combining the spatial correlation operation with the time correlation operation, the corresponding-point search can be carried out on a group of images starting at any time node, high-precision corresponding points to be three-dimensionally reconstructed are found, three-dimensional reconstruction is performed from them, and the precision of the three-dimensional reconstruction is thereby improved.
In the embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication link may be an indirect coupling or communication link of some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present invention is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no acts or modules are necessarily required of the invention.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The above description is provided for the space-time combined speckle three-dimensional imaging method and apparatus, and for those skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the embodiments of the present invention, and in summary, the content of the present description should not be construed as limiting the present invention.

Claims (6)

1. A method for three-dimensional imaging of temporally and spatially combined speckle, comprising:
selecting a time node from a preset time sequence, and acquiring a group of left speckle image sequences and a group of right speckle image sequences which are respectively output by a left imaging device and a right imaging device from the selected time node, wherein the number of images contained in the left speckle image sequences is the same as that of images contained in the right speckle image sequences;
performing a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine integer-pixel-level corresponding points in the right speckle image sequence, wherein the time correlation operation is performed on the left speckle image sequence and the right speckle image sequence according to a time correlation formula so as to determine the points in the right speckle image sequence corresponding to the pixel points in the left speckle image sequence, the time correlation formula being:
[time correlation formula, shown only as an image in the original patent]
wherein X_{i,j,t} denotes the gray value of the left imaging device image-plane point (i, j) in the t-th left speckle image, X'_{i',j',t} denotes the gray value of the corresponding point (i', j') in the right imaging device image plane in the t-th right speckle image, and the two remaining quantities in the formula are the gray-level averages of the point (i, j) and of the corresponding point (i', j') over the k left speckle images and the k right speckle images respectively, with k greater than or equal to 2;
selecting, among the result values computed by the time correlation formula, the corresponding point associated with the maximum value as the integer-pixel-level corresponding point;
performing a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, a spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points, wherein a reference sub-window of size (2w_m+1)×(2w_m+1) is created within each left speckle image in the left speckle image sequence;
the nonlinear spatial correlation function w(s) under a second-order parallax model is taken as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
the iterative operation is performed according to a preset number of iteration steps and the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
and the value s_N computed by the last iterative operation is taken as the result value, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
calculating the sub-pixel corresponding point according to the result value and the second-order parallax model;
calculating corresponding points to be three-dimensionally reconstructed according to the time average operation of the corresponding points of the sub-pixels;
and performing three-dimensional reconstruction on the time node through the corresponding point to be subjected to three-dimensional reconstruction.
2. The method according to claim 1, wherein the performing a sub-pixel corresponding point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel level corresponding point, the spatial correlation function, and the pixel coordinates of each left speckle image in the left speckle image sequence to obtain a sub-pixel corresponding point comprises:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) within each left speckle image in the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under a second-order parallax model as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
performing the iterative operation according to the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
to compute the value s_N, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
calculating coordinate values from the computed value s_N and the second-order parallax model, and calculating, for two adjacent iterative operations, the difference between the coordinate values corresponding to their respective values s_N;
and if the difference is smaller than a preset threshold, stopping the iterative operation and taking the coordinate value corresponding to the value s_N computed by the last iterative operation as the sub-pixel corresponding point.
3. The method according to claim 2, wherein calculating the corresponding point to be three-dimensionally reconstructed from the time-average operation on the sub-pixel corresponding points comprises:
performing a time-average operation on the sub-pixel corresponding points P_t^G(i', j') to calculate the corresponding point to be three-dimensionally reconstructed (the averaging formula is shown only as an image in the original patent).
4. A spatiotemporal combined speckle three-dimensional imaging apparatus, the apparatus comprising:
the device comprises an acquisition module, a comparison module and a processing module, wherein the acquisition module is used for selecting a time node from a preset time sequence and acquiring a group of left speckle image sequences and a group of right speckle image sequences which are respectively output by a left imaging device and a right imaging device from the selected time node, and the number of images contained in the left speckle image sequences is the same as that of images contained in the right speckle image sequences;
a corresponding point searching module, configured to perform a time correlation operation on the left speckle image sequence and the right speckle image sequence to determine an integer-pixel-level corresponding point in the right speckle image sequence, where the corresponding point searching module is further configured to perform the following steps:
performing the time correlation operation on the left speckle image sequence and the right speckle image sequence according to a time correlation formula, so as to determine the points in the right speckle image sequence corresponding to the pixel points in the left speckle image sequence, wherein the time correlation formula is:
[time correlation formula, shown only as an image in the original patent]
wherein X_{i,j,t} denotes the gray value of the left imaging device image-plane point (i, j) in the t-th left speckle image, X'_{i',j',t} denotes the gray value of the corresponding point (i', j') in the right imaging device image plane in the t-th right speckle image, and the two remaining quantities in the formula are the gray-level averages of the point (i, j) and of the corresponding point (i', j') over the k left speckle images and the k right speckle images respectively, with k greater than or equal to 2;
selecting, among the result values computed by the time correlation formula, the corresponding point associated with the maximum value as the integer-pixel-level corresponding point;
performing a sub-pixel corresponding-point search operation on each frame of right speckle image in the right speckle image sequence according to the integer-pixel-level corresponding points, the spatial correlation function and the pixel coordinates of each left speckle image in the left speckle image sequence, to obtain sub-pixel corresponding points;
the corresponding point searching module being further configured to perform the following steps:
creating a reference sub-window of size (2w_m+1)×(2w_m+1) within each left speckle image in the left speckle image sequence;
taking the nonlinear spatial correlation function w(s) under a second-order parallax model as the function to be optimized by the N-R iterative operation:
[expression for w(s), shown only as an image in the original patent]
wherein the two mean quantities in the expression are the gray-level average of the pixel points within the reference sub-window on the left speckle image and the gray-level average of the pixel points within the reference sub-window on the right speckle image, p_R(u_R, v_R) is the gray value of a pixel point p_R within the reference sub-window of the left speckle image, and p_G(u_G, v_G) is the gray value of the corresponding point p_G to be matched on the right speckle image;
performing the iterative operation according to a preset number of iteration steps and the N-R iteration formula:
[N-R iteration formula, shown only as an image in the original patent]
and taking the value s_N computed by the last iterative operation as the result value, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial estimate for the iteration; the gradient of the correlation function at s_{N-1} and the second partial derivatives of the correlation function at s_{N-1} enter the update, and M denotes the number of parameters in s;
calculating the sub-pixel corresponding point according to the result value and the second-order parallax model;
calculating corresponding points to be three-dimensionally reconstructed according to the time average operation of the corresponding points of the sub-pixels;
and the three-dimensional reconstruction module is used for performing three-dimensional reconstruction on the time node through the corresponding point to be three-dimensionally reconstructed.
5. The apparatus of claim 4, wherein the corresponding point searching module is further configured to perform the following steps:
creating a window size of (2 w) within each left speckle image in the sequence of left speckle imagesm+1)×(2wm+1) reference sub-window;
taking a nonlinear spatial correlation function w(s) under a second-order parallax model as the function to be optimized by the N-R iterative operation, wherein, in the expression for w(s), p̄_R denotes the gray-level mean value of the pixel points in the reference sub-window on the left speckle image, p̄_G denotes the gray-level mean value of the pixel points in the reference sub-window on the right speckle image, p_R(u_R, v_R) denotes the gray value of a pixel point p_R in the reference sub-window of the left speckle image, and p_G(u_G, v_G) denotes the gray value of the corresponding point p_G to be matched on the right speckle image;
according to the N-R iteration formula
s_N = s_{N-1} − [∇∇w(s_{N-1})]⁻¹ ∇w(s_{N-1}),
performing the iterative operation to calculate the value s_N, wherein N is an integer greater than or equal to 1; in the initial state, when N = 1, s_0 is a preset initial iteration estimate; ∇w(s_{N-1}) is the gradient of the correlation function at s_{N-1}; ∇∇w(s_{N-1}) is the M × M matrix of second partial derivatives of the correlation function at s_{N-1}; and M denotes the number of parameters in s;
calculating a coordinate value from the calculated value s_N and the second-order parallax model, and calculating a difference value as the difference between the coordinate values corresponding to the values s_N of two adjacent iterative operations;
if the difference value is smaller than a preset threshold, stopping the iterative operation and taking the coordinate value corresponding to the value s_N calculated by the last iterative operation as the sub-pixel corresponding point.
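For illustration only and not part of the claims: a generic Python sketch of the Newton-Raphson (N-R) refinement with the coordinate-difference stopping rule recited in claim 5. The correlation function w, the second-order disparity mapping map_point, the layout of the parameter vector s, and the central-difference numerical derivatives are assumptions made for this sketch; they stand in for the analytic gradient and second-partial-derivative expressions shown in the original formula drawings.

import numpy as np

def _grad(w, s, eps):
    # Central-difference approximation of the gradient of w at s.
    g = np.zeros_like(s)
    for m in range(s.size):
        d = np.zeros_like(s); d[m] = eps
        g[m] = (w(s + d) - w(s - d)) / (2.0 * eps)
    return g

def _hess(w, s, eps):
    # Central-difference approximation of the M x M second-derivative matrix.
    M = s.size
    H = np.zeros((M, M))
    for a in range(M):
        for b in range(M):
            da = np.zeros(M); da[a] = eps
            db = np.zeros(M); db[b] = eps
            H[a, b] = (w(s + da + db) - w(s + da - db)
                       - w(s - da + db) + w(s - da - db)) / (4.0 * eps * eps)
    return H

def newton_raphson_subpixel(w, map_point, s0, max_iter=50, tol=1e-4, eps=1e-5):
    """Iterate s_N = s_{N-1} - [grad grad w]^(-1) * grad w until the image
    coordinates predicted by the second-order disparity model change by less
    than `tol` pixels between two successive iterations."""
    s = np.asarray(s0, dtype=np.float64)
    prev_uv = np.asarray(map_point(s), dtype=np.float64)
    for _ in range(max_iter):
        g = _grad(w, s, eps)                  # gradient of w at s_{N-1}
        H = _hess(w, s, eps)                  # second partial derivatives at s_{N-1}
        s = s - np.linalg.solve(H, g)         # N-R update to s_N
        uv = np.asarray(map_point(s), dtype=np.float64)
        if np.linalg.norm(uv - prev_uv) < tol:
            break                             # coordinate change below threshold
        prev_uv = uv
    return uv, s                              # sub-pixel corresponding point and parameters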
6. The apparatus of claim 5, wherein the corresponding point searching module is further used for performing a time average operation on the sub-pixel corresponding points P_t^G(i', j'), t = 1, …, k, to calculate the corresponding point to be three-dimensionally reconstructed
P̄^G(i', j') = (1/k) · Σ_{t=1}^{k} P_t^G(i', j').
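For illustration only and not part of the claims: a short Python sketch of the time average operation of claim 6. Averaging the k per-frame sub-pixel estimates with an arithmetic mean is an assumption consistent with the recited time average operation.

import numpy as np

def time_averaged_corresponding_point(per_frame_points):
    """per_frame_points: array of shape (k, 2) holding the sub-pixel
    corresponding point (i', j') found in each of the k right speckle frames.
    Returns the time-averaged point used for three-dimensional reconstruction."""
    pts = np.asarray(per_frame_points, dtype=np.float64)
    return pts.mean(axis=0)   # arithmetic mean over the k frames (assumed)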
CN201710133724.7A 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device Active CN106910246B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710133724.7A CN106910246B (en) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710133724.7A CN106910246B (en) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Publications (2)

Publication Number Publication Date
CN106910246A CN106910246A (en) 2017-06-30
CN106910246B true CN106910246B (en) 2020-07-10

Family

ID=59186856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710133724.7A Active CN106910246B (en) 2017-03-08 2017-03-08 Space-time combined speckle three-dimensional imaging method and device

Country Status (1)

Country Link
CN (1) CN106910246B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734776B (en) * 2018-05-23 2022-03-25 四川川大智胜软件股份有限公司 Speckle-based three-dimensional face reconstruction method and equipment
CN111063034B (en) * 2019-12-13 2023-08-04 四川中绳矩阵技术发展有限公司 Time domain interaction method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447083A (en) * 2008-12-29 2009-06-03 北京航空航天大学 Beaconing-free vision measuring-technique for moving target based on time-space correlative characteristics
CN102384726A (en) * 2011-12-31 2012-03-21 中国矿业大学 Digital speckle relevant deformation analyzing method of dynamic fracture-containing material
CN103279982A (en) * 2013-05-24 2013-09-04 中国科学院自动化研究所 Robust rapid high-depth-resolution speckle three-dimensional rebuilding method
CN104596439A (en) * 2015-01-07 2015-05-06 东南大学 Speckle matching and three-dimensional measuring method based on phase information aiding
CN104864819A (en) * 2015-01-19 2015-08-26 华中科技大学 Digital speckle-based high-speed three-dimensional strain measurement method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Digital image correlation using Newton-Raphson method of partial differential correction; H. A. Bruck et al.; Experimental Mechanics; 1989-09-30; Vol. 29, No. 3; pp. 261-267 *
Improved Digital Image Correlation method; Asloob Ahmad Mudassar et al.; Optics and Lasers in Engineering; 2015-10-23; pp. 156-167 *
Study of surface fitting methods for sub-pixel digital speckle correlation measurement; Li Xinzhong et al.; Laser & Optoelectronics Progress; 2009-08-31; pp. 72-75 *
Method for improving accuracy in three-dimensional shape measurement based on temporal-sequence digital speckle correlation; Dai Hongjun et al.; Laser Journal; 2001-01-31; Vol. 22, No. 1; pp. 46-49 *
Research on several key problems in a binocular stereo measurement system with random illumination; Shi Chunqin; Wanfang Dissertation Database; 2013-06-27; full text *

Also Published As

Publication number Publication date
CN106910246A (en) 2017-06-30

Similar Documents

Publication Publication Date Title
EP3444560B1 (en) Three-dimensional scanning system and scanning method thereof
Jeon et al. Accurate depth map estimation from a lenslet light field camera
CN110264509B (en) Method, apparatus, and storage medium for determining pose of image capturing device
CN108734776B (en) Speckle-based three-dimensional face reconstruction method and equipment
US10257450B2 (en) Multi-frame noise reduction method, and terminal
US20110285701A1 (en) Stereo-Matching Processor Using Belief Propagation
US20100142828A1 (en) Image matching apparatus and method
EP2811457B1 (en) Image processing method and apparatus
CN104182982A (en) Overall optimizing method of calibration parameter of binocular stereo vision camera
Ignatov et al. Aim 2019 challenge on raw to rgb mapping: Methods and results
JP7257781B2 (en) Image Depth Estimation Methods in Structured Light-Based 3D Camera Systems
CN109859314B (en) Three-dimensional reconstruction method, three-dimensional reconstruction device, electronic equipment and storage medium
CN106952304B (en) A kind of depth image calculation method using video sequence interframe correlation
CN111080776B (en) Human body action three-dimensional data acquisition and reproduction processing method and system
EP2887310A1 (en) Method and apparatus for processing light-field image
JP2014505389A (en) Method for processing an image in the invisible spectral region, corresponding camera and measuring device
US20210254968A1 (en) Method and System for Automatic Focusing for High-Resolution Structured Light 3D Imaging
CN106910246B (en) Space-time combined speckle three-dimensional imaging method and device
CN114640885B (en) Video frame inserting method, training device and electronic equipment
CN113313740B (en) Disparity map and surface normal vector joint learning method based on plane continuity
CN108507476B (en) Displacement field measuring method, device, equipment and storage medium for material surface
Ding et al. Improved real-time correlation-based FPGA stereo vision system
CN107977986B (en) Method and device for predicting motion trail
KR101852085B1 (en) Depth map acquisition device and depth map acquisition method
CN109859313B (en) 3D point cloud data acquisition method and device, and 3D data generation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant