CN103438834A - Hierarchy-type rapid three-dimensional measuring device and method based on structured light projection - Google Patents

Publication number: CN103438834A
Authority: CN (China)
Prior art keywords: pixel, structured light, horizontal, gridline, image
Legal status: Granted
Application number: CN2013104261519A
Other languages: Chinese (zh)
Other versions: CN103438834B (en)
Inventors: 王好谦, 张新, 邵航, 戴琼海
Assignee (original and current): Shenzhen Graduate School Tsinghua University
Application filed by Shenzhen Graduate School Tsinghua University
Priority: CN201310426151.9A; granted as CN103438834B
Legal status: Active

Classifications

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a hierarchical rapid three-dimensional measuring device and method based on structured light projection. The device comprises a structured light projection unit, a binocular image acquisition unit, and a data processing unit. The structured light pattern projected by the projection unit comprises a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines. The data processing unit receives the left and right parallax images captured by the binocular acquisition unit and decodes them to obtain the three-dimensional information of the measured scene, including the depth coordinates of the pixels: it detects and matches the positions of the feature circles in the two images; centered on each matched feature circle, it detects and matches the grid lines of the corresponding regions; it performs gray-level matching within the regions enclosed by pairs of matched adjacent horizontal and vertical grid lines; and it determines the depth coordinate of each pixel from the resulting disparity. With the device and method of the invention, real-time three-dimensional measurement of the measured scene is completed quickly and accurately.

Description

Hierarchical rapid three-dimensional measuring device and measuring method based on structured light projection
Technical field
The present invention relates to the field of optical and computer vision measurement, and in particular to a hierarchical rapid three-dimensional measuring device and measuring method based on structured light projection.
Background art
Obtaining accurate three-dimensional information of objects in real time with optical and computer vision techniques is an important research field with a very wide range of applications. In structured light three-dimensional acquisition, a projection device projects structured light onto the measured scene according to a predetermined program. Once projected onto the scene, the structured light pattern is modulated, and therefore deformed, by the three-dimensional information of the scene. One or more cameras capture the deformed pattern, and by analyzing the captured pattern the three-dimensional data of the measured scene can be obtained.
However, existing structured light projection three-dimensional measurement schemes have the following shortcomings:
1. The structured light pattern is too simple to carry enough coded information, or the pattern is complex enough but the decoding algorithm cannot take full advantage of the coded information.
2. Some existing three-dimensional measuring systems cannot measure dynamic scenes well.
3. Some three-dimensional measuring systems can measure dynamic scenes, but with large errors.
4. The three-dimensional measuring systems used where high accuracy is required are expensive and offer poor value, which greatly limits their adoption.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art by providing a hierarchical rapid three-dimensional measuring device and measuring method based on structured light projection that meet the density requirements of three-dimensional coordinate measurement while improving its precision and speed.
To achieve the above object, the present invention adopts the following technical solutions.
A hierarchical rapid three-dimensional measuring device based on structured light projection comprises:
a structured light projection unit for projecting structured light onto the measured scene, the structured light pattern comprising a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines;
a binocular image acquisition unit for capturing the left and right parallax images of the measured scene; and
a data processing unit connected to the structured light projection unit and the binocular image acquisition unit. The data processing unit controls the projection unit to project the structured light, receives the left and right parallax images of the measured scene captured by the acquisition unit, and decodes them to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels. Specifically, it detects the positions of the feature circles in the two images and matches them; centered on each matched feature circle, it detects and matches the grid lines of the corresponding regions; it performs gray-level matching within the regions enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the disparity of the pixels; and it determines the depth coordinate of each pixel from its disparity.
In the structured light pattern, the feature circles have the maximum or minimum gray level, and adjacent feature circles are separated by two grid cells both horizontally and vertically.
The gray level of each gray-level gradient region increases or decreases monotonically in the horizontal direction, and the direction of this gradient is the same in all grid cells.
A hierarchical rapid three-dimensional measuring method based on structured light projection comprises the following steps:
the structured light projection unit projects structured light onto the measured scene, the pattern comprising a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines;
the binocular image acquisition unit captures the left and right parallax images of the measured scene and transfers them to the data processing unit;
the data processing unit decodes the received images to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels, the decoding comprising the following steps:
a. detect the positions of the feature circles in the left and right images and match them;
b. centered on each matched feature circle, detect the grid lines of the corresponding regions in the two images and match them;
c. perform gray-level matching within the regions of the two images enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the pixel disparity;
d. determine the depth coordinate of each pixel from its disparity.
In the structured light pattern, the feature circles have the maximum or minimum gray level, and adjacent feature circles are separated by two grid cells both horizontally and vertically.
The gray level of each gray-level gradient region increases or decreases monotonically in the horizontal direction, and the direction of this gradient is the same in all grid cells.
Step a comprises:
binarizing the left and right views with a relatively large threshold T according to

I_dst(x, y) = 255 if I_src(x, y) > T, and 0 otherwise,

where I_dst(x, y) is the binarization result, I_src(x, y) is the original image, and x, y are the pixel coordinates; and
extracting the positions of the feature circles in the left and right images from the binarization result, and matching the circles detected in the two images according to their positions.
In step b, the region is a square whose side is at least twice the side of a grid cell.
In step b, detecting the grid lines comprises:
(1) convolving the image with the following Gaussian-derivative kernels,

g_{x,δ}(x, y) = g_δ(y) g'_δ(x)
g_{y,δ}(x, y) = g'_δ(y) g_δ(x)
g_{xx,δ}(x, y) = g_δ(y) g''_δ(x)
g_{xy,δ}(x, y) = g'_δ(y) g'_δ(x)
g_{yy,δ}(x, y) = g''_δ(y) g_δ(x)

where g_δ(t) = exp(−t²/(2δ²)) / (√(2π) δ) is the Gaussian function, x, y are the pixel coordinates, and δ is the standard deviation; the convolutions of the image with the above kernels yield r_x, r_y, r_xx, r_xy, r_yy respectively;
(2) using the results of (1), forming at each pixel the Hessian matrix

H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]

computing its two eigenvalues and the corresponding eigenvectors, taking the eigenvector of the eigenvalue of larger magnitude as the normal vector of the curve at that point, and deciding from the normal vector and the first derivatives r_x, r_y whether the point lies on a curve;
(3) linking the points detected in (2) along the normal directions into curves one pixel wide; these are the detected square grid lines.
Step c comprises:
performing gray-level matching with the formula

C(p, p') = Σ_{i=−n..n} [I(u+i, v) − Ī(u, v)] [I'(u'+i, v) − Ī'(u', v)] / ((2n+1) δ(u, v) δ(u', v)),

with p = (u, v) ∈ I and p' = (u', v) ∈ I',

where I denotes the left image and I' the right image, p and p' are the left and right pixel coordinates, v is the common ordinate of p and p', u and u' are the abscissas of p and p', 2n+1 is the width of the search window centered on p and p', Ī(u, v) is the mean gray level of the window centered on (u, v), and δ(u, v) is the standard deviation over the corresponding window (δ(u', v) being computed analogously on I'), calculated as

δ(u, v) = sqrt( Σ_{i=−n..n} I²(u+i, v) / (2n+1) − Ī(u, v)² ).

If the absolute difference between C(p, p') and 1 is smaller than a set threshold and C(p, p') is the maximum of all matching results, the pixels p and p' are judged highly correlated and the two points are matched; otherwise the search for the match of p in the right image continues. Once the match point is found, the disparity of p is determined.
Advantageous technical effects of the present invention:
The structured light pattern of the present invention carries multiple layers of information and therefore sufficient three-dimensional coding information. At the same time, the hierarchical structured light decoding of the invention determines the depth value of each pixel by successive refinement: features easily detected at a high level guide the matching of features that are hard to detect at a low level. This effectively avoids blind global feature matching, reduces the complexity of the decoding algorithm, improves measurement precision, and exploits the structured light coding information more fully, so that accurate decoding results are obtained. Three-dimensional measurement with the present invention is therefore more accurate than traditional three-dimensional coordinate measurement.
The present invention meets the density requirements of three-dimensional coordinate measurement while improving its precision and speed, and completes the real-time non-contact three-dimensional measurement of the measured scene quickly and accurately.
Description of the drawings
Fig. 1 is a schematic diagram of the three-dimensional measuring device of an embodiment of the present invention;
Fig. 2 is a flowchart of the structured light decoding in an embodiment of the present invention;
Fig. 3 is the structured light pattern in an embodiment of the present invention.
Embodiments
The embodiments of the present invention are described in detail below with reference to the drawings. It should be emphasized that the following description is merely exemplary and is not intended to limit the scope or application of the invention.
Referring to Fig. 1, in some embodiments the hierarchical rapid three-dimensional measuring device based on structured light projection comprises a structured light projection unit 1, binocular image acquisition units 2 and 3, and a data processing unit 4.
The structured light projection unit 1 projects structured light onto the measured scene. Referring to Fig. 3, the pattern comprises a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines.
Preferably, adjacent feature circles are separated by two grid cells both horizontally and vertically, as shown in Fig. 3.
Preferably, the feature circles have the maximum or minimum gray level in the structured light pattern. The feature circles shown in Fig. 3 have the minimum gray level: they are solid black circles.
Preferably, the gray level of each gray-level gradient region increases or decreases monotonically in the horizontal direction, and the direction of this gradient is the same in all grid cells.
The structured light projection unit 1 can be a DLP projector.
The binocular image acquisition units 2 and 3 capture the left and right parallax images of the measured scene; they can be a pair formed by two cameras.
The data processing unit 4 is connected to the structured light projection unit 1 and the binocular image acquisition units 2 and 3. It controls the projection unit 1 to project the structured light, receives the left and right parallax images of the measured scene captured by the acquisition units 2 and 3, and decodes them to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels. Specifically, the data processing unit 4 detects the positions of the feature circles in the two images and matches them; centered on each matched feature circle, it detects and matches the grid lines of the corresponding regions; it performs gray-level matching within the regions enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the disparity of the pixels; and it determines the depth coordinate of each pixel from its disparity.
The data processing unit 4 can be a computer with interfaces compatible with the projector and the cameras.
In other embodiments, the hierarchical rapid three-dimensional measuring method based on structured light projection comprises the following steps.
The structured light projection unit 1 projects structured light onto the measured scene; the pattern comprises a square grid array, with a gray-level gradient region formed in each grid cell and feature circles distributed in a square lattice at the intersections of the grid lines.
The binocular image acquisition units 2 and 3 capture the left and right parallax images of the measured scene and transfer them to the data processing unit. The captured images are the superposition of the actual measured scene and the structured light pattern, the pattern being deformed by the modulation of the three-dimensional shape of the scene.
The data processing unit 4 decodes the received images to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels. Referring to Fig. 2, the decoding comprises the following steps:
step a. detect the positions of the feature circles in the left and right images and match them;
step b. centered on each matched feature circle, detect the grid lines of the corresponding regions in the two images and match them;
step c. perform gray-level matching within the regions of the two images enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the pixel disparity;
step d. determine the depth coordinate of each pixel from its disparity.
Following steps a–d, the data processing unit 4 applies the hierarchical, successively refined structured light decoding method to the captured images: the detected feature circles guide the matching at the grid lines, and the grid-line matching results guide the gray-level disparity matching inside each region, so that the scene disparity is obtained from sparse to dense. Finally, the three-dimensional information of the whole scene is obtained through the triangle relation between disparity and depth.
As shown in Fig. 2, the hierarchical decoding by successive refinement can specifically comprise the following steps.
Step S1: detect the solid black feature circles in the left and right images with a circle detection method, determine their positions, and match the circles between the two images according to the detected positions.
Step S2: centered on each matched feature circle, take in each image a square window whose side is twice the grid-cell side, as shown in Fig. 3; detect the grid lines inside the window, and match the grid lines detected in the left and right images in the horizontal direction according to the positions of the feature circles.
Step S3: in the left and right images, take the regions enclosed by matched adjacent horizontal and vertical grid lines and perform gray-level matching. The grid regions of the structured light are comparatively small, so the gray level within a grid can be assumed to contain no large jumps; the disparity of each point inside a region can therefore be determined by gray-level matching local to the grid, together with the known grid-line matching between the two images.
Step S4: finally, obtain the depth of each pixel from its disparity by the triangle principle.
In a further embodiment, the detailed flow for obtaining the dense three-dimensional coordinates of the scene is as follows.
In step S1, the left and right views are binarized with a relatively large threshold T according to

I_dst(x, y) = 255 if I_src(x, y) > T, and 0 otherwise,

where I_dst(x, y) is the binarization result, I_src(x, y) is the original image, and x, y are the pixel coordinates. Since the feature circles of the structured light in this embodiment have the minimum gray level, their positions in the left and right images are first extracted through the binarization; the circles detected in the two images are then matched according to their positions.
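The binarization and circle localization of step S1 can be sketched as follows. This is an illustrative NumPy sketch, not the patent's implementation: the 255/0 output convention, the helper names, and the centroid-of-blob localization (standing in for a real circle detector) are all assumptions.

```python
import numpy as np

def binarize(img, T):
    """Binary threshold: pixels above T map to 255, others to 0.
    The minimum-gray feature circles then appear as black (0) blobs."""
    return np.where(img > T, 255, 0).astype(np.uint8)

def circle_centroids(binary):
    """Locate candidate feature circles as centroids of connected
    black blobs (4-connected flood fill, an assumed stand-in for a
    real circle detector).  Returns a list of (x, y) centroids."""
    visited = np.zeros(binary.shape, dtype=bool)
    centroids = []
    h, w = binary.shape
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] == 0 and not visited[sy, sx]:
                stack, pts = [(sy, sx)], []
                visited[sy, sx] = True
                while stack:                      # flood fill one blob
                    y, x = stack.pop()
                    pts.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and \
                           binary[ny, nx] == 0 and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pts)
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids
```

Matching the circles between the left and right binarized views then reduces to pairing these centroids by position, as the patent describes.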
In step S2, centered on each feature circle detected in step S1, a square window whose side is twice the grid width, as shown in Fig. 3, is taken, the grid lines inside the window are detected, and the left and right grid lines are matched according to the matching result of the feature circles.
The grid lines can be detected as follows.
(1) Convolve the image with Gaussian-derivative kernels. The kernels are given by:

g_{x,δ}(x, y) = g_δ(y) g'_δ(x)
g_{y,δ}(x, y) = g'_δ(y) g_δ(x)
g_{xx,δ}(x, y) = g_δ(y) g''_δ(x)
g_{xy,δ}(x, y) = g'_δ(y) g'_δ(x)
g_{yy,δ}(x, y) = g''_δ(y) g_δ(x)

where g_δ(t) = exp(−t²/(2δ²)) / (√(2π) δ) is the Gaussian function, x, y are the pixel coordinates, and δ is the standard deviation. The convolutions of the image with the above kernels yield r_x, r_y, r_xx, r_xy, r_yy respectively.
This step convolves the image with derivatives of the Gaussian, which is equivalent to Gaussian low-pass filtering the image and then differentiating it; the Gaussian filters out the noise in the image.
(2) Using the results of (1), form at each pixel the Hessian matrix

H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]

Compute its two eigenvalues and the corresponding eigenvectors. Take the eigenvector of the eigenvalue of larger magnitude as the normal vector of the curve at that point, and decide from the normal vector and the first derivatives r_x, r_y whether the point lies on a curve.
(3) Along the normal directions, link the points detected in (2) into curves one pixel wide; these are the detected square grid lines.
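Steps (1) and (2) amount to ridge detection with Gaussian-derivative filters and a per-pixel Hessian. The NumPy sketch below computes the dominant-eigenvalue ridge score; the kernel radius, the value of δ, the zero-padded `same` convolution, and the omission of the on-curve test with r_x, r_y are illustrative simplifications, not the patent's exact procedure.

```python
import numpy as np

def gauss_kernels(delta, radius):
    """1-D Gaussian g and its first/second derivatives, sampled on
    [-radius, radius]."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * delta**2)) / (np.sqrt(2 * np.pi) * delta)
    g1 = -x / delta**2 * g                        # g'
    g2 = (x**2 / delta**2 - 1) / delta**2 * g     # g''
    return g, g1, g2

def sep_conv(img, ky, kx):
    """Separable convolution: rows with kx (x direction), then
    columns with ky (y direction), zero-padded 'same' size."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kx, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, ky, mode='same'), 0, tmp)

def hessian_ridge_strength(img, delta=1.5, radius=4):
    """Per-pixel eigenvalue of the Hessian [[rxx, rxy], [rxy, ryy]]
    with the larger magnitude; on a dark line the score is a large
    positive value."""
    g, g1, g2 = gauss_kernels(delta, radius)
    rxx = sep_conv(img, g, g2)    # g_xx = g(y) g''(x)
    ryy = sep_conv(img, g2, g)    # g_yy = g''(y) g(x)
    rxy = sep_conv(img, g1, g1)   # g_xy = g'(y) g'(x)
    tr = rxx + ryy
    disc = np.sqrt(((rxx - ryy) / 2)**2 + rxy**2)
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc     # closed-form 2x2 eigenvalues
    return np.where(np.abs(lam1) >= np.abs(lam2), lam1, lam2)
```

On a synthetic image with one dark horizontal line, the score peaks on the line, which is the property the linking of step (3) relies on.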
In step S3, according to the grid lines detected and matched in step S2, the image inside each quadrilateral region formed by matched vertically adjacent and horizontally adjacent grid lines is taken. To reduce computation, points are matched only along one dimension of the region, using the following formula for the gray-level matching:
C(p, p') = Σ_{i=−n..n} [I(u+i, v) − Ī(u, v)] [I'(u'+i, v) − Ī'(u', v)] / ((2n+1) δ(u, v) δ(u', v))

with p = (u, v) ∈ I and p' = (u', v) ∈ I'. Here I denotes the left image and I' the right image, p and p' are the left and right pixel coordinates, v is the common ordinate of p and p', u and u' are their abscissas, 2n+1 is the width of the search window centered on p and p', Ī(u, v) is the mean gray level of the window centered on (u, v), and δ(u, v) is the standard deviation over the corresponding window (δ(u', v) being computed analogously on I'), calculated as

δ(u, v) = sqrt( Σ_{i=−n..n} I²(u+i, v) / (2n+1) − Ī(u, v)² )
If C (p, p ') is less than setting threshold with the absolute value of 1 difference, close to 1, and while being the maximal value of all matching results, judge pixel p and pixel p ' height correlation, and 2 the match is successful.Otherwise, if C (p, p ') is less than setting threshold, approach 0, illustrate that some p are uncorrelated with some p ', 2 do not mate, and need to continue to search the match point of p in right figure, until find match point.After finding match point, can determine the parallax that p is ordered.For example, in practical operation, the match point decision method is as follows, and the match point that is a p as fruit dot p ' should meet following two conditions (1) C (p, p ') simultaneously > 0.7, (2) C (p, p ') is the maximal value of all matching results.
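The correlation score C(p, p') and the two-condition match decision can be sketched as follows; the window half-width n, the candidate search range, and the function names are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np

def ncc_score(I, Ip, u, up, v, n):
    """C(p, p'): zero-mean normalized cross-correlation over the
    (2n+1)-pixel horizontal windows centered on (u, v) and (u', v)."""
    a = I[v, u - n:u + n + 1].astype(float)
    b = Ip[v, up - n:up + n + 1].astype(float)
    denom = (2 * n + 1) * a.std() * b.std()   # (2n+1) * delta(u,v) * delta(u',v)
    if denom == 0:
        return 0.0
    return float(((a - a.mean()) * (b - b.mean())).sum() / denom)

def best_match(I, Ip, u, v, candidates, n=3, thresh=0.7):
    """Return the u' maximizing C(p, p') provided the score exceeds
    the threshold (conditions (1) and (2) above), else None."""
    scores = [(ncc_score(I, Ip, u, up, v, n), up) for up in candidates]
    c, up = max(scores)
    return up if c > thresh else None
```

On a synthetic rectified row pair with a known 2-pixel shift, the search recovers the shifted abscissa, and the disparity follows as u − u'.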
In step S4, the depth of p is obtained from the relation between disparity and depth.
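The patent names only the triangle relation between disparity and depth; for a rectified binocular pair the standard instance of that relation is Z = f·B/d, sketched here with assumed symbols (f: focal length in pixels, B: camera baseline):

```python
def depth_from_disparity(d, f, B):
    """Similar-triangles depth for a rectified stereo pair:
    Z = f * B / d, with f the focal length in pixels, B the baseline,
    and d = u - u' the disparity in pixels.  Z has the units of B."""
    if d <= 0:
        raise ValueError("disparity must be positive")
    return f * B / d
```

For example, with f = 700 px and B = 6 cm, a disparity of 10 px gives Z = 700 * 6 / 10 = 420 cm.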
Embodiments of the invention use a hierarchical structured light design in which features easily detected at a high level guide the matching of features that are hard to detect at a low level, effectively avoiding blind global feature matching, reducing the complexity of the decoding algorithm, and improving the precision of the measurement.
In one specific embodiment, the rapid three-dimensional measuring device comprises a binocular acquisition unit formed by a pair of cameras, a DLP projector, a computer, and a calibration board. The two cameras and the DLP projector are connected to the computer; the connection can be a video interface such as VGA, DVI, or HDMI. The projector projects the structured light designed by the present invention onto the measured scene. The two cameras capture images of the scene and transfer them to the computer, on which a matching frame grabber is installed; the camera interface can be a common one such as USB 3.0, HD-SDI, or 1394b.
The measuring process is as follows:
1) before measurement, use the calibration board to calibrate the parameters of the binocular cameras and the projector;
2) project the structured light pattern through the computer-controlled projector while the binocular cameras synchronously capture the corresponding scene. Specifically, the computer transfers the pattern to the projector and simultaneously triggers the binocular cameras for synchronized capture; the captured images reach the computer through the data lines and the frame grabber, and the computer automatically analyzes the collected binocular images to obtain the three-dimensional information of the measured scene.
The above further describes the present invention with reference to specific preferred embodiments, but the specific implementation of the invention shall not be regarded as limited to these descriptions. Those of ordinary skill in the art can make simple deductions or substitutions without departing from the concept of the invention, and all such variants shall be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A hierarchical rapid three-dimensional measuring device based on structured light projection, characterized in that it comprises:
a structured light projection unit for projecting structured light onto the measured scene, the structured light pattern comprising a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines;
a binocular image acquisition unit for capturing the left and right parallax images of the measured scene; and
a data processing unit connected to the structured light projection unit and the binocular image acquisition unit, the data processing unit controlling the projection unit to project the structured light, receiving the left and right parallax images of the measured scene captured by the acquisition unit, and decoding them to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels, wherein the data processing unit detects the positions of the feature circles in the two images and matches them; centered on each matched feature circle, detects and matches the grid lines of the corresponding regions; performs gray-level matching within the regions enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the disparity of the pixels; and determines the depth coordinate of each pixel from its disparity.
2. The hierarchical rapid three-dimensional measuring device of claim 1, characterized in that the feature circles have the maximum or minimum gray level in the structured light pattern, and adjacent feature circles are separated by two grid cells both horizontally and vertically.
3. The hierarchical rapid three-dimensional measuring device of claim 2, characterized in that the gray level of each gray-level gradient region increases or decreases monotonically in the horizontal direction, and the direction of this gradient is the same in all grid cells.
4. A hierarchical rapid three-dimensional measuring method based on structured light projection, characterized in that it comprises the following steps:
the structured light projection unit projects structured light onto the measured scene, the pattern comprising a square grid array, with a gray-level gradient region formed in each grid cell and a plurality of feature circles distributed in a square lattice at the intersections of the grid lines;
the binocular image acquisition unit captures the left and right parallax images of the measured scene and transfers them to the data processing unit;
the data processing unit decodes the received images to obtain the three-dimensional information of the scene, including the depth coordinates of the pixels, the decoding comprising the following steps:
a. detect the positions of the feature circles in the left and right images and match them;
b. centered on each matched feature circle, detect the grid lines of the corresponding regions in the two images and match them;
c. perform gray-level matching within the regions of the two images enclosed by pairs of matched adjacent horizontal and vertical grid lines, obtaining the pixel disparity;
d. determine the depth coordinate of each pixel from its disparity.
5. measuring method as claimed in claim 4, is characterized in that, described characteristic circle has maximum gray scale or minimal gray in the pattern of described structured light, and the adjacent feature circle is laterally and two grid cells of being separated by vertically.
6. The measuring method according to claim 4, characterized in that the gray level of the gray-gradient region increases or decreases monotonically in the horizontal direction, and the gray-gradient trend is consistent across all grid cells.
7. The measuring method according to any one of claims 4 to 6, characterized in that step a comprises:
binarizing the left and right views with a relatively large threshold T according to
I_dst(x, y) = 1 if I_src(x, y) > T, and I_dst(x, y) = 0 otherwise,
wherein I_dst(x, y) is the binarization result, I_src(x, y) is the original image, and x and y are the horizontal and vertical coordinates;
extracting the positions of the characteristic circles in the horizontal-parallax images from the binarized images, and matching the characteristic circles detected in the two images according to their positions.
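The thresholding and circle-position extraction of claim 7 can be sketched as follows; the helper names and the connected-component approach to locating circle centres are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def binarize(img, T=200):
    """Binarize with a relatively large threshold T, as in claim 7:
    pixels brighter than T become 1, all others 0."""
    return (img > T).astype(np.uint8)

def circle_centroids(binary):
    """Return the centroids of connected bright blobs (candidate
    characteristic circles) via a simple flood-fill labelling."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    centroids = []
    next_label = 0
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not labels[y, x]:
                next_label += 1
                labels[y, x] = next_label
                stack, pts = [(y, x)], []
                while stack:  # 4-connected flood fill
                    cy, cx = stack.pop()
                    pts.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not labels[ny, nx]:
                            labels[ny, nx] = next_label
                            stack.append((ny, nx))
                centroids.append(np.array(pts, dtype=float).mean(axis=0))  # (row, col)
    return centroids
```

Matching then pairs the centroids found in the left and right views by their square-lattice positions.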
8. The measuring method according to any one of claims 4 to 6, characterized in that, in step b, each said region is a square region whose side length is at least twice the side length of a grid cell.
9. The measuring method according to any one of claims 4 to 6, characterized in that, in step b, detecting the grid lines comprises:
(1) convolving the image with the following Gaussian-derivative convolution kernels,
g_{x,δ}(x, y) = g_δ(y) g′_δ(x)
g_{y,δ}(x, y) = g′_δ(y) g_δ(x)
g_{xx,δ}(x, y) = g_δ(y) g″_δ(x)
g_{xy,δ}(x, y) = g′_δ(y) g′_δ(x)
g_{yy,δ}(x, y) = g″_δ(y) g_δ(x)
wherein g_δ(x) = (1/(√(2π)·δ))·exp(−x²/(2δ²)) is the one-dimensional Gaussian function, x and y are the horizontal and vertical coordinates, and δ is the standard deviation; the results of convolving the image with the above kernels are denoted r_x, r_y, r_xx, r_xy and r_yy respectively;
(2) using the results of (1), forming the Hessian matrix at each pixel,
H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]
computing the two eigenvalues and the corresponding eigenvectors of the Hessian matrix, taking the eigenvector corresponding to the larger eigenvalue as the normal vector of the curve at that point, and judging from the normal vector and the first derivatives r_x, r_y whether the point lies on a curve;
(3) linking the curve points detected in (2), along the normal-vector directions, into one-pixel-wide curves, which are the detected square grid lines.
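Steps (1) and (2) amount to a Hessian-based ridge detector. A minimal sketch, using `scipy.ndimage.gaussian_filter` with per-axis derivative orders to realize the Gaussian-derivative convolutions, and a closed-form eigenvalue computation for the 2×2 Hessian (the threshold value and test pattern are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_points(img, delta=2.0, thresh=0.02):
    """Flag candidate curve (ridge/valley) pixels per claim 9:
    convolve with Gaussian derivatives, build the 2x2 Hessian at
    each pixel, and keep pixels where the eigenvalue of larger
    magnitude is significant."""
    # order=(dy, dx) selects the derivative order along each axis
    r_x  = gaussian_filter(img, delta, order=(0, 1))
    r_y  = gaussian_filter(img, delta, order=(1, 0))
    r_xx = gaussian_filter(img, delta, order=(0, 2))
    r_xy = gaussian_filter(img, delta, order=(1, 1))
    r_yy = gaussian_filter(img, delta, order=(2, 0))
    # closed-form eigenvalues of [[r_xx, r_xy], [r_xy, r_yy]]
    tr = r_xx + r_yy
    disc = np.sqrt(((r_xx - r_yy) / 2.0) ** 2 + r_xy ** 2)
    lam1 = tr / 2.0 + disc
    lam2 = tr / 2.0 - disc
    # eigenvalue of larger magnitude -> normal direction of the line
    dominant = np.where(np.abs(lam1) >= np.abs(lam2), lam1, lam2)
    return np.abs(dominant) > thresh
```

The first derivatives r_x, r_y would then be used, as in the claim, to reject points where the response along the normal is not an extremum before linking the survivors into one-pixel-wide curves.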
10. The measuring method according to any one of claims 4 to 6, characterized in that step c comprises:
performing gray-scale matching with the following normalized cross-correlation,
C(p, p′) = Σ_{i=−n..n} [I(u+i, v) − Ī(u, v)]·[I′(u′+i, v) − Ī′(u′, v)] / [(2n+1)·δ(u, v)·δ(u′, v)],
p = (u, v) ∈ I and p′ = (u′, v) ∈ I′,
wherein I denotes the left image and I′ the right image; p and p′ denote pixel coordinates in the left and right images; v is the common vertical coordinate of p and p′; u and u′ are the horizontal coordinates of p and p′; and 2n+1 is the width of the search window centered on p and p′;
Ī(u, v) denotes the average gray level of the rectangular window centered on (u, v), and δ(u, v) denotes the standard deviation over the corresponding window, computed (analogously for each image) as
δ(u, v) = √( Σ_{i=−n..n} I²(u+i, v) / (2n+1) − Ī(u, v)² ).
If the absolute value of the difference between C(p, p′) and 1 is less than a set threshold, and C(p, p′) is the maximum over all matching results, pixel p and pixel p′ are judged to be highly correlated and the two points are matched successfully; otherwise the search for the match of pixel p in the right image continues. Once the match point is found, the parallax of pixel p is determined.
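The correlation C(p, p′) of claim 10 is a zero-mean normalized cross-correlation over a one-row window. A minimal one-scanline sketch (the function name and candidate-list interface are illustrative; normalizing by the window norms is algebraically equivalent to the (2n+1)·δ·δ form above):

```python
import numpy as np

def ncc_1d(I, Ip, p, u_candidates, n=4):
    """Compare the (2n+1)-wide window around p=(u,v) in the left
    image I with windows around each candidate u' on the same row v
    of the right image Ip; return the best u' and its score C."""
    u, v = p
    a = I[v, u - n:u + n + 1].astype(float)
    a = a - a.mean()  # subtract the window average gray level
    best_u, best_c = None, -np.inf
    for up in u_candidates:
        b = Ip[v, up - n:up + n + 1].astype(float)
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        if denom == 0:
            continue  # flat window: correlation undefined
        c = (a * b).sum() / denom  # C(p, p') in [-1, 1]
        if c > best_c:
            best_u, best_c = up, c
    return best_u, best_c
```

The parallax of p then follows as u − u′ for the accepted match.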
CN201310426151.9A 2013-09-17 2013-09-17 The hierarchical quick three-dimensional measurement mechanism of structure based light projection and measuring method Active CN103438834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310426151.9A CN103438834B (en) 2013-09-17 2013-09-17 The hierarchical quick three-dimensional measurement mechanism of structure based light projection and measuring method

Publications (2)

Publication Number Publication Date
CN103438834A true CN103438834A (en) 2013-12-11
CN103438834B CN103438834B (en) 2015-10-28

Family

ID=49692530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310426151.9A Active CN103438834B (en) 2013-09-17 2013-09-17 The hierarchical quick three-dimensional measurement mechanism of structure based light projection and measuring method

Country Status (1)

Country Link
CN (1) CN103438834B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103900494A (en) * 2014-03-31 2014-07-02 中国科学院上海光学精密机械研究所 Homologous point rapid matching method used for binocular vision three-dimensional measurement
CN106225676A (en) * 2016-09-05 2016-12-14 凌云光技术集团有限责任公司 Method for three-dimensional measurement, Apparatus and system
CN106840251A (en) * 2015-12-07 2017-06-13 中国电力科学研究院 A kind of 3 D scanning system for the detection of low-voltage current mutual inductor outward appearance
CN107992868A (en) * 2017-11-15 2018-05-04 辽宁警察学院 A kind of High Precision Stereo footprint Quick Acquisition method
CN108088390A (en) * 2017-12-13 2018-05-29 浙江工业大学 Optical losses three-dimensional coordinate acquisition methods based on double eye line structure light in a kind of welding detection
CN108303038A (en) * 2017-12-21 2018-07-20 天津大学 Reflection-type surface shape measurement method and device based on two-dimension optical dot matrix
CN109274871A (en) * 2018-09-27 2019-01-25 维沃移动通信有限公司 A kind of image imaging method and device of mobile terminal
CN111080689A (en) * 2018-10-22 2020-04-28 杭州海康威视数字技术股份有限公司 Method and device for determining face depth map
CN114252027A (en) * 2021-12-22 2022-03-29 深圳市响西科技有限公司 Continuous playing method of structured light stripe pattern and 3D structured light machine

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6751344B1 (en) * 1999-05-28 2004-06-15 Champion Orthotic Investments, Inc. Enhanced projector system for machine vision
CN1865847A (en) * 2005-04-21 2006-11-22 Gom光学测量技术有限公司 Projector for a system for three dimensional optical object measurement
CN101960253A (en) * 2008-02-26 2011-01-26 株式会社高永科技 Apparatus and method for measuring a three-dimensional shape
KR20110016770A (en) * 2009-08-12 2011-02-18 지스캔(주) A three-dimensional image measuring apparatus
CN102538708A (en) * 2011-12-23 2012-07-04 北京理工大学 Measurement system for three-dimensional shape of optional surface

Also Published As

Publication number Publication date
CN103438834B (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN103438834B (en) The hierarchical quick three-dimensional measurement mechanism of structure based light projection and measuring method
CN110285793B (en) Intelligent vehicle track measuring method based on binocular stereo vision system
CN109791696B (en) Method, device and method for locating event cameras for 3D reconstruction of a scene
CN105225482B (en) Vehicle detecting system and method based on binocular stereo vision
CN101697233B (en) Structured light-based three-dimensional object surface reconstruction method
CN103400366B (en) Based on the dynamic scene depth acquisition methods of fringe structure light
CN102183524B (en) Double-CCD (Charge Coupled Device) detecting method and system for apparent defect assessment of civil engineering structure
CN103900494B (en) For the homologous points fast matching method of binocular vision 3 D measurement
CN102901444B (en) Method for detecting component size based on matching pursuit (MP) wavelet filtering and detecting system thereof
Kim et al. Semiautomatic reconstruction of building height and footprints from single satellite images
CN105043350A (en) Binocular vision measuring method
CN105046743A (en) Super-high-resolution three dimensional reconstruction method based on global variation technology
CN102982334B (en) The sparse disparities acquisition methods of based target edge feature and grey similarity
CN104408725A (en) Target recapture system and method based on TLD optimization algorithm
CN104361314A (en) Method and device for positioning power transformation equipment on basis of infrared and visible image fusion
CN104574393A (en) Three-dimensional pavement crack image generation system and method
CN102903101B (en) Method for carrying out water-surface data acquisition and reconstruction by using multiple cameras
CN103822581B (en) A kind of irregularly shaped object volume measuring method based on compressed sensing
CN103996220A (en) Three-dimensional reconstruction method and system in intelligent transportation
CN106996748A (en) Wheel diameter measuring method based on binocular vision
CN106128121A (en) Vehicle queue length fast algorithm of detecting based on Local Features Analysis
CN105913013A (en) Binocular vision face recognition algorithm
CN104034269A (en) Monocular vision measuring method and monocular vision measuring device
CN103308000B (en) Based on the curve object measuring method of binocular vision
CN107374638A (en) A kind of height measuring system and method based on binocular vision module

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant