CN110645920B - Automatic extraction method and system for effective points of grating projection profile

Automatic extraction method and system for effective points of grating projection profile

Info

Publication number
CN110645920B
CN110645920B
Authority
CN
China
Prior art keywords
points
centroid
clustering
target
measured object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910915664.3A
Other languages
Chinese (zh)
Other versions
CN110645920A (en)
Inventor
马峻
陈宏�
陈寿宏
郭玲
徐翠锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN201910915664.3A priority Critical patent/CN110645920B/en
Publication of CN110645920A publication Critical patent/CN110645920A/en
Application granted granted Critical
Publication of CN110645920B publication Critical patent/CN110645920B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2527Projection by scanning of the object with phase change by in-plane movement of the patern
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Probability & Statistics with Applications (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method and a system for automatically extracting effective points of a grating projection profile. The method comprises the following steps: encoding on a PC to generate a sine stripe image; projecting the sine stripe image onto the surface of the measured object with a projector; collecting a plurality of modulation images of the measured object with a CMOS camera; and unwrapping the phase by software programming and reconstructing the three-dimensional profile of the measured object. After the modulation image is obtained, the pixel points are clustered into three classes, namely background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively. During clustering, a threshold tm is defined; when, at the t-th iteration, the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm, that centroid is considered to be approaching convergence. If the target point centroid approaches convergence, points greater than c3t are removed from the data list; if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.

Description

Automatic extraction method and system for effective points of grating projection profile
Technical Field
The invention relates to the technical field of grating projection measurement, and in particular to a method and a system for automatically extracting effective points of a grating projection profile.
Background
In grating projection technology, the phase-shift method is the most common three-dimensional measurement method. A projector projects grating stripes onto the measured object, a camera captures the grating pattern on the measured object, and a computer analyzes the pattern and reconstructs the surface geometry of the measured object from the analysis result and the parameters of the system. For some measured objects, the grating stripes cannot completely cover the surface of the measured object, so a shadow area exists in the picture captured by the camera. The shadow area carries no information about the surface of the measured object, and the three-dimensional reconstruction result of that area is erroneous. The shadow area therefore needs to be removed; besides the shadow area, the background points in the grating projection system also need to be removed, so that only the effective area of the measured object is retained and the three-dimensional reconstruction result is optimal.
The prior similar implementation schemes are as follows:
Skydan uses multiple projectors to illuminate the measured object from different viewing angles and thus obtains shadow-free three-dimensional reconstruction data.
Zhang et al. remove random noise with a Gaussian filter and use the monotonicity of the phase to identify phase null points.
Xiao proposes a method for separating the target from the background that effectively removes invalid points, but it requires first acquiring a modulation image of the background plate alone and then placing the target into the grating projection system to acquire a modulation image of the target together with the background plate.
Wang proposes dividing the pixel points of the modulation map into three classes, namely target points, boundary points and background points, clustering them with the K-means method, and, after clustering, deciding whether each boundary point belongs to the target or the background, so that the target points can be extracted automatically and effectively.
However, the above methods all have some problems:
the Skydan approach requires multiple cameras, additional cost in hardware, making it less popular
The Zhang method works well only on objects with flat surfaces, and when the object has a surface that changes rapidly, the gaussian filter treats the surface as noise
The method of Xiao requires cumbersome operations
The Wang method has the problems that clustering iteration time is too long, target points exist in background points and the like.
Disclosure of Invention
In view of the above, the present invention provides a method and a system for automatically extracting effective points of a grating projection profile, which reduce the iterative convergence time of the conventional K-means method and obtain more target points by re-segmenting the background points with the Otsu method, so as to extract the effective points of the measured object quickly, automatically, efficiently, simply, conveniently and economically.
The invention solves the technical problems by the following technical means:
a method for automatically extracting effective points of a grating projection profile comprises the following steps:
coding on a PC to generate a sine stripe image;
the projector projects the sine stripe image to the surface of the measured object;
the CMOS camera collects a plurality of modulation images of the measured object;
unwrapping the phase by software programming, and reconstructing a three-dimensional profile of the measured object; wherein,
after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively;
during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence;
if the target point centroid approaches convergence, points greater than c3t are removed from the data list;
if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.
Further, the modulation image is obtained according to the following formula:
M(x, y) = (2/N) · sqrt{ [ Σ_{n=1..N} I_n(x, y) · sin(2πn/N) ]² + [ Σ_{n=1..N} I_n(x, y) · cos(2πn/N) ]² }
where n is the nth phase shift, N represents the total number of phase shifts, and In is the fringe intensity map of the nth phase shift.
Further, the three initial centroids are determined according to the following formula:
c2 = mean of all modulation values M(x, y) in the modulation map,  c3 = mean{ M(x, y) : M(x, y) > c2 },  c1 = mean{ M(x, y) : M(x, y) < c2 }
wherein the boundary point initial centroid c2 is the average modulation value of the entire modulation map, the target point initial centroid c3 is the average of all points greater than c2, and the background point initial centroid c1 is the average of all points less than the centroid c2.
Further, after the clustering is completed, a boundary point can be classified as a target point if it satisfies one of the following two conditions:
first, the point is directly connected to the target point;
second, the gradient at the point is less than 2 times the maximum gradient among the target points.
Further, after clustering is completed, the background points in the modulation image are segmented again using the Otsu method, and the part of the segmentation result above the threshold is taken as target points.
Further, the CMOS camera collects four modulated images of the measured object.
On the other hand, the invention also provides an automatic extraction system for effective points of a grating projection profile, which comprises the following components:
the PC is used for encoding to generate a sine stripe image;
the projector is used for projecting the sine stripe image to the surface of the measured object; and
the CMOS camera is used for acquiring a plurality of modulation images of the measured object; wherein,
the PC is also used for unwrapping the phase through software programming and reconstructing the three-dimensional profile of the measured object; after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively; during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence; if the target point centroid approaches convergence, points greater than c3t are removed from the data list; if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.
Furthermore, the CMOS camera collects four modulation images of the measured object.
The invention has the beneficial effects that: the invention improves the conventional K-means method and reduces the clustering iteration convergence time; the result obtained by the improved method is consistent with that of the conventional method but requires less running time. For a 1024 × 1280 image, the running time of the conventional K-means is 32.214 seconds, while that of the improved K-means of the present invention is 18.741 seconds. Furthermore, the background points produced by the conventional K-means still contain misclassified target points that are left unprocessed; the present invention applies the Otsu method to the background points a second time and thereby obtains more effective points.
Drawings
Fig. 1 is a schematic structural diagram of an automatic extraction system for effective points of a grating projection profile according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for automatically extracting valid points of a grating projection profile according to an embodiment of the present invention;
fig. 3 is a diagram of a moving process of an initial centroid of a boundary point, an initial centroid of a target point, and an initial centroid of a background point in the method for automatically extracting effective points of a grating projection profile provided in the embodiment of the present invention;
fig. 4 is a modulation image provided by an embodiment of the present invention for verifying the effect of the present invention;
FIG. 5 is a graph of the results of the manual segmentation applied to FIG. 4;
FIG. 6 is a graph showing the results of the conventional K-means segmentation applied to FIG. 4;
FIG. 7 is a graph comparing the differences between FIG. 6 and FIG. 5;
FIG. 8 is a graph of the segmentation results of FIG. 4 using a method according to an embodiment of the present invention;
fig. 9 is a graph comparing the difference between fig. 8 and fig. 5.
Detailed Description
The invention will be described in detail below with reference to the following figures and specific examples:
as shown in fig. 2, the method for automatically extracting effective points of a grating projection profile of the present invention includes the following steps:
coding on a PC to generate a sine stripe image;
the projector projects the sine stripe image to the surface of the measured object;
the CMOS camera collects a plurality of modulation images of the measured object;
unwrapping the phase by software programming, and reconstructing a three-dimensional profile of the measured object; wherein,
after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively;
during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence;
if the target point centroid approaches convergence, points greater than c3t are removed from the data list;
if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.
Specifically, the CMOS camera collects four modulation images of the measured object. The invention removes invalid points using the modulation image of the measured object, which is obtained as shown in formula (1):
M(x, y) = (2/N) · sqrt{ [ Σ_{n=1..N} I_n(x, y) · sin(2πn/N) ]² + [ Σ_{n=1..N} I_n(x, y) · cos(2πn/N) ]² }   (1)
where n is the nth phase shift, N represents the total number of phase shifts, and In is the fringe intensity map of the nth phase shift.
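As an illustration only, the following Python/NumPy sketch generates synthetic N-step phase-shifted fringe images and computes the modulation map from them according to formula (1). The fringe period, offset A, amplitude B and image size are assumptions made for the sketch and are not values specified by the invention.

```python
import numpy as np

def make_fringes(height, width, n_steps=4, period=32.0, a=128.0, b=100.0):
    # Synthetic N-step phase-shifted fringes: In = A + B*cos(2*pi*x/period + 2*pi*n/N).
    # Period, A and B are illustrative assumptions, not values from the invention.
    x = np.arange(width, dtype=np.float64)
    fringes = []
    for n in range(1, n_steps + 1):
        row = a + b * np.cos(2 * np.pi * x / period + 2 * np.pi * n / n_steps)
        fringes.append(np.tile(row, (height, 1)))
    return fringes

def modulation_map(images):
    # Modulation map per formula (1):
    # M = (2/N) * sqrt((sum_n In*sin(2*pi*n/N))**2 + (sum_n In*cos(2*pi*n/N))**2)
    n_total = len(images)
    s = np.zeros_like(images[0], dtype=np.float64)
    c = np.zeros_like(images[0], dtype=np.float64)
    for n, img in enumerate(images, start=1):
        s += img * np.sin(2 * np.pi * n / n_total)
        c += img * np.cos(2 * np.pi * n / n_total)
    return (2.0 / n_total) * np.sqrt(s ** 2 + c ** 2)

if __name__ == "__main__":
    imgs = make_fringes(1024, 1280)   # stand-in for the four images captured by the CMOS camera
    M = modulation_map(imgs)
    print(M.shape, float(M.min()), float(M.max()))
```

In the actual system the four images captured by the CMOS camera would take the place of the synthetic fringes.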
After the modulation map is acquired, clustering is performed using the improved K-means algorithm of the present invention. The invention clusters the pixel points into three classes (K = 3): background points, boundary points and target points. The initial centroid c2 of the boundary points is the average modulation value of the entire modulation map, the initial centroid c3 of the target points is the average of all points greater than c2, and the initial centroid c1 of the background points is the average of all points smaller than c2. The determination of the three initial centroids is shown in equation (2):
c2 = mean of all modulation values M(x, y) in the modulation map,  c3 = mean{ M(x, y) : M(x, y) > c2 },  c1 = mean{ M(x, y) : M(x, y) < c2 }   (2)
the clustering is started after the centroid is set, and the innovation points of the invention are as follows: in the clustering process, a threshold value tm is defined, and when the difference value between the centroid values of two iterations before and after the centroid is smaller than the threshold value tm in the t-th clustering, the centroid at the moment can be considered to approach convergence. If the centroid of the target point S3 approaches convergence, points greater than c3t are removed from the data list. If the centroid of the background point S2 is close to convergence, the points smaller than c1t are removed from the data list, so that the number of clustering points can be reduced, and the clustering iteration time is shortened. The process of moving the respective centroids is shown in fig. 3.
The clustering result of this method is consistent with that of the traditional K-means, but the required time is shorter. For the 1024 × 1280 image of fig. 4, the clustering results are shown in table 1:
table 1 difference between the algorithm of the present invention and the conventional algorithm
Figure BDA0002216053360000062
After clustering is completed, the pixel points are divided into target points, background points and boundary points. A boundary point can be classified as a target point if it satisfies one of the following two conditions: first, the point is directly connected to a target point; second, the gradient at the point is less than 2 times the maximum gradient among the target points. In order to obtain more target points, the eight neighborhood points of each newly added target point are also classified as target points.
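The boundary-point refinement described above can be sketched as follows, assuming the label map produced by the clustering step (0 = background, 1 = boundary, 2 = target) and taking the gradient on the modulation map; interpreting "directly connected" as 8-neighbourhood adjacency and using SciPy for the dilation are assumptions of this sketch.

```python
import numpy as np
from scipy import ndimage

def refine_boundary_points(mod, labels):
    # Reclassify boundary points (label 1) as target points (label 2) using the two
    # conditions described above; "directly connected" is interpreted here as
    # 8-neighbourhood adjacency, which is an assumption of this sketch.
    target = labels == 2
    boundary = labels == 1
    gy, gx = np.gradient(mod.astype(np.float64))   # gradient of the modulation map
    grad = np.hypot(gx, gy)
    eight = np.ones((3, 3), dtype=bool)
    # Condition 1: the boundary point is directly connected to a target point.
    cond1 = boundary & ndimage.binary_dilation(target, structure=eight)
    # Condition 2: the gradient at the point is below 2x the maximum gradient of the targets.
    cond2 = boundary & (grad < 2.0 * grad[target].max())
    new_target = target | cond1 | cond2
    # The eight neighbours of every newly added target point also become target points.
    added = new_target & ~target
    new_target |= ndimage.binary_dilation(added, structure=eight)
    refined = labels.copy()
    refined[new_target] = 2
    return refined
```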
Target points still remain among the background points after clustering. Fig. 5 is the segmentation result obtained by manually selecting a threshold, fig. 6 is the clustering result, and fig. 7 is the difference between fig. 5 and fig. 6.
The points of the right ear and cheek are background points in fig. 6 but target points in fig. 5. Their modulation values were observed to be 12.7 to 14.5, while the modulation values of the true background points were 0.5 to 3.5. Therefore, the background points can be segmented again using the Otsu method, and the part of the segmentation result above the threshold is taken as target points. Fig. 8 shows the result obtained by the method of the present invention, and fig. 9 shows the difference between the method of the present invention and the manual segmentation. Clearly, the difference from the manual segmentation is smaller and the result is better than that of the conventional K-means: the manual segmentation yields 390703 effective points, 29.80 percent of the total; the traditional K-means clustering yields 363461 effective points, 27.75 percent of the total; and the method of the present invention yields 391450 effective points, 29.86 percent of the total.
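A sketch of this secondary Otsu segmentation of the background class is given below; the use of scikit-image's threshold_otsu and the label codes carried over from the sketches above are assumptions, not part of the patented method.

```python
import numpy as np
from skimage.filters import threshold_otsu

def rescue_targets_from_background(mod, labels):
    # Re-segment the background class (label 0) with the Otsu method and move the part
    # above the Otsu threshold to the target class (label 2).
    background = labels == 0
    t = threshold_otsu(mod[background])      # Otsu threshold over background modulation values
    refined = labels.copy()
    refined[background & (mod > t)] = 2      # the high-modulation side becomes target points
    return refined
```

Applied to the clustering result of fig. 4, this is the step that would recover the misclassified points in the right-ear and cheek regions as target points.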
On the other hand, as shown in fig. 1, an embodiment of the present invention further provides an automatic extraction system for effective points of a grating projection profile, where the system includes:
the PC is used for encoding to generate a sine stripe image;
the projector is used for projecting the sine stripe image to the surface of the measured object; and
the CMOS camera is used for acquiring a plurality of modulation images of the measured object; wherein,
the PC is also used for unwrapping the phase through software programming and reconstructing the three-dimensional profile of the measured object; after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively; during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence; if the target point centroid approaches convergence, points greater than c3t are removed from the data list; if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.
Specifically, the CMOS camera acquires four modulated images of the object to be measured.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention as defined in the appended claims. The techniques, shapes, and configurations not described in detail in the present invention are all known techniques.

Claims (7)

1. A method for automatically extracting effective points of a grating projection profile is characterized by comprising the following steps:
coding on a PC to generate a sine stripe image;
the projector projects the sine stripe image to the surface of the measured object;
the CMOS camera collects a plurality of modulation images of the measured object;
unwrapping the phase by software programming, and reconstructing a three-dimensional profile of the measured object; wherein,
after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively;
during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence;
if the target point centroid approaches convergence, points greater than c3t are removed from the data list;
if the background point centroid approaches convergence, points smaller than c1t are removed from the data list;
the three initial centroids are determined according to the following formula:
c2 = mean of all modulation values M(x, y) in the modulation map,  c3 = mean{ M(x, y) : M(x, y) > c2 },  c1 = mean{ M(x, y) : M(x, y) < c2 }
wherein the boundary point initial centroid c2 is the average modulation value of the entire modulation map, the target point initial centroid c3 is the average of all points greater than c2, and the background point initial centroid c1 is the average of all points less than the centroid c2.
2. The method for automatically extracting the effective points of the projection profile of the grating as claimed in claim 1, wherein: the modulation image is obtained according to the following formula:
M = (2/N) · sqrt{ [ Σ_{n=1..N} I_n · sin(2πn/N) ]² + [ Σ_{n=1..N} I_n · cos(2πn/N) ]² }
wherein n is the nth phase shift, N represents the total number of phase shifts, In is the fringe intensity map of the nth phase shift, and M is the modulation image.
3. The method for automatically extracting the effective points of the projection profile of the grating as claimed in claim 2, wherein: after the clustering is completed, a boundary point can be classified as a target point if it satisfies one of the following two conditions:
first, the point is directly connected to the target point;
second, the gradient at the point is less than 2 times the maximum gradient among the target points.
4. The method for automatically extracting the effective points of the projection profile of the grating as claimed in claim 3, wherein: the background points of the modulation image after clustering is completed are segmented again using the Otsu method, and the part of the segmentation result above the threshold is taken as target points.
5. The method for automatically extracting the effective points of the projection profile of the grating as claimed in claim 4, wherein: the CMOS camera collects four modulation images of the measured object.
6. A system for use in the method for automatically extracting effective points of a grating projection profile according to any one of claims 1 to 5, characterized by comprising: the PC is used for encoding to generate a sine stripe image;
the projector is used for projecting the sine stripe image to the surface of the measured object; and
the CMOS camera is used for acquiring a plurality of modulation images of the measured object; wherein,
the PC is also used for unwrapping the phase through software programming and reconstructing the three-dimensional profile of the measured object; after the modulation image is obtained, the pixel points are clustered into three classes of background points, boundary points and target points, and the initial centroid of the boundary points, the initial centroid of the target points and the initial centroid of the background points are determined respectively; during clustering, a threshold tm is defined, and when the difference between a centroid's values in two consecutive iterations is smaller than the threshold tm at the t-th iteration, that centroid is considered to be approaching convergence; if the target point centroid approaches convergence, points greater than c3t are removed from the data list; if the background point centroid approaches convergence, points smaller than c1t are removed from the data list.
7. The system according to claim 6, wherein: the CMOS camera acquires four modulation images of the measured object.
CN201910915664.3A 2019-09-26 2019-09-26 Automatic extraction method and system for effective points of grating projection profile Active CN110645920B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910915664.3A CN110645920B (en) 2019-09-26 2019-09-26 Automatic extraction method and system for effective points of grating projection profile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910915664.3A CN110645920B (en) 2019-09-26 2019-09-26 Automatic extraction method and system for effective points of grating projection profile

Publications (2)

Publication Number Publication Date
CN110645920A CN110645920A (en) 2020-01-03
CN110645920B true CN110645920B (en) 2021-04-27

Family

ID=68992728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910915664.3A Active CN110645920B (en) 2019-09-26 2019-09-26 Automatic extraction method and system for effective points of grating projection profile

Country Status (1)

Country Link
CN (1) CN110645920B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2327894A1 (en) * 2000-12-07 2002-06-07 Clearview Geophysics Inc. Method and system for complete 3d object and area digitizing
CN100520285C (en) * 2006-07-13 2009-07-29 黑龙江科技学院 Vision measuring method for projecting multiple frequency grating object surface tri-dimensional profile
CN102607466B (en) * 2012-03-29 2014-10-01 天津大学 Grating projection rapid non-contact measurement method and device for high-reflectance free-form curved-surface parts
CN104966065B (en) * 2015-06-23 2018-11-09 电子科技大学 target identification method and device
CN109489584B (en) * 2018-12-03 2021-02-26 大连维德集成电路有限公司 Tunnel clearance detection system and tunnel clearance identification method based on 3D technology

Also Published As

Publication number Publication date
CN110645920A (en) 2020-01-03


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant