CN118212410A - Industrial scene part instance segmentation method based on improved Euclidean clustering - Google Patents


Info

Publication number
CN118212410A
Authority
CN
China
Prior art keywords
point
point cloud
clustering
dimensional
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410374446.4A
Other languages
Chinese (zh)
Inventor
余洪山 (Yu Hongshan)
赵嘉荣 (Zhao Jiarong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN202410374446.4A priority Critical patent/CN118212410A/en
Publication of CN118212410A publication Critical patent/CN118212410A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/30 - Noise filtering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V10/763 - Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 - Recognition of objects for industrial automation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an industrial scene part instance segmentation method based on improved Euclidean clustering, which comprises the following steps: first, a real industrial scene point cloud that is acquired by a vision system and comprises a plurality of workpieces is preprocessed to remove the background working plane, background redundant points, outliers and workpiece contact point clouds; then, instance segmentation clustering in both two-dimensional and three-dimensional Euclidean distance is performed on the preprocessed point cloud file with a Euclidean clustering algorithm, and the two clustering results are fused, using the point cloud position relations to complete instance-level position division and segment out single workpiece point clouds; finally, the removed contact point clouds are dynamically compensated back to the segmented workpiece point clouds to complete workpiece instance segmentation in the industrial scene. The invention designs a highly stable, real-time instance segmentation method that requires no large amount of training data, and solves the urgent technical problem, in the field of industrial robots, of segmenting a target scene part stack into a series of single workpieces.

Description

Industrial scene part instance segmentation method based on improved Euclidean clustering
Technical Field
The invention relates to an industrial scene part instance segmentation method based on improved Euclidean clustering, and belongs to the technical field of computer vision.
Background
In recent years, the Chinese manufacturing industry has undergone rapid development. To promote its upgrading and transformation, the goal is to push manufacturing toward high-end and intelligent directions, strengthen technological innovation and independent research and development capability, and promote the deep integration of industrialization and informatization. In this context, the combination of computer vision and industrial production has become an important research area, especially robotic grasping based on visual localization. Using vision technology, multiple scattered targets in an industrial scene can be accurately detected and localized, enabling intelligent grasping by a robotic arm. The three key tasks in visually localized robotic grasping are target localization, target pose estimation and grasp estimation, among which target localization has been a research hotspot in recent years.
Specifically, the target localization tasks in visual localization, including unclassified target localization, target detection and target instance segmentation, provide the regions of targets in the input data. Unclassified target localization outputs only potential target regions without knowing their categories; target detection provides bounding boxes of target objects together with their categories; target instance segmentation further provides pixel-level or point-level regions of target objects along with their categories. Instance segmentation is one of the most challenging tasks in the robotic grasping pipeline: it must not only distinguish different individuals of the same category, but also provide a pixel/point-cloud-level segmentation mask, i.e. label each pixel or point of every object. This is equivalent to combining semantic segmentation with object detection, and it greatly increases the difficulty of algorithm research.
Deep learning has been very successful in instance segmentation. Using a deep neural network, this approach can automatically learn and predict segmentation results at the object boundary and pixel level in an image. Its advantage is that it can handle complex scenes and multiple object instances and, with sufficient training, achieve high accuracy. However, instance segmentation algorithms based on deep learning rely on extensive annotated data to train their network parameters, creating the dilemma that there is only as much intelligence as there is manual annotation. For industrial target data acquisition in particular, the labeling cost of real-scene training samples is high, which further aggravates this dilemma.
Disclosure of Invention
The technical problem solved by the invention is as follows: aiming at the high training cost of instance segmentation algorithms that use deep neural networks, an industrial scene part instance segmentation method based on improved Euclidean clustering is provided.
The invention is realized by adopting the following technical scheme:
An industrial scene part instance segmentation method based on improved Euclidean clustering comprises the following steps:
S1, preprocessing a real industrial scene point cloud which is acquired by a vision system and comprises a plurality of workpieces, removing the redundant-point-set working plane, pseudo edges, isolated noise points and workpiece contact edge point clouds from the real industrial scene point cloud image;
S2, performing instance segmentation clustering on the preprocessed point cloud file with a Euclidean clustering algorithm, in two-dimensional and three-dimensional Euclidean distance respectively, and fusing the two clustering results, using the point cloud position relation to complete instance-level position division and segment out single workpiece point clouds;
S3, dynamically compensating the workpiece contact edge point clouds removed in step S1 back to the segmented workpiece point clouds, completing workpiece instance segmentation in the industrial scene.
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, step S1 further comprises the following sub-steps:
S11, acquiring real industrial scene point cloud images of N workpieces with a binocular structured light three-dimensional imaging system;
S12, fitting the plane in the scene with the RANSAC method, and eliminating the redundant-point-set working plane of the scene point cloud image acquired in sub-step S11 using the fitted plane parameters and the centroid coordinates of the plane point set;
S13, filtering the outliers of the scene point cloud image obtained in sub-step S12 with a statistical method, removing the pseudo edges and isolated noise points among the outliers of the scene point cloud image;
S14, calculating the surface normal vectors of the scene point cloud processed in sub-step S13 based on principal component analysis, and performing consistency adjustment on the normal vectors so that all normal vectors point outward from the surface;
S15, performing edge-point rejection optimization on the scene point cloud filtered in sub-step S13 using the scene point cloud normal vectors obtained in sub-step S14, distinguishing contact edge points from non-contact edge points, and removing the contact edge point clouds from the real industrial scene point cloud image of the workpieces.
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S12 is as follows:

First, a number of candidate point clouds $p_1, \dots, p_k$ with sampling base number k > 3 lying on the same plane are selected in the point cloud space, satisfying the plane equation

$\theta_1 x_{p_n} + \theta_2 y_{p_n} + \theta_3 z_{p_n} + \theta_4 = 0, \quad n = 1, \dots, k,$

where θ1, θ2, θ3, θ4 are the four parameters of the plane equation and $(x_{p_n}, y_{p_n}, z_{p_n})$ are the three-dimensional coordinates of candidate point cloud $p_n$ in the point cloud space. This is converted into matrix form:

$A\theta = 0, \quad A = \begin{bmatrix} x_{p_1} & y_{p_1} & z_{p_1} & 1 \\ \vdots & \vdots & \vdots & \vdots \\ x_{p_k} & y_{p_k} & z_{p_k} & 1 \end{bmatrix}, \quad \theta = (\theta_1, \theta_2, \theta_3, \theta_4)^T,$

where θ is the plane fitting parameter vector and A is the plane fitting matrix. The fitted plane parameter vector θ* is calculated as the constrained least-squares solution

$\theta^* = \arg\min_{\|\theta\|=1} \|A\theta\|^2.$

Error calculation is then performed between the fitted plane parameter model and the remaining data points in the point cloud space. The fitting error of the plane parameter model is defined as the distance between a point p and the affine space $\mathcal{H}_{\theta^*}$, where

$\mathcal{H}_{\theta^*} = \{ (x, y, z) \in \mathbb{R}^3 \mid \theta_1 x + \theta_2 y + \theta_3 z + \theta_4 = 0 \}.$

The orthogonal projection of each point p onto $\mathcal{H}_{\theta^*}$ is solved by the Lagrange multiplier method, and the fitting error of p on $\mathcal{H}_{\theta^*}$, i.e. the orthogonal projection squared distance, is found according to

$d^2(p, \mathcal{H}_{\theta^*}) = \frac{(\theta_1 x_p + \theta_2 y_p + \theta_3 z_p + \theta_4)^2}{\theta_1^2 + \theta_2^2 + \theta_3^2}.$

The orthogonal projection squared distance is compared with a preset error threshold μ; points whose distance is smaller than μ are regarded as in-plane points. Each point cloud is traversed and the number of all in-plane points under θ* is recorded.

The plane fitting parameter vector θ* is then updated and the above process repeated; if the new in-plane point count is larger than the current maximum in-plane point count, the updated plane equation parameters are retained, until the set maximum number of iterations is reached, and the point cloud plane corresponding to the finally retained plane equation parameters is kept.

Finally, the coordinates (σx, σy, σz) of the centroid σ of that point cloud plane are calculated, and all point clouds whose Z coordinate is smaller than σz + Zi are deleted directly, Zi being a small preset fluctuation value.
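As an illustrative sketch, not the patent's implementation, the RANSAC plane fitting of sub-step S12 can be prototyped in Python with NumPy. The function name `ransac_plane` and all parameter defaults are assumptions; the sample size of 3 follows the embodiment, and the in-plane test uses the orthogonal projection squared distance against the error threshold μ:

```python
import numpy as np

def ransac_plane(points, mu=1e-4, iters=200, seed=0):
    """RANSAC plane fit for theta1*x + theta2*y + theta3*z + theta4 = 0.

    Returns the best parameter vector theta (unit normal plus offset) and
    its in-plane point count.
    """
    rng = np.random.default_rng(seed)
    homo = np.hstack([points, np.ones((len(points), 1))])  # plane fitting matrix A
    best_theta, best_count = None, 0
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-12:          # degenerate (collinear) sample
            continue
        n /= np.linalg.norm(n)
        theta = np.append(n, -n @ sample[0])   # [theta1, theta2, theta3, theta4]
        d2 = (homo @ theta) ** 2               # squared distance (unit normal)
        count = int((d2 < mu).sum())
        if count > best_count:
            best_theta, best_count = theta, count
    return best_theta, best_count
```

Following the sub-step, the points at or below the centroid of the retained plane (z smaller than σz + Zi) would then be deleted from the scene.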
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S13 is as follows:

A maximum mean-distance threshold τ over N neighboring points is defined for the scene point cloud obtained in sub-step S12. For the point cloud $p_i \in P$ with index i, where P denotes the whole point cloud space, the N nearest neighbor point clouds $M_n$ (n = 1, ..., N) are searched, the Euclidean distance between each neighbor and $p_i$ is calculated, and the distance average $d_i$ is obtained:

$d(p_i, M_n) = \sqrt{(x_{p_i} - x_{M_n})^2 + (y_{p_i} - y_{M_n})^2 + (z_{p_i} - z_{M_n})^2}, \quad d_i = \frac{1}{N} \sum_{n=1}^{N} d(p_i, M_n),$

where $d(p_i, M_n)$ is the distance between point cloud $p_i$ and neighbor point cloud $M_n$, $(x_{p_i}, y_{p_i}, z_{p_i})$ are the three-dimensional coordinates of $p_i$, and $(x_{M_n}, y_{M_n}, z_{M_n})$ those of $M_n$.

If the average distance $d_i$ is larger than τ, $p_i$ is filtered out as an outlier; otherwise it is kept. This process is repeated over every scene point cloud, eliminating all outliers of the scene point cloud image.
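A minimal brute-force sketch of this statistical outlier filter, with an assumed function name and assumed default thresholds, could look as follows (a real system would use a k-d tree instead of the full pairwise distance matrix):

```python
import numpy as np

def remove_outliers(points, n_neighbors=5, tau=0.5):
    """Statistical outlier removal: drop every point whose mean distance
    d_i to its N nearest neighbors exceeds the threshold tau."""
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)          # full pairwise distances
    np.fill_diagonal(dists, np.inf)                # a point is not its own neighbor
    nearest = np.sort(dists, axis=1)[:, :n_neighbors]
    d_mean = nearest.mean(axis=1)                  # d_i in the text
    return points[d_mean <= tau]
```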
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S14 is as follows:

First, z-channel smoothing is performed on the scene point cloud data in the original camera coordinate system to obtain the smoothed point cloud. For each point cloud, the z-axis mean $\mu_z$ of its k neighboring points is calculated; if the z coordinate of the point exceeds $\mu_z$ by more than the calibration threshold $\tau_h$, it is decreased by a step dρ, and if it falls below $\mu_z$ by more than the calibration threshold $\tau_l$, it is increased by a step dρ, the step dρ being 2 to 4 times the camera resolution.

The covariance matrix Σ of each smoothed point cloud is then calculated using the following formula:

$\Sigma = \frac{1}{k} \sum_{j=1}^{k} (p_{ij} - \bar{p}_i)(p_{ij} - \bar{p}_i)^T,$

where k denotes the number of neighborhood points, $p_{ij}$ the coordinates of the j-th neighborhood point of the i-th point cloud, and $\bar{p}_i$ the average of the coordinates of the neighboring points of the i-th point cloud.

The covariance matrix Σ is decomposed to obtain the eigenvalues and eigenvectors of the matrix; the eigenvector with the minimum eigenvalue is the point cloud normal vector. Every point cloud is traversed and the normal vectors of all point clouds are calculated.

Then the normal vector mean of the neighboring point clouds is calculated: the normal vectors of all q neighboring point clouds are summed to obtain the sum vector, which is divided by the neighborhood point number q to obtain the normal vector mean vector $\bar{n} = \frac{1}{q} \sum_{j=1}^{q} n_j$.

For each point cloud, the included angle β between its original normal vector and the normal vector mean vector $\bar{n}$ is calculated. If β is larger than the set threshold $\beta_{max}$, i.e. the original normal vector is inconsistent with the neighborhood normal direction, the original normal vector is reversed. This process is repeated until the normal vectors of all point clouds are consistent after adjustment.
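The PCA normal estimation and the consistency flip of sub-step S14 can be sketched for a single neighborhood as follows; the function names and the default β_max are assumptions:

```python
import numpy as np

def pca_normal(neighbors):
    """Surface normal: the eigenvector of the neighborhood covariance
    matrix associated with the smallest eigenvalue."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered / len(neighbors)   # covariance matrix Sigma
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues ascending
    return eigvecs[:, 0]

def orient_consistent(normal, mean_normal, beta_max=np.pi / 2):
    """Flip a normal whose angle beta to the neighborhood mean normal
    exceeds beta_max."""
    cos_b = normal @ mean_normal / (
        np.linalg.norm(normal) * np.linalg.norm(mean_normal))
    beta = np.arccos(np.clip(cos_b, -1.0, 1.0))
    return -normal if beta > beta_max else normal
```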
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S15 is as follows:

For the points remaining after outlier removal and normal vector adjustment, a radius r is selected, a neighborhood of each point cloud $p_b \in P$ is established, and the included angle between the normal vector of $p_b$ and that of every other point cloud $p_j$ in the neighborhood is calculated with the following arctangent function:

$\theta_{bj} = \arctan\left(\frac{\|n_b \times n_j\|}{n_b \cdot n_j}\right),$

where $\theta_{bj}$ is the included angle between the normal vectors of point cloud $p_b$ and another point cloud $p_j$ in the neighborhood, and $n_b$ and $n_j$ are the normal vectors of $p_b$ and $p_j$ respectively. The mean value α of the normal vector included angles over the whole neighborhood of $p_b \in P$ is calculated as

$\alpha = \frac{1}{g} \sum_{j=1}^{g} \theta_{bj},$

where g is the number of point clouds in the neighborhood. A judgment threshold $\varepsilon_{th}$ is set, and whether a point cloud is a contact edge point cloud is decided by the following threshold comparison: for contact edge point clouds α > $\varepsilon_{th}$, whereas for the other point clouds in the scene α ≤ $\varepsilon_{th}$.

For every contact edge point $s_i \in CP_{con}$, its $k_n$ nearest neighbors are searched and the average distance $\mu_i$ from the nearest neighbors to the contact edge point is calculated. $\mu_i$ is compared with a threshold $\mu_{th}$: if $\mu_i < \mu_{th}$ the point is a contact edge point to be removed; if $\mu_i \geq \mu_{th}$ the point is not a contact edge point and should not be removed.

The removed edge point clouds are stored in an edge point set $P_e$, finally yielding the preprocessed scene point cloud data $S_{sc}$.
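The angle-based contact-edge test above can be sketched as follows. The arctangent form is numerically stable near 0 and π, which is why the text prefers it over a plain arccosine; function names and the default ε_th are assumptions:

```python
import numpy as np

def normal_angle(n_b, n_j):
    """Included angle between two normals: arctan(||n_b x n_j|| / (n_b . n_j))."""
    return np.arctan2(np.linalg.norm(np.cross(n_b, n_j)), n_b @ n_j)

def is_contact_edge(n_b, neighbor_normals, eps_th=0.3):
    """A point is a contact-edge candidate when the mean neighborhood
    normal angle alpha exceeds the judgment threshold eps_th."""
    alpha = np.mean([normal_angle(n_b, n_j) for n_j in neighbor_normals])
    return bool(alpha > eps_th)
```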
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, step S2 further comprises the following sub-steps:
S21, setting a maximum point cloud distance threshold, and the minimum point cloud number $M_{min}$ and maximum point cloud number $M_{max}$ of a single-instance workpiece, and continuously performing k-nearest-neighbor dynamic search on the scene point cloud file processed in sub-step S15, to obtain the Euclidean-clustering-based three-dimensional point cloud instance segmentation clustering result of the workpieces in the scene;
S22, multiplying the z coordinates of the scene point cloud file processed in sub-step S15 by a scaling factor ω (0 < ω < 1) in the camera coordinate system, then performing the same Euclidean clustering calculation as in sub-step S21, to obtain the Euclidean-clustering-based two-dimensional point cloud instance segmentation clustering result of the workpieces in the scene;
S23, merging the three-dimensional Euclidean clustering result obtained in sub-step S21 with the two-dimensional Euclidean clustering result obtained in sub-step S22, and performing cluster selection with a combination selection strategy, to obtain the final workpiece instance segmentation result of the scene.
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S21 is as follows:

The point cloud distance threshold $r_d$ is initialized, the minimum point cloud number $M_{min}$ and maximum point cloud number $M_{max}$ per cluster of an instance workpiece are set, the scene point cloud data $S_{sc}$ is input, and a random starting point $s_0$ is selected as the current active point and placed into a cluster subset $Q_i$;

All points within distance $r_d$ in the k-neighborhood of the active point $s_q$ are searched and collected into a point set $S_{ok}$; all members of $S_{ok}$ are merged into the cluster subset $Q_i$, $s_q$ is marked as processed, the next unprocessed point $s_n$ in $Q_i$ is selected as the new active point, and the above processing is repeated;

When all points in the cluster subset $Q_i$ are marked as processed, or the point count of $Q_i$ exceeds the maximum point cloud number $M_{max}$, $Q_i$ is removed from the scene point cloud $S_{sc}$, clustering of the next subset $Q_{i+1}$ is restarted, and the subset merging clustering is repeated;

After all points of the point cloud have been clustered, cluster subsets with fewer points than the minimum point cloud number $M_{min}$ are removed. Any set $Q_i$ satisfies

$Q_i = \{ s_j \mid d(s_j, s_k) < r_d, \; s_j, s_k \in Q_i \},$

where $s_j$ is any point of the set and $s_k$ is a point of the set at distance less than $r_d$ from $s_j$, with at least one such $s_k$.
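The region-growing loop of sub-step S21 can be sketched as follows; the function name and parameter defaults are assumptions, and the neighbor search is brute force for clarity:

```python
import numpy as np
from collections import deque

def euclidean_cluster(points, r_d=0.5, m_min=2, m_max=100000):
    """Euclidean clustering: grow a cluster from a seed through every point
    closer than r_d, stop growing once m_max points are reached, and
    discard clusters with fewer than m_min points."""
    unprocessed = set(range(len(points)))
    clusters = []
    while unprocessed:
        seed = unprocessed.pop()
        cluster, frontier = [seed], deque([seed])
        while frontier and len(cluster) < m_max:
            q = frontier.popleft()
            near = [j for j in unprocessed
                    if np.linalg.norm(points[q] - points[j]) <= r_d]
            for j in near:
                unprocessed.discard(j)
                cluster.append(j)
                frontier.append(j)
        if len(cluster) >= m_min:
            clusters.append(cluster)
    return clusters
```

The two-dimensional variant of sub-step S22 reuses the same function after the z column of the points has been scaled by ω (0 < ω < 1).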
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of sub-step S23 is as follows:

The three-dimensional (x, y, z) Euclidean clustering of sub-step S21 and the two-dimensional (x, y) Euclidean clustering of sub-step S22 are performed on the scene point cloud data respectively; the maximum point cloud number threshold $\delta_{max}$ is applied during three-dimensional Euclidean clustering and no maximum threshold is applied during two-dimensional Euclidean clustering. After the clustering results are obtained, the point count of each cluster is reduced by a proportion C, all cluster subsets obtained by the two Euclidean clusterings are ranked by point count from large to small, the cluster center of each cluster is calculated, and the number n of clusters to combine is set according to the workpiece type;

The cluster with the largest point count in the three-dimensional Euclidean cluster array is selected as the current cluster, the n cluster centers closest to the current cluster center are found, and these n cluster instances are combined with the current cluster into a new cluster;

A new cluster whose point count exceeds the maximum point cloud number threshold $\delta_{max}$ is then compared in positive order with the two-dimensional clusters; the compared indices are the total point count and the Euclidean distance between cluster centers on the (x, y, 0) plane, calculated with the following formula:

$\varepsilon = \frac{|N_{3d} - N_{2d}|}{\sigma_N} + \frac{\|C_{3d} - C_{2d}\|}{\sigma_D},$

where $N_{3d}$ is the point count of the three-dimensional Euclidean cluster, $N_{2d}$ the point count of the two-dimensional Euclidean cluster, $C_{3d}$ the center coordinates of the three-dimensional Euclidean cluster, $C_{2d}$ the center coordinates of the two-dimensional Euclidean cluster, and $\sigma_N$ and $\sigma_D$ are hyper-parameters. If the computed ε is smaller than a set threshold $\varepsilon_h$, the two-dimensional cluster and the combined three-dimensional clusters are considered to correspond, under-segmentation occurred during three-dimensional Euclidean clustering, the result of the combined clustering is adopted, and the corresponding two-dimensional cluster and the n combined three-dimensional clusters are removed from their arrays respectively;

The next three-dimensional clustering result is selected from the three-dimensional cluster array and execution continues; after the whole three-dimensional cluster array has been traversed, n is increased by 1 and the loop restarts from the initial position, until the array is empty or n exceeds a preset maximum;

During the loop, when the point count of a cluster exceeds the maximum point cloud number threshold $\delta_{max}$, over-segmentation of the two-dimensional segmentation is indicated. In that case the largest n clusters are removed step by step in order of decreasing distance from the cluster center until the total point count meets the requirement, the retained clusters are added to the result, the corresponding points are deleted from the two-dimensional cluster point clouds, and the loop continues after the corresponding clusters are deleted from the three-dimensional cluster array.
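One plausible form of the cluster-matching metric of sub-step S23 is sketched below; the exact formula of the patent is not reproduced in the text, so both the function name `match_score` and the combination of the two terms are assumptions, chosen so that a small point-count gap and a small (x, y)-plane center distance together give a small ε:

```python
import numpy as np

def match_score(n_3d, n_2d, c_3d, c_2d, sigma_n=100.0, sigma_d=0.05):
    """Hypothetical matching metric: a combined 3D cluster and a 2D cluster
    correspond when both the point-count gap and the (x, y)-plane center
    distance are small relative to the hyper-parameters sigma_n, sigma_d."""
    count_term = abs(n_3d - n_2d) / sigma_n
    center_term = np.linalg.norm(
        np.asarray(c_3d[:2], dtype=float) - np.asarray(c_2d[:2], dtype=float)
    ) / sigma_d
    return count_term + center_term
```

A score below the threshold ε_h would mark the clusters as corresponding, following the under-segmentation test of the sub-step.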
In the above industrial scene part instance segmentation method based on improved Euclidean clustering, the specific process of step S3 is as follows:

A nearest neighbor search of range R is performed for every edge point in the edge point set $P_e$; if a non-edge point is found, the edge point is classified into the corresponding subset $Y_{CN}$ according to the category of that non-edge point;

The cluster center of each subset $Y_{CN}$ is calculated according to the following formula:

$C_Y = \frac{1}{m} \sum_{i=1}^{m} x_i,$

where $C_Y$ is the cluster center coordinate, m the number of point clouds in the cluster and $x_i$ a single point cloud coordinate. Then, for each edge point $p_c$, the Euclidean distance between $p_c$ and each cluster center is calculated and $p_c$ is classified into the category whose cluster center is closest. The classification of all edge points into the subsets $Y_{CN}$ is repeated until the category of every point no longer changes or the set iteration limit is exceeded.
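A single pass of this edge-point compensation can be sketched as follows (the patent iterates until the assignments stabilize; this sketch performs one nearest-centroid assignment, and the function name is an assumption):

```python
import numpy as np

def compensate_edges(edge_points, clusters):
    """Assign each removed contact-edge point back to the workpiece cluster
    whose center C_Y = (1/m) * sum(x_i) is nearest."""
    centers = [np.mean(c, axis=0) for c in clusters]
    result = [list(c) for c in clusters]
    for p in edge_points:
        k = int(np.argmin([np.linalg.norm(p - c) for c in centers]))
        result[k].append(p)
    return result
```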
The present invention also gives full play to existing object point cloud instance segmentation, which is generally based on classical algorithms and techniques from the computer vision field such as edge detection, region growing and graph cuts. These methods do not require a large amount of training data, relying instead on manually designed features and rules. Furthermore, such segmentation methods can be more stable and reliable in certain scenarios because they are less susceptible to noise and complex background interference.
Meanwhile, compared with the traditional Euclidean-clustering-based segmentation method, the invention has the following beneficial effects:
(1) The invention preprocesses the point cloud, eliminates the working plane with a RANSAC plane fitting algorithm, removes background redundant points and outliers with a statistical method, and reduces the influence of speckle errors of the binocular structured light camera.
(2) Aiming at the segmentation misjudgment caused by the shape discontinuity introduced by contact point clouds, the method calculates point cloud normal vectors based on principal component analysis, adjusts the normal vector directions and selectively eliminates scene contact edge point clouds using an arctangent function. Compared with the traditional Euclidean clustering algorithm, which struggles with occlusion and adhesion, adding edge point elimination improves the performance of the Euclidean clustering algorithm to a certain extent and raises result accuracy by 40%.
(3) The invention provides a combination strategy for fusing two-dimensional and three-dimensional Euclidean clustering segmentation results: the two clustering results are fused to complete instance-level position division using the point cloud position relations, single workpiece point clouds are segmented out, the influence of imaging loss is reduced, and together with edge point elimination the segmentation effect of the Euclidean clustering algorithm is further improved.
In summary, the industrial scene part instance segmentation method designed here makes full use of the three-dimensional point cloud information acquired by a 3D camera in industrial scenes with stacked and mutually occluding parts; it can quickly acquire targets for accurate positioning without a large amount of training data, robustly complete unordered robot grasping tasks, and solves the technical problem of segmenting a target scene part stack into a series of single workpieces in the field of industrial robots.
The invention is further described below with reference to the drawings and detailed description.
Drawings
Fig. 1 is an overall flow chart of the present invention.
Fig. 2 is a sub-step flow chart of step S1.
Fig. 3a and 3b are front-back comparison diagrams for filtering redundant working planes, wherein fig. 3a is an original workpiece scene point cloud diagram, and fig. 3b is a point cloud diagram after eliminating a redundant point set working plane based on a RANSAC plane fitting algorithm.
Fig. 4a and 4b are comparison diagrams before and after outlier filtering, wherein fig. 4a is a workpiece scene point cloud diagram, and fig. 4b is a point cloud diagram after removing false edges and isolated noise points of the scene point cloud based on a statistical method.
Fig. 5 and 6 are diagrams of comparison examples before and after the contact edge point cloud processing of two workpieces in the embodiment.
Fig. 7 is a sub-step flow chart of step S2.
FIG. 8 is a flowchart of Euclidean clustering calculation in the present invention.
Detailed Description
Examples
The industrial scene part instance segmentation method is based on an improved Euclidean cluster segmentation algorithm; the whole flow is shown in figure 1. S1, the real industrial scene point cloud acquired by the vision system is preprocessed: the working plane is removed with a RANSAC plane fitting algorithm, background redundant points and outliers are removed with a statistical method to reduce the influence of speckle errors of the binocular structured light camera, and meanwhile the point cloud normal vectors are calculated and adjusted, removing the redundant-point-set working plane, pseudo edges, isolated noise points and workpiece contact edge point clouds from the real industrial scene point cloud image. S2, two-dimensional and three-dimensional Euclidean distance clustering is performed respectively with a Euclidean clustering algorithm, the two-dimensional and three-dimensional clustering results are fused, and instance-level position division is completed using the point cloud position relations to segment out single workpiece point clouds. S3, the workpiece contact edge point clouds removed in step S1 are dynamically compensated back to the segmented workpiece point clouds, completing workpiece instance segmentation in the industrial scene and ensuring data integrity.
The above steps of the present invention will be described in detail below by way of an application example.
The industrial part scene point clouds of this embodiment are captured by a Gocator binocular structured light three-dimensional imaging sensor in a simulated industrial robot grasping environment and contain the three-dimensional point cloud information of industrial part scenes. As an application example, test results are given on the homemade dataset DB9-1000, whose instance segmentation target is a nine-pin serial port connector model; the dataset contains 1000 real photographed industrial part scenes.
Step S1, preprocessing the collected industrial real scene point cloud comprising a plurality of workpieces comprises the following substeps:
In substep S11, a Gocator LED dual-monocular structured light three-dimensional imaging sensor is used to photograph a simulated industrial robot grasping environment, obtaining 1000 real industrial part scene point cloud images; some of the point cloud images are shown in FIG. 3a.
In substep S12, a RANSAC method is adopted to fit the plane in the scene, and the redundant working-plane point set of the 1000 scene point cloud images acquired in substep S11 is removed using the fitted plane parameters and the centroid coordinates of the plane point set; a partially filtered scene is shown in fig. 3b.
The specific filtering process in the step is as follows:
(1) Randomly select candidate points p1, p2, p3 with sampling cardinality k = 3, assumed to lie on the same plane in the point cloud space, satisfying the plane equation
θ1x + θ2y + θ3z + θ4 = 0,
where θ1, θ2, θ3, θ4 are the four parameters of the plane equation and (xi, yi, zi) are the three-dimensional coordinates of the candidate points p1, p2, p3 in the point cloud space. Written in matrix form this is Aθ = 0, where θ = (θ1, θ2, θ3, θ4)T is the plane fitting parameter vector and A is the plane fitting matrix whose rows are (xi, yi, zi, 1). The plane fitting parameter vector θ* is obtained as the unit vector minimizing ‖Aθ‖, i.e. the right singular vector of A associated with its smallest singular value.
(2) Compute the fitting error of the obtained plane parameter model against the remaining data points in the point cloud space. The fitting error is defined as the distance between a point p and the affine space Vθ* = {p : θ1x + θ2y + θ3z + θ4 = 0}. Solving with a Lagrange multiplier, the fitting error of each point p, i.e. its squared orthogonal projection distance, is
d²(p, Vθ*) = (θ1x + θ2y + θ3z + θ4)² / (θ1² + θ2² + θ3²).
Each distance is compared with a preset error threshold μ: points whose distance is smaller than the set value are regarded as in-plane points. Every point in the cloud is traversed and the number of all in-plane points under θ* is recorded.
(3) Repeating the steps (1) and (2), updating the plane fitting parameter vector theta *, and if the number of the new in-plane points is greater than the current maximum number of the in-plane points, retaining the updated plane equation parameters.
(4) Repeat steps (1)–(3) until the maximum of 500 iterations is reached, and keep the point cloud plane corresponding to the finally updated plane equation parameters.
(5) Compute the coordinates (σx, σy, σz) of the centroid σ of the point cloud plane corresponding to the finally updated plane equation parameters, and directly delete all points whose Z-axis coordinate is smaller than σz + Zi, where Zi is a small set fluctuation value; in this example Zi is set to 0.3 mm.
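Steps (1)–(5) can be sketched as follows with NumPy. This is a minimal illustration, not the patent's implementation: the function names, the synthetic data, and the choice of SVD for the homogeneous least-squares step are assumptions; the plane parameters are recovered as the null vector of the sampling matrix A, inliers are counted by orthogonal distance against μ, and points below the plane centroid's z plus Zi are dropped.

```python
import numpy as np

def ransac_plane(points, mu=0.5, iters=500, seed=0):
    """RANSAC fit of a plane t1*x + t2*y + t3*z + t4 = 0 (steps (1)-(4))."""
    rng = np.random.default_rng(seed)
    best_theta, best_count = None, -1
    for _ in range(iters):
        # (1) sample k=3 candidate points and solve A @ theta = 0
        sample = points[rng.choice(len(points), 3, replace=False)]
        A = np.hstack([sample, np.ones((3, 1))])
        theta = np.linalg.svd(A)[2][-1]          # null vector of A
        n = theta[:3]
        if np.linalg.norm(n) < 1e-9:
            continue                             # degenerate (collinear) sample
        # (2) orthogonal distance of every point to the candidate plane
        d = np.abs(points @ n + theta[3]) / np.linalg.norm(n)
        count = int((d < mu).sum())
        # (3) keep the parameters with the most in-plane points
        if count > best_count:
            best_theta, best_count = theta, count
    return best_theta

def remove_plane(points, theta, mu=0.5, z_i=0.3):
    """Step (5): drop every point whose z lies below the plane centroid's z + Z_i."""
    n = theta[:3]
    d = np.abs(points @ n + theta[3]) / np.linalg.norm(n)
    sigma_z = points[d < mu][:, 2].mean()        # centroid z of the fitted plane points
    return points[points[:, 2] >= sigma_z + z_i]
```

A workpiece sitting on a noisy table plane then survives the filter while the table itself is removed.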
In substep S13, the outliers of the scene point cloud obtained in substep S12 (shown in fig. 3b) are filtered with a statistical method, removing the pseudo edges and isolated noise points of the scene point cloud; the scene point cloud before and after filtering is shown in fig. 4a and fig. 4b respectively.
The specific removing process for removing the scene point cloud pseudo edges and the isolated noise points in the embodiment is as follows:
(1) Defining the neighbor point number n=50 and the maximum distance threshold τ=4mm of the scene point cloud neighborhood obtained in the substep S12.
(2) For the point p_i ∈ P with sequence number i in the point cloud, search its N nearest neighbour points M_n, calculate the Euclidean distance between each neighbour and p_i as
d(p_i, M_n) = sqrt((x_i − x_n)² + (y_i − y_n)² + (z_i − z_n)²),
and compute the distance average d_i, where (x_i, y_i, z_i) are the three-dimensional coordinates of p_i and (x_n, y_n, z_n) are the three-dimensional coordinates of the neighbour point M_n.
(3) If the average distance d_i > τ, p_i is regarded as an outlier and filtered out; otherwise it is kept.
(4) And (3) repeating the steps (2) and (3), traversing each point cloud, and eliminating outliers of all scene point cloud images.
The neighbor searching method in the embodiment adopts a KD-Tree neighbor searching method.
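The statistical filter of substep S13 can be sketched as below. For clarity the sketch uses a brute-force pairwise distance matrix in place of the KD-Tree neighbour search the embodiment uses; the function name and defaults mirror the stated N = 50 and τ = 4 mm but are otherwise illustrative.

```python
import numpy as np

def statistical_filter(points, n_neighbors=50, tau=4.0):
    """Drop points whose mean distance to their N nearest neighbours exceeds tau."""
    # Pairwise distance matrix; a KD-Tree would replace this for large clouds.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    dist.sort(axis=1)
    # Column 0 is each point's distance to itself (0); average the next N columns.
    mean_d = dist[:, 1:n_neighbors + 1].mean(axis=1)
    keep = mean_d <= tau
    return points[keep], points[~keep]
```

On a dense grid with one far-away speckle point, only the speckle point is rejected.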
In substep S14, normal vectors of the workpiece point cloud surface processed in substep S13 are calculated based on principal component analysis, and consistency adjustment is performed on the normal vectors so that all of them point outward from the surface.
The specific process of the calculation and adjustment method is as follows:
(1) First perform a z-axis channel smoothing of the scene point cloud data in the camera coordinate system to obtain the smoothed point cloud. For each point p_i ∈ P, calculate the z-axis mean μ_z of its neighbouring k points; if the point's z value exceeds μ_z by more than the calibration threshold, subtract a step dρ from its z value, and if it falls below μ_z by more than the threshold, add a step dρ. Here τ_l and τ_h are manually set calibration thresholds; in this embodiment τ_l = 0.05 mm and τ_h = 0.15 mm, and the step dρ is 2 to 4 times the camera resolution, here dρ = 0.05 mm.
(2) For each point of the smoothed point cloud, calculate the neighbourhood covariance matrix Σ as
Σ = (1/k) Σ_{j=1..k} (p_ij − p̄_i)(p_ij − p̄_i)^T,
where k is the number of neighbourhood points, p_ij is the coordinate of the j-th neighbourhood point of the i-th point, and p̄_i is the mean of the coordinates of the neighbouring points of the i-th point.
Decomposing the covariance matrix sigma to obtain a characteristic value and a characteristic vector of the matrix, wherein the characteristic vector with the minimum characteristic value is a normal vector, traversing each point cloud, and calculating normal vectors of all the point clouds.
(3) Calculate the vector mean: for each point, sum the normal vectors of its q = 100 neighbouring points to obtain a sum vector, then divide by the neighbourhood point number q = 100 to obtain the normal vector mean vector.
(4) For each point p_i ∈ P, calculate the angle β between its original normal vector and the normal vector mean vector. If β is larger than the set threshold β_max = 90°, i.e. the original normal vector is inconsistent with the neighbourhood normal direction, the original normal vector is inverted.
(5) Repeating the step (3) and the step (4) until the normal vector consistency adjustment of all the point clouds is completed.
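Steps (2)–(5) above can be sketched as follows (the z-smoothing of step (1) is omitted). This is an illustrative sketch, not the patent's code: the names `pca_normals` and `orient_normals` are assumptions, and the orientation test uses the sign of the dot product with the neighbourhood mean normal, which is equivalent to the β > 90° criterion.

```python
import numpy as np

def pca_normals(points, k=8):
    """Step (2): normal of each point = eigenvector of the smallest eigenvalue
    of its k-neighbourhood covariance matrix."""
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k]]
        centered = nbrs - nbrs.mean(axis=0)
        cov = centered.T @ centered / k          # neighbourhood covariance Sigma
        w, v = np.linalg.eigh(cov)               # eigenvalues in ascending order
        normals[i] = v[:, 0]                     # eigenvector of smallest eigenvalue
    return normals

def orient_normals(points, normals, q=8):
    """Steps (3)-(5): flip any normal whose angle to the neighbourhood mean
    normal exceeds 90 degrees, i.e. whose dot product with it is negative."""
    out = normals.copy()
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        mean_n = normals[np.argsort(d)[:q]].mean(axis=0)
        if out[i] @ mean_n < 0:
            out[i] = -out[i]
    return out
```

On a flat grid every estimated normal is ±z, and orientation makes the signs consistent.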
In substep S15, the scene point cloud normal vectors obtained in substep S14 are used to perform edge point rejection optimization on the scene point cloud filtered in substep S13: contact edge points and non-contact edge points are distinguished, and the contact edge point clouds of the real industrial scene point cloud image where the workpieces lie are rejected. Two before/after comparisons of contact point cloud rejection in this embodiment are shown in figs. 5 and 6.
The specific process of edge point cloud rejection is as follows:
(1) For the points remaining after removing the outliers of all scene point cloud images and adjusting the point cloud normal vectors, select a radius r and establish a neighbourhood for each point p_b ∈ P. r is typically 20 times the camera resolution, so that the nearby points within radius r of a point can be used to decide whether it is a contact edge point; in this embodiment r = 1 mm.
(2) The angle between the normal vector of p_b and that of another point p_j in the neighbourhood is calculated with the arctangent function
θ_bj = arctan(‖n_b × n_j‖ / (n_b · n_j)),
where θ_bj is the angle between the normal vectors of p_b and p_j, and n_b and n_j are the normal vectors of p_b and p_j. The mean α of the normal-vector angles over the whole neighbourhood of p_b ∈ P is then calculated as
α = (1/g) Σ_{j=1..g} θ_bj,
where g is the number of points in the neighbourhood.
(3) Set a judgment threshold ε_th = 36° and distinguish contact edge point clouds by threshold comparison: for contact edge point clouds α > ε_th, whereas for the other point clouds in the scene α ≤ ε_th.
(4) For all contact edge points s_i ∈ CP_con, find the k_n = 6 nearest neighbours of each contact edge point s_i and compute the mean μ_i of the distances from the nearest neighbours to s_i. Compare μ_i with the threshold μ_th (in this embodiment μ_th = 0.25 mm): if μ_i < μ_th the point is regarded as a contact edge point, and if μ_i ≥ μ_th it is not. The removed edge point clouds are stored in the edge point set P_e, finally yielding the preprocessed scene point cloud data S_sc.
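Steps (2)–(3) of the edge test can be sketched as below; the function name and synthetic crease are illustrative. A point is flagged when the mean arctangent angle between its normal and its neighbours' normals within radius r exceeds ε_th.

```python
import numpy as np

def contact_edge_mask(points, normals, r=1.0, eps_th=36.0):
    """Flag points whose mean normal angle over the r-neighbourhood exceeds eps_th (deg)."""
    mask = np.zeros(len(points), dtype=bool)
    for b, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        nbr = (d < r) & (d > 0)                  # neighbourhood, excluding p_b itself
        if not nbr.any():
            continue
        nb, nj = normals[b], normals[nbr]
        # theta_bj = arctan(|n_b x n_j| / (n_b . n_j)): angle between the normals
        cross = np.linalg.norm(np.cross(nb, nj), axis=1)
        theta = np.degrees(np.arctan2(cross, nj @ nb))
        mask[b] = theta.mean() > eps_th
    return mask
```

Along a line of points with a 90° crease in the normals, only the two points straddling the crease (mean angle 45° > 36°) are flagged.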
This completes the preprocessing of the real industrial scene point cloud acquired by the vision system in step S1: the working plane is removed by the RANSAC plane fitting algorithm, and background redundant points and outliers are removed by the statistical method, reducing the speckle error influence of the binocular structured light camera. Meanwhile, point cloud normal vectors are calculated by principal component analysis and their orientation is adjusted, and the scene contact edge point clouds are selectively removed using the arctangent function. Adding edge point removal reduces workpiece occlusion and adhesion, and improves the accuracy of the subsequent Euclidean clustering algorithm by 40%.
As shown in fig. 7, step S2 of this embodiment segments single workpiece point clouds by performing instance segmentation clustering with two-dimensional Euclidean clustering and three-dimensional Euclidean clustering and then fusing the results; it comprises the following substeps:
In substep S21, a maximum point cloud distance threshold r_d and the minimum point number M_min and maximum point number M_max of a single-instance workpiece are set manually, and the point cloud file processed in substep S15 is searched continuously and dynamically with k-nearest-neighbour search, obtaining the three-dimensional point cloud instance segmentation clustering result of the scene workpieces based on Euclidean clustering.
Referring to the Euclidean clustering flow shown in FIG. 8, the specific clustering process based on Euclidean clustering in this embodiment is as follows:
(1) Initialize the distance threshold r_d = 0.325 mm, set the minimum and maximum point counts of each cluster to M_min = 4500 and M_max = 27500, and input the scene point cloud data S_sc.
(2) A starting point S 0 is randomly selected as the current active point in S sc and placed into the cluster subset Q i.
(3) Search all points within distance r_d in the k neighbourhood and collect them into the point set S_0k.
(4) All members of set S 0k are merged into Q i and marked S 0 as processed.
(5) The next unprocessed point s n is selected as a new active point in Q i and steps (3) and (4) are repeated.
(6) When all points in the clustering subset Q i have been marked as processed, or the number of points of Q i exceeds M max, the clustering subset Q i is removed from the scene point cloud S sc and the clustering of the next subset Q i+1 is restarted, repeating steps (1) - (5).
(7) After the clustering of all the points in the point cloud is completed, the clustering subset with the number of the point cloud smaller than M min is removed, and the following formula is satisfied for any set Q i:
Qi={sj|d(sj,sk)<rd,sj,sk∈Qi}。
Wherein s j is any point of the set, s k is a point which is less than r d from s j in the set, and the number of s k is at least 1.
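The region-growing loop of steps (1)–(7) can be sketched as follows; the function name and the brute-force neighbour search (in place of the k-nearest-neighbour dynamic search) are illustrative assumptions. Each cluster Q_i absorbs every unprocessed point within r_d of an active member, and subsets smaller than M_min are discarded.

```python
import numpy as np

def euclidean_cluster(points, r_d, m_min, m_max):
    """Greedy Euclidean clustering: grow each subset Q_i by absorbing every
    unprocessed point within r_d of an active point (steps (2)-(6))."""
    unprocessed = set(range(len(points)))
    clusters = []
    while unprocessed:
        seed = unprocessed.pop()                  # (2) pick a starting point s_0
        members, queue = [seed], [seed]
        while queue and len(members) <= m_max:    # (6) stop when Q_i exceeds M_max
            s = queue.pop()
            d = np.linalg.norm(points - points[s], axis=1)
            near = [j for j in list(unprocessed) if d[j] <= r_d]  # (3)
            unprocessed.difference_update(near)   # (4) mark them as processed
            members.extend(near)
            queue.extend(near)                    # (5) next active points
        if len(members) >= m_min:                 # (7) drop undersized subsets
            clusters.append(sorted(members))
    return clusters
```

Two well-separated chains of points come out as two clusters, while an isolated speckle point falls below M_min and is discarded.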
In substep S22, the scene point cloud file processed in substep S15 is multiplied in the z-axis direction by a scaling factor ω, and the Euclidean clustering of substep S21 is performed again, obtaining the scene workpiece two-dimensional point cloud instance segmentation clustering result based on Euclidean clustering.
In substep S23, the segmentation results of the two-dimensional Euclidean clustering and the three-dimensional Euclidean clustering obtained in substeps S21 and S22 are fused, and cluster selection with a combined selection strategy is implemented to obtain the final instance segmentation result.
The specific process of implementing the instance division by the combination policy of the embodiment is as follows:
(1) Perform the three-dimensional (x, y, z) spatial Euclidean clustering of substep S21 and the two-dimensional (x, y) spatial Euclidean clustering of substep S22 on the scene point cloud data respectively; a maximum point count threshold δ_max is set for the three-dimensional Euclidean clustering (δ_max = 42000 in this embodiment), while no maximum threshold is set for the two-dimensional Euclidean clustering.
(2) After the clustering results are obtained, the point count of each cluster is reduced by a proportion C; this discretization reduces the influence of errors. C is 0.05–0.2 depending on the number of target instances; in this embodiment C = 0.1.
(3) Sort all cluster subsets obtained by the two Euclidean clusterings from large to small by point count and compute the cluster centre of each cluster. Set the number n of cluster combinations to 1, meaning each combination contains only one clustering result; if two workpieces exist stacked in the industrial scene, the combination number n is set to 2.
(4) Select the largest cluster from the three-dimensional cluster array as the current cluster, find the n cluster centres closest to the current cluster centre, and merge those n cluster instances into a new cluster. If the point count of the new cluster exceeds the maximum point count threshold δ_max, go to step (7) below.
(5) Compare the new cluster with the two-dimensional clusters sequentially in order. The comparison indices are the total point count and the Euclidean distance of the cluster centres on the (x, y, 0) plane, calculated with the following formula, where N_3d is the point count of the three-dimensional Euclidean cluster, N_2d the point count of the two-dimensional Euclidean cluster, C_3d the centre coordinates of the three-dimensional Euclidean cluster, C_2d the centre coordinates of the two-dimensional Euclidean cluster, and σ_N and σ_D are hyperparameters; in this embodiment σ_N = 0.25 and σ_D = 1.
If the calculated ε value is smaller than the set threshold ε_h, the two-dimensional Euclidean cluster is considered to correspond to the merged three-dimensional Euclidean clusters, i.e. under-segmentation occurred during three-dimensional Euclidean clustering. The threshold ε_h is fine-tuned around half the maximum number of parallel planes contained in the current workpiece, generally 2–4; in this embodiment ε_h = 3.5. In this case the merged cluster result is adopted, and the corresponding two-dimensional Euclidean cluster and the merged n three-dimensional Euclidean clusters are removed from their arrays. Return to step (4) and continue with the next three-dimensional Euclidean clustering result: where previously the largest cluster in the three-dimensional array was selected as the current cluster, now the largest among the remaining clusters is selected. If the whole two-dimensional Euclidean cluster array has been traversed without a match, continue executing step (4) in a loop.
(6) After traversing the whole three-dimensional Euclidean cluster array, increase n by 1. Then restart the loop from the beginning until the array is empty or n exceeds the preset maximum n_max = 2.
(7) When the cluster point count exceeds the maximum point count threshold δ_max, under-segmentation has occurred in the two-dimensional segmentation. In this case the largest n clusters are gradually excluded in order from far to near until the total point count meets the requirement. The retained clusters are added to the result, and the corresponding points, rather than the entire cluster, are deleted from the two-dimensional Euclidean cluster point cloud. The corresponding clusters in the three-dimensional Euclidean cluster array are deleted as well, and execution returns to step (4).
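The matching metric of step (5) appears in the source only as an image, so the sketch below is a loudly hypothetical stand-in: it combines the two stated indices (point count and cluster-centre distance on the (x, y, 0) plane) with the hyperparameters σ_N and σ_D, but the exact form of the patent's formula may differ. The function names are illustrative.

```python
import numpy as np

def fusion_score(n3d, c3d, n2d, c2d, sigma_n=0.25, sigma_d=1.0):
    """Hypothetical matching score between a merged 3D cluster and a 2D cluster:
    weighted relative point-count gap plus (x, y)-plane centre distance."""
    dxy = float(np.linalg.norm(np.asarray(c3d)[:2] - np.asarray(c2d)[:2]))
    return sigma_n * abs(n3d - n2d) / max(n2d, 1) + sigma_d * dxy

def best_match(cluster3d, clusters2d, eps_h=3.5):
    """Index of the 2D cluster whose score against the merged 3D cluster is the
    lowest and below eps_h, or -1 when no 2D cluster matches."""
    n3d, c3d = len(cluster3d), cluster3d.mean(axis=0)
    best_i, best_eps = -1, eps_h
    for i, c2 in enumerate(clusters2d):
        eps = fusion_score(n3d, c3d, len(c2), c2.mean(axis=0))
        if eps < best_eps:
            best_i, best_eps = i, eps
    return best_i
```

A 3D cluster sitting directly above one of two 2D clusters matches that cluster; a 2D cluster far away in the (x, y) plane scores above ε_h and is rejected.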
The advantage of step S2 is that workpiece point clouds are segmented based on Euclidean clustering: by fusing the two-dimensional and three-dimensional Euclidean clustering segmentation results, instance-level position division is completed using the positional relationship of the point cloud and single workpiece point clouds are segmented out. This reduces the influence of imaging loss and, together with edge point removal, further improves the segmentation effect of the Euclidean clustering algorithm.
Finally, in step S3, the contact point clouds removed in step S1 are dynamically compensated back into the segmented workpiece point clouds. In step S3 of this embodiment, a k-means clustering algorithm is used to process the final segmentation result, and the rejected contact edge point clouds are dynamically restored to the original point clouds. For each edge point in the edge point set P_e, the nearest non-edge point is found by nearest-neighbour search, so that the edge point is assigned to the instance represented by that non-edge point; all edge points are searched in turn to find their attribution, completing the dynamic compensation of the contact edge point clouds. The specific recharging process is as follows:
(1) Perform a wider-range nearest-neighbour search with R = 2.35 mm for all edge points in the edge point set P_e; if a non-edge point is found, classify the edge point into the corresponding subset Y_CN according to the category of that non-edge point.
(2) Process all Y_CN and compute the cluster centre of each subset Y_CN as
c_Y = (1/m) Σ_{i=1..m} x_i,
where c_Y is the cluster centre coordinate, m is the number of points in the cluster and x_i is a single point coordinate. Then, for each non-edge point p_c, compute the Euclidean distance to each cluster centre and assign p_c to the category of the nearest cluster centre.
(3) Repeat step (2) until the categories of all points no longer change or the set maximum of 100 iterations is exceeded.
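The nearest-neighbour reattachment at the core of step S3 can be sketched as below (the k-means refinement of steps (2)–(3) is omitted for brevity); the function name and the label convention are illustrative assumptions. Each removed edge point inherits the instance label of its nearest non-edge point, provided that point lies within the search radius R.

```python
import numpy as np

def recharge_edges(edge_points, nonedge_points, labels, R=2.35):
    """Assign each removed contact edge point the instance label of its nearest
    non-edge point within radius R; points with no neighbour in range get -1."""
    out = np.full(len(edge_points), -1, dtype=int)
    for i, e in enumerate(edge_points):
        d = np.linalg.norm(nonedge_points - e, axis=1)
        j = int(np.argmin(d))
        if d[j] <= R:
            out[i] = labels[j]
    return out
```

Edge points sitting next to one of two labelled instances inherit that instance's label, while a stray point far from both gets no attribution.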
The foregoing is merely an illustrative embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any variation or substitution that a person skilled in the art can easily conceive within the technical scope disclosed by the present invention shall be covered. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (10)

1. An industrial scene part instance segmentation method based on improved Euclidean clustering, characterized by comprising the following steps:
S1, preprocessing a real industrial scene point cloud which is acquired by a visual system and comprises a plurality of workpieces, and removing redundant point set working planes, pseudo edges, isolated noise points and workpiece contact edge point clouds in a real industrial scene point cloud image;
S2, performing instance segmentation clustering on the preprocessed point cloud file with a Euclidean clustering algorithm, the instance segmentation clustering being performed separately with the two-dimensional Euclidean distance and the three-dimensional Euclidean distance, fusing the clustering results, and completing instance-level position division using the point cloud positional relationship to segment out single workpiece point clouds;
S3, dynamically compensating the point cloud of the contact edge point of the workpiece removed in the step S1 back to the segmented point cloud of the workpiece, and completing the segmentation of the workpiece instance in the industrial scene.
2. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 1, wherein said step S1 comprises the following substeps:
S11, acquiring real industrial scene point cloud images of N workpieces by using a binocular structured light three-dimensional imaging system;
S12, fitting a plane in the scene by adopting a RANSAC method, and eliminating a redundant point set working plane of the scene point cloud image acquired in the substep S11 by utilizing the fitted plane parameters and the plane point set centroid coordinates;
S13, filtering the outliers of the scene point cloud image obtained in the sub-step S12 by using a statistical method, and removing the pseudo edges and the isolated noise points in the outliers of the scene point cloud image;
S14, calculating surface normal vectors of the scene point cloud processed in the substep S13 based on a principal component analysis method, and carrying out consistency adjustment on the normal vectors to keep all the normal vectors pointing out of the surface;
And S15, performing edge point rejection optimization on the scene point cloud filtered in the substep S13 by utilizing the scene point cloud normal vector obtained in the substep S14, distinguishing contact edge points and non-contact edge points, and rejecting contact edge point clouds of a real industrial scene point cloud image where the workpiece is located.
3. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 2, wherein the specific process of substep S12 is as follows:
firstly, selecting a plurality of candidate point clouds p_1, …, p_N with sampling cardinality k ≥ 3 on the same plane in the point cloud space, satisfying the plane equation θ1x + θ2y + θ3z + θ4 = 0, where θ1, θ2, θ3, θ4 are the four parameters of the plane equation and (x_1, y_1, z_1), …, (x_N, y_N, z_N) represent the three-dimensional coordinates of the candidate point clouds p_1 to p_N in the point cloud space; converting to matrix form gives Aθ = 0, where θ is the plane fitting parameter vector and A represents the plane fitting matrix whose rows are (x_i, y_i, z_i, 1); the plane fitting parameter vector θ* is obtained as the unit vector minimizing ‖Aθ‖;
error calculation is then performed on the obtained plane fitting parameter model with the remaining data points in the point cloud space, the fitting error being defined as the distance between a point p and the affine space Vθ* = {p : θ1x + θ2y + θ3z + θ4 = 0}; solving for each point by a Lagrange multiplier, the fitting error of the point p, i.e. its squared orthogonal projection distance, is d²(p, Vθ*) = (θ1x + θ2y + θ3z + θ4)² / (θ1² + θ2² + θ3²);
comparing the squared orthogonal projection distance with a preset error threshold μ, regarding points whose distance is smaller than the preset error threshold μ as in-plane points, traversing each point cloud, and recording the number of all in-plane points under θ*;
updating the plane fitting parameter vector theta *, repeating the process, if the number of the new in-plane points is larger than the current maximum number of the in-plane points, reserving the updated plane equation parameters until the set maximum iteration number, and reserving the point cloud plane corresponding to the finally updated plane equation parameters;
and calculating the coordinates (σx, σy, σz) of the centroid σ of the point cloud plane corresponding to the finally updated plane equation parameters, and directly deleting all points whose Z-axis coordinate is smaller than σz + Zi, Zi being a set tiny fluctuation value.
4. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 2, wherein the specific process of substep S13 is as follows:
defining the number of neighbour points N and the maximum distance threshold τ in the scene point cloud neighbourhood obtained in substep S12; for the point cloud p_i ∈ P with sequence number i, P representing the whole point cloud space, searching the N nearest neighbour point clouds M_n, calculating the Euclidean distance between each neighbour and p_i as d(p_i, M_n) = sqrt((x_i − x_n)² + (y_i − y_n)² + (z_i − z_n)²), and calculating the distance average d_i, where (x_i, y_i, z_i) are the three-dimensional coordinates of the point cloud p_i and (x_n, y_n, z_n) are the three-dimensional coordinates of the neighbour point cloud M_n;
if the average distance d_i is larger than τ, p_i is filtered out as an outlier, otherwise it is kept; the process is repeated to traverse each scene point cloud, eliminating the outliers of all scene point cloud images.
5. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 2, wherein the specific process of substep S14 is as follows:
firstly, performing a z-axis channel smoothing of the scene point cloud data in the camera coordinate system to obtain the smoothed point cloud; for each point cloud p_i ∈ P, calculating the z-axis mean μ_z of the neighbouring k points; if the point's z value exceeds μ_z by more than the calibration threshold, subtracting a step dρ from its z value, and if it falls below μ_z by more than the threshold, adding a step dρ, τ_l and τ_h being the set calibration thresholds and the step dρ being 2 to 4 times the camera resolution;
computing, for each point of the smoothed point cloud, the neighbourhood covariance matrix Σ as Σ = (1/k) Σ_{j=1..k} (p_ij − p̄_i)(p_ij − p̄_i)^T, where k represents the number of neighbourhood points, p_ij represents the coordinates of the j-th neighbourhood point of the i-th point cloud, and p̄_i represents the mean of the coordinates of the neighbouring points of the i-th point cloud;
decomposing the covariance matrix sigma to obtain a characteristic value and a characteristic vector of the matrix, wherein the characteristic vector with the minimum characteristic value is a point cloud normal vector; traversing each point cloud, and calculating normal vectors of all the point clouds;
then calculating the normal vector mean of all point clouds: summing the normal vectors of the q neighbouring points of each point cloud to obtain a sum vector, then dividing by the neighbourhood point number q to obtain the normal vector mean vector;
computing for each point cloud the angle β between its original normal vector and the normal vector mean vector; if β is larger than the set threshold β_max, i.e. the original normal vector is inconsistent with the neighbourhood normal direction, inverting the original normal vector; and repeating the process until the normal vectors of all point clouds are consistent after adjustment.
6. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 2, wherein the specific process of substep S15 is as follows:
for the points remaining after removing the outliers of all scene point cloud images and adjusting the point cloud normal vectors, selecting a radius r and establishing a neighbourhood for each point cloud p_b ∈ P; calculating the angle between the normal vector of p_b and that of another point cloud p_j in the neighbourhood with the arctangent function θ_bj = arctan(‖n_b × n_j‖ / (n_b · n_j)), where θ_bj is the angle between the normal vectors of p_b and p_j, and n_b and n_j are the normal vectors of p_b and p_j respectively; calculating the mean α of the normal-vector angles over the whole neighbourhood of p_b ∈ P as α = (1/g) Σ_{j=1..g} θ_bj, where g is the number of point clouds in the neighbourhood; setting a judgment threshold ε_th and distinguishing whether a point cloud is a contact edge point cloud by the following threshold comparison:
For contact edge point clouds, alpha > epsilon th, whereas for other point clouds in the scene, alpha is less than or equal to epsilon th;
for all contact edge points s_i ∈ CP_con, searching the k_n nearest neighbours of each contact edge point s_i, calculating the mean μ_i of the distances from the nearest neighbours to the contact edge point, and comparing μ_i with the threshold μ_th: if μ_i < μ_th the point belongs to the contact edge points to be removed, and if μ_i ≥ μ_th the point is not a contact edge point and should not be removed;
And storing the removed edge point cloud into an edge point set P e to finally obtain preprocessed scene point cloud data S sc.
7. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 1, wherein said step S2 comprises the following substeps:
S21, setting a maximum point cloud distance threshold and the minimum point number M_min and maximum point number M_max of a single-instance workpiece, and continuously carrying out k-nearest-neighbour dynamic search on the scene point cloud file processed in substep S15, obtaining the three-dimensional point cloud instance segmentation clustering result of the workpieces in the scene based on Euclidean clustering;
S22, multiplying the scene point cloud file processed in substep S15 by a scaling factor ω, 0 < ω < 1, in the z-axis direction under the camera coordinate system, then performing the same Euclidean clustering calculation as substep S21, obtaining the workpiece two-dimensional point cloud instance segmentation clustering result in the scene based on Euclidean clustering;
S23, merging the segmentation clustering results of the three-dimensional Euclidean clustering obtained in substep S21 and the two-dimensional Euclidean clustering obtained in substep S22, and implementing cluster selection with a combined selection strategy to obtain the final workpiece instance segmentation result in the scene.
8. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 7, wherein the specific process of substep S21 is as follows:
Initializing a distance threshold r d of a point cloud distance threshold, setting the minimum point cloud number M min and the maximum point cloud number M max of each cluster of an example workpiece, inputting scene point cloud data S sc, randomly selecting a starting point S 0 as a current active point, and putting the current active point into a cluster subset Q i;
searching all points within distance r_d in the k neighbourhood and collecting them into the point set S_0k; merging all members of S_0k into the cluster subset Q_i, marking s_0 as processed, selecting the next unprocessed point s_n in Q_i as the new active point, and repeating the above process;
When all points in the clustering subset Q i are marked as processed, or the number of points of the clustering subset Q i exceeds the maximum number of point clouds M max, removing Q i from the scene point clouds S sc, restarting clustering the next subset Q i+1, and repeating subset merging clustering;
after all points in the point cloud have been clustered, removing every cluster subset whose point count is smaller than the minimum point count M_min; any resulting set Q_i satisfies the following formula:
Q_i = {s_j | d(s_j, s_k) < r_d, s_j, s_k ∈ Q_i},
wherein s_j is any point of the set, s_k is a point of the set whose distance to s_j is smaller than r_d, and there is at least one such s_k.
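The region-growing procedure of this claim can be sketched as follows. The helper name `euclidean_cluster` and the use of SciPy's `cKDTree` for the neighborhood search are illustrative assumptions, not the patent's implementation; one simplification is that a cluster whose growth exceeds M_max is discarded whole rather than cut off mid-growth.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points, r_d, m_min, m_max):
    """Region-growing Euclidean clustering in the spirit of claim 8.

    points : (N, 3) array, the scene point cloud S_sc
    r_d    : neighbor distance threshold
    m_min, m_max : minimum / maximum point count per cluster
    Returns a list of index arrays, one per kept cluster.
    """
    tree = cKDTree(points)
    processed = np.zeros(len(points), dtype=bool)
    clusters = []
    for seed in range(len(points)):
        if processed[seed]:
            continue
        processed[seed] = True
        queue, cluster = [seed], []
        while queue:                      # grow the subset Q_i
            p = queue.pop()
            cluster.append(p)
            for q in tree.query_ball_point(points[p], r_d):
                if not processed[q]:      # new member of the set S_ok
                    processed[q] = True
                    queue.append(q)
        if m_min <= len(cluster) <= m_max:
            clusters.append(np.array(cluster))
    return clusters

# two well-separated blobs plus one isolated point
pts = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0],
                [10, 0, 0], [10.1, 0, 0], [10, 0.1, 0],
                [5, 5, 5]], dtype=float)
found = euclidean_cluster(pts, r_d=0.5, m_min=2, m_max=100)
print(len(found))  # the isolated point is dropped by the m_min filter
```

The queue-based growth visits each point once, so the whole pass is a single traversal of S_sc plus one radius query per point.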
9. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 7, wherein the specific process of sub-step S23 is as follows:
performing, on the scene point cloud data, both the three-dimensional (x, y, z) Euclidean clustering of sub-step S21 and the two-dimensional (x, y) Euclidean clustering of sub-step S22, wherein the maximum point-count threshold δ_max is set for the three-dimensional Euclidean clustering and no maximum threshold is set for the two-dimensional Euclidean clustering; after the clustering results are obtained, down-sampling the point count of each cluster by a ratio C, sorting all cluster subsets obtained by the two Euclidean clusterings in descending order of point count, computing the cluster center of each cluster, and setting the cluster combination number n according to the workpiece type;
selecting the cluster with the largest point count from the three-dimensional Euclidean cluster array as the current cluster, finding the n cluster centers closest to the center of the current cluster, and merging those n cluster instances with the current cluster into a new cluster;
if the point count of the new cluster exceeds the maximum point-count threshold δ_max, comparing the new cluster with the two-dimensional clusters one by one in forward order; the compared indices are the total point count and the Euclidean distance between the cluster centers on the (x, y, 0) plane, combined by the following formula:
ε = |N_3d − N_2d| / σ_N + ‖C_3d − C_2d‖ / σ_D,
wherein N_3d is the point count of the three-dimensional Euclidean cluster, N_2d is the point count of the two-dimensional Euclidean cluster, C_3d is the center coordinate of the three-dimensional Euclidean cluster, C_2d is the center coordinate of the two-dimensional Euclidean cluster, and σ_N and σ_D are hyperparameters; if the computed value of ε is smaller than a set threshold ε_h, the two-dimensional Euclidean cluster and the merged three-dimensional Euclidean cluster are considered to correspond, under-segmentation occurred in the three-dimensional Euclidean clustering, the merged clustering result is adopted, and the corresponding two-dimensional cluster and the n merged three-dimensional clusters are removed from their respective arrays;
selecting the next three-dimensional clustering result from the three-dimensional cluster array and continuing; after the whole three-dimensional cluster array has been traversed, increasing n by 1 and restarting the loop from the initial position, until the array is empty or n exceeds a preset maximum value;
during the loop, when the point count of a cluster exceeds the maximum point-count threshold δ_max, the two-dimensional segmentation is over-segmented; in this case the n largest clusters are removed one by one, in order from farthest to nearest relative to the cluster center, until the total point count meets the requirement; the retained clusters are added to the result, the corresponding points are deleted from the two-dimensional clustered point cloud, and after the corresponding clusters are deleted from the three-dimensional cluster array the loop continues.
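One plausible form of the matching score ε, combining the normalized point-count gap and the planar center distance named in this claim (with σ_N and σ_D as hyperparameters), can be computed as below; the weighted-sum form, the function name, and the example numbers are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def match_score(n_3d, n_2d, c_3d, c_2d, sigma_n, sigma_d):
    """Assumed form: eps = |N_3d - N_2d| / sigma_N + ||C_3d - C_2d|| / sigma_D,
    with the center distance taken on the (x, y, 0) plane only."""
    count_term = abs(n_3d - n_2d) / sigma_n
    diff = np.asarray(c_3d, dtype=float)[:2] - np.asarray(c_2d, dtype=float)[:2]
    return count_term + np.linalg.norm(diff) / sigma_d

# hypothetical merged 3D cluster vs. candidate 2D cluster
eps = match_score(1000, 980, (0.10, 0.20, 0.50), (0.10, 0.25, 0.00),
                  sigma_n=100.0, sigma_d=0.1)
print(eps)  # ≈ 0.7; eps below the threshold eps_h would flag the pair as matching
```

A small ε means the merged 3D cluster and the 2D cluster cover roughly the same points at roughly the same (x, y) location, which is the claim's criterion for detecting under-segmentation in the 3D pass.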
10. The industrial scene part instance segmentation method based on improved Euclidean clustering according to claim 7, wherein the specific process of step S3 is as follows:
performing a nearest-neighbor search of radius R on every edge point in the edge point set P_e, and, whenever non-edge points are found, assigning the edge point to the corresponding subset Y_CN according to the category Y of the non-edge point;
computing the cluster center of each subset Y_CN with the following formula:
c_Y = (1/m) Σ_{i=1}^{m} x_i,
wherein c_Y is the cluster center coordinate, m is the point count of the cluster, and x_i is the coordinate of a single point; then, for each edge point p_c, computing the Euclidean distance between p_c and each cluster center and assigning p_c to the category of the nearest cluster center; the reassignment of all edge points into Y_CN is repeated until no point changes category or a set iteration count limit is exceeded.
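The center computation and nearest-center reassignment of step S3 amount to a k-means-style refinement restricted to the edge points. The sketch below assumes NumPy arrays and illustrative names (`refine_edge_labels`, `edge_mask`); it is not the patent's implementation.

```python
import numpy as np

def refine_edge_labels(points, labels, edge_mask, max_iters=10):
    """Reassign edge points to the class with the nearest center
    c_Y = (1/m) * sum(x_i), iterating until labels stabilize or the
    iteration cap max_iters is reached. Non-edge labels stay fixed."""
    labels = labels.copy()
    classes = np.unique(labels)
    for _ in range(max_iters):
        centers = np.stack([points[labels == c].mean(axis=0) for c in classes])
        # distance from every edge point to every class center
        d = np.linalg.norm(points[edge_mask, None, :] - centers[None, :, :], axis=2)
        new_edge = classes[np.argmin(d, axis=1)]
        if np.array_equal(new_edge, labels[edge_mask]):
            break  # no edge point changed category
        labels[edge_mask] = new_edge
    return labels

# two clusters; the edge point at (9, 0, 0) starts mislabeled as class 0
pts = np.array([[0, 0, 0], [0.1, 0, 0],
                [10, 0, 0], [10.1, 0, 0],
                [9, 0, 0]], dtype=float)
init = np.array([0, 0, 1, 1, 0])
edge = np.array([False, False, False, False, True])
out = refine_edge_labels(pts, init, edge)
print(out)  # the edge point migrates to class 1
```

Because the non-edge points anchor each class's center, the iteration converges quickly; the cap guards against oscillation between two equidistant centers.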
CN202410374446.4A 2024-03-29 2024-03-29 Industrial scene part instance segmentation method based on improved European clustering Pending CN118212410A (en)