CN117495868A - Point cloud deep learning-based mechanical part assembly feature measurement method - Google Patents


Info

Publication number
CN117495868A
Authority
CN
China
Prior art keywords
primitive
point
point cloud
points
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410007187.1A
Other languages
Chinese (zh)
Inventor
胡艺砾
李红卫
罗群
汪俊
肖坤
李子宽
周铉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202410007187.1A
Publication of CN117495868A
Legal status: Pending


Classifications

    • G06T 7/0004: Industrial image inspection
    • G06T 7/11: Region-based segmentation
    • G06T 7/187: Segmentation or edge detection involving region growing, region merging or connected component labelling
    • G06T 7/60: Analysis of geometric attributes
    • G06V 10/766: Pattern recognition or machine learning using regression, e.g. by projecting features on hyperplanes
    • G06V 10/806: Fusion of extracted features at the sensor, preprocessing, feature extraction or classification level
    • G06V 10/82: Image or video recognition or understanding using neural networks
    • G06V 20/64: Scenes; scene-specific elements: three-dimensional objects
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/20081: Training; learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30164: Workpiece; machine component
    • G06V 2201/06: Recognition of objects for industrial automation
    • Y02P 90/30: Computing systems specially adapted for manufacturing


Abstract

The invention discloses a mechanical part assembly feature measurement method based on point cloud deep learning, which comprises the following steps: scanning various mechanical parts and marking primitive boundary points and primitive types to form a training data set; building a neural network for primitive boundary point detection and primitive type prediction; training the neural network; inputting point cloud data of the mechanical part to be measured into the network to obtain the predicted primitive boundary points and the primitive type of each point; segmenting all primitive instances with a region growing algorithm based on the predicted primitive boundary points and per-point primitive types; and performing weighted least squares fitting on the point cloud of each primitive instance to obtain its specific parameters, i.e. the assembly features. The invention adopts deep learning to extract multi-scale fusion features of the point cloud and predict boundary points and primitive types, and can thus reconstruct the assembly features of mechanical parts better, thereby improving the precision and accuracy of mechanical part assembly.

Description

Point cloud deep learning-based mechanical part assembly feature measurement method
Technical Field
The invention belongs to the technical field of point cloud feature extraction for mechanical part assembly, and particularly relates to a mechanical part assembly feature measurement method based on point cloud deep learning.
Background
The assembly of mechanical parts is a critical link in the manufacturing industry and directly affects the performance and quality of the product. To ensure the accuracy and quality of assembly, the features of the parts must be measured accurately.
Conventional measurement methods, such as those using coordinate measuring machines (CMMs) or optical measurement systems, are reliable but typically require significant time and human resources, and struggle with complex part geometries. Furthermore, conventional methods often require physical contact, which can wear or damage the parts and is unsuitable for certain applications.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a mechanical part assembly feature measurement method based on point cloud deep learning. The method uses deep learning to extract multi-scale fusion features of the point cloud and to predict boundary points and primitive types, so the assembly features of mechanical parts can be reconstructed better. This improves the precision and accuracy of mechanical part assembly, handles complex part geometry data, and enables efficient feature measurement and quality control.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
a mechanical part assembly characteristic measurement method based on point cloud deep learning comprises the following steps:
step S1, a scanning platform is established, various mechanical parts are scanned, primitive boundary points and primitive types are marked on scanned point cloud data, and a training data set is formed;
s2, constructing a primitive boundary point detection and primitive type prediction neural network;
s3, training a neural network by adopting a training data set and a cross entropy loss function;
s4, inputting point cloud data of the mechanical parts to be tested into a trained neural network to obtain predicted primitive boundary points and primitive types to which each point belongs;
step S5, all primitive instances are segmented by using a region growing algorithm based on the predicted primitive boundary points and primitive types to which each point belongs;
and S6, carrying out weighted least square fitting on the point cloud of each primitive instance to obtain specific parameters of each primitive instance, namely the assembly characteristics.
In order to optimize the technical scheme, the specific measures adopted further comprise:
the step S1 specifically includes the following steps:
s101, preparing a CAD model of a mechanical part to be virtually scanned, wherein the model comprises various primitive examples;
s102, aiming at a CAD model, adopting Blender to simulate and scan real parts and generate virtual point cloud data;
s103, adding a label to each point in the virtual point cloud to indicate whether the point is a primitive boundary point and the primitive type to which it belongs;
s104, extracting point cloud local structure blocks from the virtual point cloud data to serve as training data of the neural network, and combining the labels to be used for training the neural network to identify primitive boundary points and primitive types.
The above-mentioned local structure block pair (S_l, S_g) comprises a local point cloud structure block S_l and a structure block S_g, where S_l contains fewer points than S_g; S_l is the input of the neural network, and S_g provides a local neighborhood and a global neighborhood for each point in S_l.
In the step S2, for each point in the point cloud group, the neural network uses graph convolution, a multi-layer perceptron and max pooling to encode the features of its local neighborhood and global neighborhood; the local features and global features are then further encoded by a Transformer module to obtain fusion features; after the fusion features are obtained, they are input into a regressor to predict the category of each point.
The step S3 specifically includes the following steps:
s301, performing forward propagation on an input sample through a neural network to obtain a prediction probability distribution q of a model;
s302, calculating the cross entropy loss L = -Σ p·log q, where p is the probability distribution of the true labels, obtained from the training data set;
s303, calculating the gradient of the loss function to the model parameters by using a back propagation algorithm;
s304, the optimizer minimizes a loss function, and updates parameters of the model according to the gradient;
s305, repeating the steps S301-S304 until the stopping condition is reached.
The step S5 specifically includes the following steps:
s501, arbitrarily selecting a point p as a seed point from the point cloud group P formed by the predicted primitive boundary points and the primitive types to which each point belongs;
s502, searching the neighboring points of the seed point p to obtain a neighboring point set N(p);
S503, for each point p_i in the neighboring point set N(p): if p_i is not a primitive boundary point, clustering p_i with the seed point p, taking p_i as a new seed point, and repeating step S502;
s504, if unclustered points other than primitive boundary points remain, repeating step S501 until all points are clustered, which yields the segmented point cloud groups Q, i.e. all primitive instances.
The step S6 specifically includes the following steps:
s601, inputting the primitive instances segmented in step S5 into a multi-layer perceptron to further extract the depth features of the point cloud;
s602, limiting a point cloud depth characteristic value output by a multi-layer perceptron to a weight value ranging from 0 to 1 by adopting a Softmax activation function;
and S603, carrying out weighted least square fitting according to the point cloud depth characteristic value and the corresponding weight thereof to obtain specific parameters of each primitive instance, namely the assembly characteristic.
The invention has the following beneficial effects:
compared with the traditional manual measurement method, the method is more efficient and rapid, is beneficial to improving the production efficiency of an assembly production line and reducing the production period;
the method based on the point cloud deep learning can provide a highly accurate measurement result, can capture the accurate geometric characteristics of parts, including the size, the curvature, the aperture and the like, and is beneficial to ensuring the accuracy and the quality of assembly;
the invention does not need to physically contact the parts, avoids the problem that the parts are possibly worn or damaged, and is very important for the application needing to keep the integrity of the parts;
the invention can realize real-time measurement and feedback in the assembly process, is beneficial to avoiding the production of unqualified products and reduces unnecessary cost and resource waste.
Drawings
FIG. 1 is a flow chart of a method of measuring a mechanical part assembly feature of the present invention;
FIG. 2 is a schematic view of a platform construction of the measurement principle of the assembly features of the mechanical parts of the present invention;
FIG. 3 is a diagram of a network architecture for measurement of the assembly characteristics of mechanical components of the present invention;
FIG. 4 is a point cloud of a scan of a component of the present invention;
FIG. 5 is a primitive boundary point diagram of the component feature prediction of the present invention;
FIG. 6 is a primitive type diagram of the component feature prediction of the present invention;
fig. 7 is a diagram showing an example of the element finally divided by the component part of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Although the steps of the present invention are arranged by reference numerals, the order of the steps is not limited, and the relative order of the steps may be adjusted unless the order of the steps is explicitly stated or the execution of a step requires other steps as a basis. It is to be understood that the term "and/or" as used herein relates to and encompasses any and all possible combinations of one or more of the associated listed items.
Figures 1-7 show a specific embodiment of the invention, which uses deep learning to extract multi-scale fusion features of point clouds and predict boundary points and primitive types, and can thus reconstruct mechanical part assembly features better, thereby improving the precision and accuracy of mechanical part assembly.
Specifically, as shown in fig. 1, the feature extraction method for assembling mechanical parts based on deep learning provided in this embodiment is used for feature measurement of mechanical parts, and includes the following steps:
step S1, a scanning platform is established, a 3D laser scanner or a depth camera is used for scanning various mechanical parts, and primitive boundary points and primitive types are marked on scanned point cloud data to form a training data set;
s2, constructing a primitive boundary point detection and primitive type prediction neural network;
s3, training a neural network by adopting a training data set and a cross entropy loss function;
s4, inputting point cloud data of the mechanical parts to be tested into a trained neural network to obtain predicted primitive boundary points and primitive types to which each point belongs;
step S5, all primitive instances are segmented by using a region growing algorithm based on the predicted primitive boundary points and primitive types to which each point belongs;
and S6, carrying out weighted least square fitting on the point cloud of each primitive instance to obtain specific parameters of each primitive instance, namely the assembly characteristics.
In an embodiment, the step S1 specifically includes the following steps:
s101, preparing a CAD model of a mechanical part to be virtually scanned, wherein the model comprises various primitive examples;
s102, aiming at a CAD model, adopting 3D modeling and rendering software Blender to simulate and scan real parts and generate virtual point cloud data;
s103, adding a label to each point in the virtual point cloud to indicate whether the point is a primitive boundary point and the primitive type to which it belongs;
s104, extracting point cloud local structure blocks from the virtual point cloud data to serve as training data of the neural network, and combining the labels to be used for training the neural network to identify primitive boundary points and primitive types.
The local structure block pair (S_l, S_g) comprises a local point cloud structure block S_l and a structure block S_g, where S_l contains fewer points than S_g; S_l is the input of the neural network, and S_g provides a local neighborhood and a global neighborhood for each point in S_l.
A separate validation data set is used to evaluate the performance of the trained model. This embodiment uses 28 curved or planar CAD models containing multiple primitives to generate training data. When each CAD model is scanned, three different levels of Gaussian noise (standard deviations of 0.1%, 0.5% and 1.0%) are added and three different sampling resolutions are set, in order to simulate measurement data in different distribution states as closely as possible. Then, 40 pairs of local structure blocks are extracted from each point cloud as training data. Each pair (S_l, S_g) consists of a local point cloud structure block S_l with fewer points and a larger structure block S_g; S_l is the network input, and S_g provides a local neighborhood and a global neighborhood for each point in S_l. In total, 28 × 3 × 3 × 40 = 10080 point cloud local structure blocks are created for training.
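The virtual scanning step can be illustrated with a minimal NumPy sketch (not the patent's Blender pipeline): area-weighted uniform sampling of a triangle mesh followed by Gaussian measurement noise. All function names here are hypothetical illustrations.

```python
import numpy as np

def sample_mesh(vertices, faces, n_points, rng):
    """Area-weighted uniform sampling of points on a triangle mesh."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    tri = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # Random barycentric coordinates, folded to stay inside the triangle.
    u, v = rng.random((2, n_points))
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    w = 1.0 - u - v
    return w[:, None] * v0[tri] + u[:, None] * v1[tri] + v[:, None] * v2[tri]

def virtual_scan(vertices, faces, n_points, noise_std, rng):
    """Simulated scan: surface samples plus Gaussian measurement noise."""
    pts = sample_mesh(vertices, faces, n_points, rng)
    return pts + rng.normal(scale=noise_std, size=pts.shape)
```

In a full pipeline this would be run once per CAD model, noise level and sampling resolution to build the 10080-block training set.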
The step S2 specifically includes the following steps. The constructed primitive boundary point detection and primitive type prediction neural network analyzes both the local and the global neighborhood information of each data point in the point cloud data, so that the network model can accurately determine whether a given data point is a primitive boundary point;
s201, aiming at each point in a point cloud group, the neural network adopts graph convolution, a multi-layer perceptron and maximum pooling to perform feature coding on a local neighborhood and a global neighborhood of the point cloud group;
s202, further encoding the extracted local features and global features with a Transformer module to obtain fusion features, which further improves the accuracy of primitive boundary point identification;
s203, after the fusion characteristics are obtained, the fusion characteristics are input into a regressor to predict the category of each point.
In practice, if only the local neighborhood information of each point is perceived, neighbors of many primitive boundary points may themselves be mistaken for primitive boundary points. Therefore, for each point p in S_l, a bounding sphere in S_g is used to search for neighboring points and construct the local neighborhood N_l and the global neighborhood N_g. Meanwhile, so that the network model can process data in batches, neighborhoods of the same scale must contain the same number of points: neighborhoods with too few points are padded with origin coordinates, and neighborhoods with too many points are randomly subsampled. According to tests, the sizes of the local and global neighborhoods are set to k_l = 16 and k_g = 128, respectively.
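A minimal sketch of this fixed-size neighborhood construction, following the padding and subsampling rules in the text; a brute-force distance test stands in for a spatial index, and the function name is hypothetical.

```python
import numpy as np

def fixed_size_neighborhood(points, center, radius, k, rng):
    """Bounding-sphere neighborhood forced to exactly k points:
    pad with the origin when too few, randomly subsample when too many."""
    d = np.linalg.norm(points - center, axis=1)
    nbrs = points[d <= radius]
    if len(nbrs) >= k:
        sel = rng.choice(len(nbrs), size=k, replace=False)
        return nbrs[sel]
    pad = np.zeros((k - len(nbrs), points.shape[1]))  # origin padding
    return np.vstack([nbrs, pad])
```

With k = 16 for N_l and k = 128 for N_g, every neighborhood tensor has a fixed shape and can be batched.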
To ensure that point clouds with the same structure at any spatial position give the same primitive detection result, and to make the network easier to train, the local neighborhood center point is moved to the origin. Considering that the global neighborhood is more structured while the local neighborhood is more susceptible to outliers, principal component analysis (PCA) is used to compute the normal of the global neighborhood N_g and construct a local coordinate system from it; N_l and N_g are then rotated so that this normal is aligned with the Z-axis of the local coordinate system.
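The PCA normal estimation and Z-axis alignment can be sketched as follows; this is the standard construction (smallest-variance direction as normal, Rodrigues rotation onto +Z), and the patent's exact frame definition may differ in detail.

```python
import numpy as np

def pca_normal(neigh):
    """Neighborhood normal = direction of smallest variance (classic PCA)."""
    centered = neigh - neigh.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]  # right singular vector of the smallest singular value

def align_to_z(neigh, normal):
    """Rotate a centered neighborhood so `normal` maps onto +Z."""
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, z)
    s, c = np.linalg.norm(v), float(np.dot(normal, z))
    if s < 1e-12:  # already (anti-)parallel to Z
        return neigh if c > 0 else -neigh
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    R = np.eye(3) + vx + vx @ vx * ((1 - c) / s**2)  # Rodrigues' formula
    return neigh @ R.T
```

Applying the same rotation to both N_l and N_g keeps the two neighborhoods consistent.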
The step S3 specifically comprises the following steps:
s301, performing forward propagation on an input sample through a neural network to obtain a prediction probability distribution q of a model;
s302, calculating the cross entropy loss L_b (equation (2)), where p is the probability distribution of the true labels, obtained from the training data set; the sum of L_b and the primitive fitting loss L_f of equation (3) is the total loss L (equation (1));
s303, calculating the gradient of the loss function to the model parameters by using a back propagation algorithm;
s304, selecting an optimizer (such as stochastic gradient descent (SGD) or Adam) to minimize the loss function, updating the parameters of the model according to the gradient;
s305, repeating steps S301-S304 until a stopping condition is reached (e.g. the training rounds reach a predetermined number or the loss function value is sufficiently small).
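Steps S301-S305 can be illustrated with a toy NumPy training loop, using logistic regression as a stand-in for the real point cloud network; the model, optimizer and data here are deliberately minimal illustrations, not the patent's architecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.5, epochs=200):
    """S301-S305 in miniature: forward pass, cross-entropy loss,
    backpropagated gradient, parameter update, repeat until stopping."""
    rng = np.random.default_rng(0)
    w, b = rng.normal(size=X.shape[1]) * 0.01, 0.0
    for _ in range(epochs):                      # S305: stopping condition
        q = sigmoid(X @ w + b)                   # S301: forward propagation
        loss = -np.mean(y * np.log(q + 1e-9)
                        + (1 - y) * np.log(1 - q + 1e-9))  # S302: CE loss
        grad = q - y                             # S303: dL/dz by backprop
        w -= lr * (X.T @ grad) / len(y)          # S304: optimizer update
        b -= lr * grad.mean()
    return w, b, loss
```

The real network replaces the linear model with the graph-convolution/Transformer encoder and the scalar label with per-point boundary and type predictions.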
The present invention defines the network loss function as the sum:
L = L_b + μ·L_f ; (1)
where μ is a balancing factor, empirically set to μ = 0.7, L_b is the primitive boundary point classification loss, and L_f is the primitive fitting loss.
Primitive boundary point classification loss function: since the proportion of primitive boundary points in the complete measurement point cloud is relatively low, a weighted cross entropy loss function is adopted:
L_b = -(1/N) Σ_{i=1..N} [ w_1·y_i·log(q_i) + w_0·(1 - y_i)·log(1 - q_i) ] ; (2)
where w_0 and w_1 are class weights determined by the number of samples in each class, y_i is the point class label (0 or 1), q_i is the predicted probability for each point, and N is the number of points in the point cloud.
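A small NumPy sketch of the weighted cross entropy of equation (2). The patent only says the class weights are "determined by the number of samples", so inverse-class-frequency weights are assumed here as one common choice.

```python
import numpy as np

def weighted_bce(y, q, eps=1e-9):
    """Weighted binary cross-entropy: class weights inversely
    proportional to class frequency (an assumed weighting scheme)."""
    n = len(y)
    n1 = max(y.sum(), 1)               # number of boundary points
    w1 = n / (2.0 * n1)                # up-weight the rare boundary class
    w0 = n / (2.0 * max(n - n1, 1))
    return -np.mean(w1 * y * np.log(q + eps)
                    + w0 * (1 - y) * np.log(1 - q + eps))
```

With this weighting, misclassifying the single rare boundary point costs as much as misclassifying every non-boundary point, which is the purpose of the weights.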
Primitive fitting loss function: the fitting loss is expressed as the difference between the fitted circle parameters (c, n, r) and the corresponding ground-truth values (c*, n*, r*):
L_f = (1/k) Σ_{i=1..k} ( ω_1·||c_i - c_i*|| + ω_2·||n_i - n_i*|| + ω_3·|r_i - r_i*| ) ; (3)
where ω_1, ω_2 and ω_3 are weight factors, set to 0.1, 0.1 and 0.8 respectively according to repeated trials, and k is the number of primitives.
S4, inputting the point cloud of the mechanical part to be measured into the network to obtain the predicted primitive boundary points and the primitive type to which each point belongs, shown in fig. 5 and fig. 6 respectively;
as shown in fig. 7, S5 segments all primitive instances using a region growing algorithm based on the predicted primitive boundary points and per-point primitive types;
Exploiting the fact that the normal vectors of triangular faces on the same surface patch of a part are essentially identical, an arbitrary triangular face is taken as the initial seed face and its unit normal vector as the judging condition. The faces sharing an edge with the seed face are its adjacent faces; the angle between their unit normal vectors is computed, and an angle threshold α is set as the growing condition: if the angle is smaller than α, the two triangular faces belong to the same patch set; if it is greater than α, growth in that direction stops. In this way, point-by-point normal estimation and proximity checks are avoided.
The step S5 specifically includes the following steps:
s501, arbitrarily selecting a point p as a seed point from the point cloud group P formed by the predicted primitive boundary points and the primitive types to which each point belongs;
s502, searching the neighboring points of the seed point p to obtain a neighboring point set N(p);
S503, for each point p_i in the neighboring point set N(p): if p_i is not a primitive boundary point, clustering p_i with the seed point p, taking p_i as a new seed point, and repeating step S502;
s504, if unclustered points other than primitive boundary points remain, repeating step S501 until all points are clustered, which yields the segmented point cloud groups Q, i.e. all primitive instances.
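Steps S501-S504 can be sketched as a flood fill in which predicted boundary points act as barriers between primitive instances; the search radius and brute-force neighbor lookup are illustrative choices, not the patent's.

```python
import numpy as np

def region_grow(points, is_boundary, radius):
    """S501-S504: flood-fill clustering of non-boundary points;
    primitive boundary points (label -1) separate primitive instances."""
    labels = np.full(len(points), -1)
    current = 0
    for start in range(len(points)):
        if is_boundary[start] or labels[start] != -1:
            continue
        labels[start] = current
        stack = [start]                                   # S501: seed point
        while stack:
            seed = stack.pop()
            d = np.linalg.norm(points - points[seed], axis=1)
            for j in np.nonzero(d <= radius)[0]:          # S502: neighbors
                if not is_boundary[j] and labels[j] == -1:  # S503
                    labels[j] = current
                    stack.append(j)                       # new seed point
        current += 1                                      # S504: next region
    return labels
```

Each resulting label corresponds to one primitive instance in the segmented point cloud group Q.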
Step S6, carrying out weighted least square fitting on the point cloud of each primitive instance to obtain specific parameters of each primitive instance, namely assembly characteristics;
the conventional LS fitting method generally minimizes the sum of squares of errors:
in the method, in the process of the invention,r is the primitive radius, q j Is the primitive boundary point p j Projection point on a plane defined by normal n and the mean value of the element boundary points, c being the center of a circle,/->Representing an absolute value operation. When noise and outliers are contained in the measurement points, the fitting accuracy of the above equation is severely reduced. To solve this problem, a weighted least squares fitting method may be employed:
E_w = Σ_j w_j ( ||q_j - c|| - r )²
where w_j represents the weight of q_j. Since this is a nonlinear least squares problem, it has no closed-form solution, and the following linearized alternative is therefore solved:
in the method, in the process of the invention,by adding->Is changed into->And let->And->Obtaining:
the reuse matrix form is expressed as:
in the method, in the process of the invention,is a diagonal matrix>Is a column vector a j Matrix of->,/>The values of the original variables c and r can also be calculated directly:
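The linearized closed-form solution can be sketched with NumPy; this is a 2-D simplification (the patent first projects the 3-D boundary points onto a plane, after which the fit is planar).

```python
import numpy as np

def weighted_circle_fit(q, w):
    """Linearized weighted least-squares circle fit:
    solve x = (A^T W A)^-1 A^T W b with x = [c, beta], beta = c.c - r^2."""
    # Residual ||q_j - c||^2 - r^2 = a_j^T x - b_j is linear in x.
    A = np.hstack([-2.0 * q, np.ones((len(q), 1))])
    b = -np.sum(q * q, axis=1)
    W = np.diag(w)
    x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
    c, beta = x[:-1], x[-1]
    r = np.sqrt(c @ c - beta)
    return c, r
```

Down-weighting a point (w_j near zero) removes its influence on the fit, which is exactly how the learned weights suppress noise and outliers.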
however, for weighted least squares fitting, how to design the metric function, computing the weight matrix W is a challenge. The algorithm directly adopts a large number of training samples and adopts a network model to learn weights.
Specifically, the fusion features output by the feature fusion module are input to a multi-layer perceptron to further extract point cloud depth features. The output values are then limited by a Softmax activation function to weights between 0 and 1, which measure the fitting contribution of each point. Finally, the diagonal weight matrix W is constructed to solve the weighted least squares fitting problem:
in the method, in the process of the invention,is a fusion feature->Is a very small constant to ensure stable valueTo avoid the occurrence of zero-matrix conditions.
The step S6 specifically comprises the following steps:
s601, inputting the primitive instances segmented in step S5 into a multi-layer perceptron to further extract the depth features of the point cloud, whose derived weights measure the fitting contribution of each point;
s602, limiting a point cloud depth characteristic value output by a multi-layer perceptron to a weight value ranging from 0 to 1 by adopting a Softmax activation function;
and S603, carrying out weighted least square fitting according to the point cloud depth characteristic value and the corresponding weight thereof to obtain specific parameters of each primitive instance, namely the assembly characteristic.
In the weighted least squares fitting problem, a diagonal weight matrix W is used to give each data point (or primitive instance) a different weight to better fit the model.
Specifically, the point cloud depth feature values output by the MLP are normalized by the Softmax activation function to ensure that they all lie in the range 0 to 1. This step converts the feature values into weights that measure the fitting contribution of each point: the Softmax activation function turns the feature values into a probability distribution that better represents the relative importance of each point in the fit.
A weighted least squares fit is then performed based on the point cloud depth feature values (already limited to between 0 and 1 by the Softmax activation function) and the corresponding diagonal weight matrix W. The fitting contribution of each primitive instance is thus adjusted by its weight, which is the key step that adapts the contribution of each point instead of applying the same weight to all points. Specifically, the diagonal weight matrix W assigns different importance to different primitive instances according to their point cloud depth feature values. After fitting, the specific parameters of each primitive instance are obtained; these parameters constitute the assembly features.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution; this manner of description is adopted for clarity only, and the technical solutions in the embodiments may be combined as appropriate by those skilled in the art to form other implementations.

Claims (7)

1. The mechanical part assembly characteristic measurement method based on the point cloud deep learning is characterized by comprising the following steps of:
step S1, a scanning platform is established, various mechanical parts are scanned, primitive boundary points and primitive types are marked on scanned point cloud data, and a training data set is formed;
s2, constructing a primitive boundary point detection and primitive type prediction neural network;
s3, training a neural network by adopting a training data set and a cross entropy loss function;
s4, inputting point cloud data of the mechanical parts to be tested into a trained neural network to obtain predicted primitive boundary points and primitive types to which each point belongs;
step S5, all primitive instances are segmented by using a region growing algorithm based on the predicted primitive boundary points and primitive types to which each point belongs;
and S6, carrying out weighted least square fitting on the point cloud of each primitive instance to obtain specific parameters of each primitive instance, namely the assembly characteristics.
2. The method for measuring the assembly characteristics of mechanical parts based on the point cloud deep learning according to claim 1, wherein the step S1 specifically comprises the following steps:
s101, preparing a CAD model of a mechanical part to be virtually scanned, wherein the model comprises various primitive instances;
s102, aiming at a CAD model, adopting Blender to simulate and scan real parts and generate virtual point cloud data;
s103, adding a label to each point in the virtual point cloud to indicate whether the point is a primitive boundary point and the primitive type to which it belongs;
s104, extracting point cloud local structure blocks from the virtual point cloud data to serve as training data of the neural network, and combining the labels to be used for training the neural network to identify primitive boundary points and primitive types.
3. The method for measuring the assembly characteristics of mechanical parts based on point cloud deep learning as claimed in claim 2, wherein the local structure block comprises a local point cloud structure block P_l and a structure block P_g, wherein the number of points in P_l is less than the number of points in P_g; P_l serves as the input of the neural network, and P_g provides a local neighborhood and a global neighborhood for P_l.
4. The method for measuring the assembly characteristics of mechanical parts based on point cloud deep learning according to claim 3, wherein the neural network in step S2 applies graph convolution, a multi-layer perceptron and max pooling to each point in the point cloud group to encode the features of the local neighborhood and the global neighborhood; the local features and the global features are further encoded by a Transformer module to obtain fused features; the fused features are then input into a regressor to predict the category of each point.
5. The method for measuring the assembly characteristics of mechanical parts based on the point cloud deep learning according to claim 1, wherein the step S3 specifically comprises the following steps:
s301, performing forward propagation on an input sample through a neural network to obtain a prediction probability distribution q of a model;
s302, calculating the cross entropy loss L = -Σ p(x) log q(x), where p is the probability distribution of the real labels, obtained from the training dataset, and the sum runs over all classes x;
s303, calculating the gradient of the loss function to the model parameters by using a back propagation algorithm;
s304, the optimizer minimizes a loss function, and updates parameters of the model according to the gradient;
s305, repeating the steps S301-S304 until the stopping condition is reached.
6. The method for measuring the assembly characteristics of mechanical parts based on the point cloud deep learning according to claim 1, wherein the step S5 specifically comprises the following steps:
s501, arbitrarily selecting a point p as a seed point from the point cloud group P formed by the predicted primitive boundary points and the primitive type to which each point belongs;
s502, searching the neighboring points of the seed point p to obtain a neighboring point set N(p);
s503, for each point p_i in the neighboring point set N(p), if p_i is not a primitive boundary point, clustering the neighboring point p_i with the seed point p, taking the neighboring point p_i as a new seed point, and repeating step S502;
s504, if, apart from the primitive boundary points, there remain points that are not clustered, repeating step S501 until all points are clustered, thereby obtaining the segmented point cloud groups Q, namely all the primitive instances.
7. The method for measuring the assembly characteristics of mechanical parts based on the point cloud deep learning according to claim 1, wherein the step S6 specifically comprises the following steps:
s601, inputting the primitive instance segmented in the S5 into a multi-layer perceptron, and further extracting the depth characteristics of the point cloud;
s602, limiting a point cloud depth characteristic value output by a multi-layer perceptron to a weight value ranging from 0 to 1 by adopting a Softmax activation function;
and S603, carrying out weighted least square fitting according to the point cloud depth characteristic value and the corresponding weight thereof to obtain specific parameters of each primitive instance, namely the assembly characteristic.
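The region-growing segmentation of claim 6 (steps S501-S504) can be sketched as follows. The fixed `radius` threshold and the brute-force neighbor search are assumptions for illustration only, since the patent does not specify how neighbors are found; a k-d tree would typically replace the brute-force distance computation in practice.

```python
from collections import deque
import numpy as np

def region_grow(points, is_boundary, radius):
    """Segment primitive instances by region growing (steps S501-S504).

    `points` is an (N,3) array; `is_boundary` is the per-point boundary
    flag predicted by the neural network. Boundary points act as walls:
    growth never crosses them, so each flood fill yields one primitive
    instance. Returns a per-point instance label (-1 = boundary point).
    """
    labels = np.full(len(points), -1)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1 or is_boundary[seed]:
            continue  # S501: pick an unlabelled, non-boundary seed point
        queue = deque([seed])
        labels[seed] = current
        while queue:
            p = queue.popleft()
            # S502: find neighbors of the current seed point
            dists = np.linalg.norm(points - points[p], axis=1)
            for q in np.where(dists < radius)[0]:
                # S503: cluster non-boundary neighbors with the seed and
                # reuse each of them as a new seed
                if labels[q] == -1 and not is_boundary[q]:
                    labels[q] = current
                    queue.append(q)
        current += 1  # S504: repeat until all points are clustered
    return labels
```

Each distinct label in the result corresponds to one primitive instance of the point cloud group Q, which is then passed to the weighted least-squares fitting of claim 7.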
CN202410007187.1A 2024-01-03 2024-01-03 Point cloud deep learning-based mechanical part assembly feature measurement method Pending CN117495868A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410007187.1A CN117495868A (en) 2024-01-03 2024-01-03 Point cloud deep learning-based mechanical part assembly feature measurement method


Publications (1)

Publication Number Publication Date
CN117495868A (en) 2024-02-02

Family

ID=89683431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410007187.1A Pending CN117495868A (en) 2024-01-03 2024-01-03 Point cloud deep learning-based mechanical part assembly feature measurement method

Country Status (1)

Country Link
CN (1) CN117495868A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110120097A (en) * 2019-05-14 2019-08-13 南京林业大学 Airborne cloud Semantic Modeling Method of large scene
CN111815776A (en) * 2020-02-04 2020-10-23 山东水利技师学院 Three-dimensional building fine geometric reconstruction method integrating airborne and vehicle-mounted three-dimensional laser point clouds and streetscape images
CN115409886A (en) * 2022-11-02 2022-11-29 南京航空航天大学 Part geometric feature measuring method, device and system based on point cloud


Non-Patent Citations (2)

Title
Zhao Hui et al., "Computer Graphics: Preliminary Theory and Implementation of 3D Model Processing Algorithms, C# Edition", 31 October 2014, China Ocean Press, pages 189-195 *
Chen Honghua et al., "Multi-circular-hole primitive extraction method for aircraft surfaces based on 3D point cloud deep learning", Journal of Mechanical Engineering, 31 July 2022 (2022-07-31), pages 190-199 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination