CN116645370A - Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data - Google Patents


Info

Publication number
CN116645370A
Authority
CN
China
Prior art keywords
clustering
region
image
point cloud
cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310926708.9A
Other languages
Chinese (zh)
Other versions
CN116645370B (en)
Inventor
袭瑞江
张扬
王永涛
袭英超
李浩
董国强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Shunfa Heavy Industry Co ltd
Original Assignee
Shandong Shunfa Heavy Industry Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Shunfa Heavy Industry Co ltd filed Critical Shandong Shunfa Heavy Industry Co ltd
Priority to CN202310926708.9A priority Critical patent/CN116645370B/en
Publication of CN116645370A publication Critical patent/CN116645370A/en
Application granted granted Critical
Publication of CN116645370B publication Critical patent/CN116645370B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Radiation Pyrometers (AREA)

Abstract

The application relates to the technical field of data processing, and in particular to a method for real-time detection of surface defects on high-temperature forgings based on three-dimensional point cloud data. Three-dimensional point cloud data of the high-temperature forging are acquired at a plurality of view angles, the data for each view angle are converted into a two-dimensional image, and images at the other view angles are derived from each two-dimensional image, yielding visual image pairs. A sampling interval is then determined by comparison within each visual image pair. Because each visual image pair contains several images of the same view angle, errors caused by differing views are avoided; at the same time, each derived image carries information from the image at its source position, so crack defects that are visible at some view angles but indistinct at others can be detected accurately. This ensures that the resulting sampling interval is reasonable and reliable, and improves detection accuracy.

Description

Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data
Technical Field
The application relates to the technical field of data processing, in particular to a high-temperature forging surface defect real-time detection method based on three-dimensional point cloud data.
Background
High-temperature forgings undergo substantial deformation and stress during manufacturing, so their surfaces are prone to cracking. These cracks may be fine microcracks or obvious tears, and they have a large impact on the strength and durability of the forging. To make use of computer vision and image processing techniques for data analysis, the three-dimensional point cloud data usually need to be converted into image data. In practice, however, point cloud density tends to be non-uniform, so an improper sampling interval leads to sampling that is either too sparse or too dense, degrading the subsequent defect detection.
Disclosure of Invention
To address this, the application provides a method for real-time detection of surface defects on high-temperature forgings based on three-dimensional point cloud data, so as to solve the above technical problem.
The method for detecting the surface defects of the high-temperature forging piece in real time based on the three-dimensional point cloud data comprises the following steps:
acquiring three-dimensional point cloud data of the high-temperature forging under a plurality of view angles;
converting the three-dimensional point cloud data corresponding to each view angle into a two-dimensional image;
respectively deriving and generating images of a plurality of other visual angles for each two-dimensional image to obtain a plurality of visual image pairs; each of said visual image pairs comprising a plurality of images of the same perspective derived from different ones of said two-dimensional images;
clustering each image in the visual image pair to obtain a clustering area of each image;
for each clustering region, determining a position-adjacent image of the image to which the clustering region belongs, determining from the position-adjacent image a clustering region that intersects the clustering region, taking it as the connected clustering region corresponding to the clustering region, and calculating the intersection ratio between the clustering region and the connected clustering region; the position-adjacent image of an image is the image in the visual image pair whose corresponding shooting view angle is adjacent to that of the image;
determining an abnormality degree value of each clustering region based on the intersection ratio corresponding to each clustering region;
determining an abnormal region corresponding to the high-temperature forging under each view angle according to the abnormal degree value of each clustering region;
determining abnormal three-dimensional point cloud data corresponding to the abnormal region from the three-dimensional point cloud data, and determining a sampling interval based on the abnormal three-dimensional point cloud data;
and sampling the three-dimensional point cloud data according to the sampling interval, and determining a defect area on the surface of the high-temperature forging piece according to the sampled data.
In one embodiment, the acquiring three-dimensional point cloud data of the high temperature forging at a plurality of view angles includes:
acquiring three-dimensional point cloud data of the high-temperature forging by rotating the shooting view angle of a rotary three-dimensional laser scanner, thereby obtaining three-dimensional point cloud data at a plurality of view angles.
In one embodiment, the clustering the images in the visual image pair to obtain clustered regions of the images includes:
and clustering each image in the visual image pair through a density clustering algorithm to obtain a clustering region of each image.
In one embodiment, there are a plurality of position-adjacent images for the image to which each clustering region belongs, and determining the abnormality degree value of each clustering region based on the intersection ratios corresponding to the clustering region includes:
for each clustering region, determining a target connected clustering region corresponding to the clustering region from each position-adjacent image according to the intersection ratios between the clustering region and the connected clustering regions in that position-adjacent image;
taking the set formed by the intersection ratios between the clustering region and the target connected clustering regions as the intersection-ratio sequence corresponding to the clustering region; the intersection ratios in the sequence are ordered according to their corresponding shooting view angles;
and calculating the abnormality degree value of the clustering region according to the minimum value in the intersection-ratio sequence corresponding to the clustering region.
In one embodiment, for each clustering region, determining the target connected clustering region corresponding to the clustering region from each position-adjacent image according to the intersection ratios between the clustering region and the connected clustering regions in that image includes:
taking the connected clustering region in the position-adjacent image having the largest intersection ratio with the clustering region as the target connected clustering region corresponding to the clustering region in that image.
In one embodiment, calculating the abnormality degree value of each clustering region according to the minimum value in the intersection-ratio sequence corresponding to the clustering region includes:
calculating the abnormality degree value of the clustering region according to the formula
Y = (1 − min(U)) × pl × pr
wherein Y represents the abnormality degree value of the clustering region, U represents the intersection-ratio sequence of the clustering region, min(U) represents the minimum value in the intersection-ratio sequence, l represents the first maximum principal component direction of the left forming sequence of the intersection-ratio sequence, pl represents the proximity of the first maximum principal component direction to 90°, r represents the second maximum principal component direction of the right forming sequence of the intersection-ratio sequence, and pr represents the proximity of the second maximum principal component direction to −90°;
the left forming sequence is the sequence formed by the elements to the left of the maximum intersection ratio in the intersection-ratio sequence, and the right forming sequence is the sequence formed by the elements to its right;
for the first maximum principal component direction, the first coordinate data of the left forming sequence are taken as input, the first eigenvalues corresponding to the first coordinate data are calculated by a PCA algorithm, and the direction corresponding to the largest first eigenvalue is the first maximum principal component direction; the abscissa of each first coordinate datum is the sequence index of the corresponding element in the left forming sequence, and the ordinate is that element's value;
for the second maximum principal component direction, the second coordinate data of the right forming sequence are taken as input, the second eigenvalues corresponding to the second coordinate data are calculated by a PCA algorithm, and the direction corresponding to the largest second eigenvalue is the second maximum principal component direction; the abscissa of each second coordinate datum is the sequence index of the corresponding element in the right forming sequence, and the ordinate is that element's value.
In one embodiment, determining the abnormal region corresponding to the high-temperature forging at each view angle according to the abnormality degree values of the clustering regions includes:
when the abnormality degree value of a clustering region in an image is larger than a preset degree threshold, taking that clustering region as an abnormal region of the high-temperature forging at the view angle corresponding to the image;
or, alternatively,
for each visual image pair, taking the clustering region corresponding to the minimum of the abnormality degree values within that pair as the abnormal region at the view angle corresponding to the pair.
In one embodiment, determining the abnormal three-dimensional point cloud data corresponding to the abnormal regions from the three-dimensional point cloud data and determining the sampling interval based on the abnormal three-dimensional point cloud data includes:
constructing a corresponding TIN (triangulated irregular network) structure based on the abnormal three-dimensional point cloud data of each abnormal region, each node in the TIN structure being a point of the abnormal three-dimensional point cloud data;
calculating a distance reference value of the corresponding abnormal region based on each TIN structure;
and calculating the sampling interval based on the distance reference values corresponding to the abnormal regions.
In one embodiment, calculating the distance reference value of the corresponding abnormal region based on each TIN structure includes:
calculating the average of the edge lengths of each TIN structure and taking the average as the distance reference value of the corresponding abnormal region.
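The distance reference value can be sketched as follows, assuming the TIN over the region's abnormal points is already available as index triples (e.g. from a Delaunay triangulation): collect the unique triangle edges, measure them, and average. The point set below is illustrative.

```python
import math

def mean_edge_length(points, triangles):
    """Average length of the unique edges of a TIN given as index triples."""
    edges = set()
    for a, b, c in triangles:
        for i, j in ((a, b), (b, c), (c, a)):
            edges.add((min(i, j), max(i, j)))   # dedupe edges shared by triangles
    total = sum(math.dist(points[i], points[j]) for i, j in edges)
    return total / len(edges)

# one triangle with edge lengths 3, 4 and 5
pts = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 4.0, 0.0)]
ref = mean_edge_length(pts, [(0, 1, 2)])   # distance reference value
```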
In one embodiment, calculating the sampling interval based on the distance reference values corresponding to the abnormal regions includes:
taking the minimum value among the distance reference values corresponding to the abnormal regions as the sampling interval.
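The final two steps can be sketched together: the sampling interval is the minimum of the per-region distance reference values, and the point cloud is then resampled at that spacing. Voxel-grid resampling (one representative point per occupied cell) is used here as one common, assumed choice; the numbers are illustrative.

```python
def sampling_interval(reference_values):
    """Minimum of the per-abnormal-region distance reference values."""
    return min(reference_values)

def voxel_sample(points, interval):
    """Keep one point per voxel of side `interval` (first point wins)."""
    kept = {}
    for p in points:
        key = tuple(int(c // interval) for c in p)
        kept.setdefault(key, p)
    return list(kept.values())

refs = [2.0, 0.5, 1.25]                      # distance reference values
interval = sampling_interval(refs)           # 0.5
sampled = voxel_sample(
    [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (0.9, 0.1, 0.1)], interval)
```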
The application provides a method for real-time detection of surface defects on high-temperature forgings based on three-dimensional point cloud data. Three-dimensional point cloud data of the high-temperature forging are acquired at a plurality of view angles, and the data for each view angle are converted into a two-dimensional image. For each two-dimensional image, images at several other view angles are derived, yielding a plurality of visual image pairs, and each image in the visual image pairs is clustered to obtain its clustering regions. For each clustering region, the position-adjacent images of the image to which it belongs are determined, the connected clustering regions intersecting it are found in those images, and the intersection ratios between the clustering region and its connected clustering regions are calculated. An abnormality degree value is determined for each clustering region from these intersection ratios, the abnormal regions of the forging at each view angle are determined from the abnormality degree values, the abnormal three-dimensional point cloud data corresponding to the abnormal regions are identified, and a sampling interval is determined from them. The point cloud is sampled at that interval, and the defect regions on the surface of the forging are determined from the sampled data. Because multi-view images are derived from two-dimensional images at different positions and the sampling interval is determined by comparison within the visual image pairs, and because each pair contains several images of the same view angle, errors caused by differing views are avoided. At the same time, each derived image carries information from the image at its source position, so crack defects that are visible at some view angles but indistinct at others can be detected accurately, ensuring that the resulting sampling interval is reasonable and reliable and improving detection accuracy.
Drawings
FIG. 1 is a flow diagram of a method for detecting surface defects of a high-temperature forging based on three-dimensional point cloud data in real time, which is provided by the embodiment of the application;
FIG. 2 is a schematic diagram of a rotary three-dimensional laser scanner provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of a clustering result provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of another clustering result provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of the superposition of two clustering results provided in an embodiment of the present application;
FIG. 6 is a flowchart of calculating an anomaly degree value according to an embodiment of the present application;
FIG. 7 is a flowchart of calculating a sampling interval according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, the application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are for purposes of illustration only and are not intended to limit the scope of the application.
To solve the prior-art problem that, when detecting defects on the surface of a high-temperature forging, an improper sampling interval makes the sampling either too sparse or too dense and thereby affects subsequent detection, an embodiment of the application provides a method for real-time detection of surface defects on high-temperature forgings based on three-dimensional point cloud data. As shown in FIG. 1, the method may include the following steps:
s101: and acquiring three-dimensional point cloud data of the high-temperature forging under a plurality of view angles.
S102: and converting the three-dimensional point cloud data corresponding to each view angle into a two-dimensional image.
S103: respectively deriving and generating images of a plurality of other visual angles aiming at each two-dimensional image to obtain a plurality of visual image pairs; each visual image pair contains a plurality of images of the same view angle derived from different two-dimensional images.
S104: and clustering the images in the visual image pair to obtain clustering areas of the images.
S105: for each clustering region, determining a position adjacent image of an image to which the clustering region belongs, determining a communication clustering region with an intersection with the clustering region from the position adjacent image, and calculating an intersection ratio between the clustering region and the communication clustering region; the position adjacent image of the image is an image corresponding to a shooting view angle adjacent to the shooting view angle corresponding to the image in the visual image pair to which the image belongs.
S106: and determining the abnormality degree value of each clustering area based on the corresponding intersection ratio of each clustering area.
S107: and determining the abnormal region corresponding to the high-temperature forging under each visual angle according to the abnormal degree value of each clustering region.
S108: and determining abnormal three-dimensional point cloud data corresponding to the abnormal region from the three-dimensional point cloud data, and determining a sampling interval based on the abnormal three-dimensional point cloud data.
S109: and sampling the three-dimensional point cloud data according to the sampling interval, and determining a defect area on the surface of the high-temperature forging piece according to the sampled data.
The process involved in each of the above steps will be described in detail below.
Taking crack defects as an example: because different cracks run in different directions, a three-dimensional laser scanner with a single shooting direction cannot necessarily capture the complete crack information. In step S101, therefore, three-dimensional point cloud data at multiple view angles can be obtained by acquiring point cloud data of the high-temperature forging with three-dimensional laser scanners arranged at multiple positions. In other embodiments, the three-dimensional point cloud data may be acquired by other sensors. It will be appreciated that, to obtain point cloud data at more view angles with fewer scanners, the scanner at each position can be rotated, acquiring point cloud data of the forging after each rotation by a preset angle.
Preferably, to save cost, a single three-dimensional laser scanner can be arranged, and the point cloud data of the high-temperature forging are acquired by rotating the scanner's shooting view angle, yielding point cloud data at a plurality of view angles. For example, the scanner can be rotated by 90° in the direction shown in FIG. 2 and then scanned again, giving point cloud data at different view angles.
When a plurality of three-dimensional laser scanners are arranged, each scanner acquires point cloud data within its own shooting view angle, and these data are converted into a two-dimensional image, so that a two-dimensional image is obtained for the point cloud data of each shooting view angle. When a single scanner is arranged, point cloud data, and hence a corresponding two-dimensional image, are obtained for each of its shooting view angles. In step S102, therefore, a two-dimensional image corresponding to each view angle is obtained. For the conversion itself, existing methods may be used: for example, the three-dimensional points can be projected onto a two-dimensional plane to obtain the corresponding image, which is not described in detail here.
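The projection step can be sketched as follows, as a hypothetical minimal choice rather than the patent's exact conversion: points (x, y, z) are rasterized onto an x-y grid, each pixel storing the maximum z of the points falling into it. Grid size and cell size are illustrative assumptions; averaging per cell would be an equally valid variant.

```python
def project_to_image(points, cell=1.0, width=4, height=4):
    """Project (x, y, z) points into a width x height depth-style image.

    Each pixel holds the maximum z of the points landing in it
    (0.0 where no point lands).
    """
    image = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        col = int(x // cell)
        row = int(y // cell)
        if 0 <= row < height and 0 <= col < width:
            image[row][col] = max(image[row][col], z)
    return image

cloud = [(0.2, 0.3, 5.0), (0.7, 0.1, 7.0), (2.5, 3.5, 2.0)]
img = project_to_image(cloud)
```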
For the same surface of the high-temperature forging, the point cloud data collected by the same three-dimensional laser scanner at different shooting angles differ. A neural radiance field can generate images at a plurality of different view angles from an image taken in a single shooting direction; those images are obtained from the information in the image of that shooting direction, i.e., they are derived from it.
For a given area on the surface of the forging, if the area has no crack defect, the derived images at the same view angle obtained from the scanners of the several shooting directions (each shooting direction contributing one derived image at that view angle) are highly similar. Because of the width and depth of a crack, however, the crack information may be missed in one shooting direction yet captured in another, so subsequent calculation can be performed by comparing the derived images converted to the same view angle.
In step S103, images at several other view angles, referred to as derived images for short, may be derived for each two-dimensional image, yielding a plurality of visual image pairs; each visual image pair contains several images of the same view angle derived from different two-dimensional images.
In one embodiment, each visual image pair may consist of a two-dimensional image together with the derived images of the same view angle. In other embodiments, each visual image pair may consist only of derived images of the same view angle.
It will be appreciated that each derived image is derived from a two-dimensional image at another view angle. Suppose two-dimensional images of the high-temperature forging at the C1, C2, C3, and C4 view angles are obtained with the three-dimensional laser scanner; from the two-dimensional image at each view angle, derived images at the other view angles can be generated. For example, from the two-dimensional image at the C2 view angle, derived images at the C1, C3, and C4 view angles can be generated. Performing this processing on every two-dimensional image yields the visual image pairs corresponding to the C1, C2, C3, and C4 view angles.
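The grouping described above can be sketched with simple bookkeeping. In the full method each derived image would be rendered by a view-synthesis model; here a derived "image" is just a hypothetical (target_view, source_view) tag, so only the pairing logic is shown.

```python
views = ["C1", "C2", "C3", "C4"]

def build_visual_image_pairs(views):
    """Group derived images of the same target view into visual image pairs."""
    pairs = {v: [] for v in views}
    for source in views:
        for target in views:
            if target != source:
                # placeholder for: derive the image of `target` view from
                # the two-dimensional image captured at `source` view
                pairs[target].append((target, source))
    return pairs

pairs = build_visual_image_pairs(views)
```

With four capture views, each visual image pair holds three derived images of its view angle, one per other source view.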
Because each two-dimensional image is obtained by the scanner at its corresponding shooting view angle, each derived image also corresponds to one shooting view angle; that is, every image in a visual image pair has a corresponding shooting view angle. In the example above, the derived images of the C3 and C4 view angles generated from the two-dimensional image of the C2 view angle both have the C2 view angle as their shooting view angle, since they are derived from that image. To facilitate subsequent calculation, the images in each visual image pair may be arranged according to their corresponding shooting view angles, i.e., the shooting view angles of the three-dimensional laser scanner.
For example, after the point cloud data of the high-temperature forging at a plurality of view angles are obtained, the point cloud data corresponding to each surface of the forging can be sampled by a uniform sampling method and then converted into two-dimensional images, so that each surface has a corresponding two-dimensional image, each coming from a three-dimensional laser scanner with a different shooting view angle. In step S103, images at several other view angles, called derived images, can then be obtained for each two-dimensional image by a neural radiance field method. Derived images of the same view angle that come from different two-dimensional images are taken together as one visual image pair, so each visual image pair contains several derived images of the same view angle.
In step S104, each image in the visual image pair may be clustered by a density clustering algorithm to obtain the clustering regions of each image. It should be noted that, in other embodiments, other clustering algorithms may also be used.
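A minimal density-clustering sketch in the DBSCAN style is shown below, standing in for the density clustering algorithm the step calls for; the `eps` and `min_pts` values and the toy 2D points are illustrative assumptions. In practice a library implementation (e.g. scikit-learn's DBSCAN) would typically be run on the pixel coordinates of each image.

```python
def dbscan(points, eps, min_pts):
    """Return one label per point; -1 marks noise."""
    labels = [None] * len(points)

    def neighbors(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xi - xj) ** 2 + (yi - yj) ** 2 <= eps ** 2]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1               # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:                     # expand the cluster from core points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border point reachable from a core
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_seeds = neighbors(j)
            if len(j_seeds) >= min_pts:
                queue.extend(j_seeds)
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(pts, eps=2.0, min_pts=2)
```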
For a given area of the forging surface, if the area has no crack defect, the different images in the visual image pair should be similar. Scanners at different positions measure the distance from the scanner to the object surface, so even for the same surface, the height values of the point cloud data differ between scanners and cannot be compared directly. In the embodiment of the application, therefore, the three-dimensional point cloud data are not compared directly; instead, the clustering results of the images within each visual image pair are compared. Because the images in a visual image pair correspond to the same view, i.e., they measure the same surface, the clustering results are only slightly affected: the clustering result of one image in the pair is similar to that of another, avoiding the situation in which raw values cannot be compared directly because they differ.
It can be understood that the smaller the overlap of same-position areas between two images in the same visual image pair, the greater the corresponding degree of abnormality. In the embodiment of the application, cracks can therefore be analyzed through the magnitude of the intersection ratio at adjacent view angles.
For ease of understanding, step S105 is described herein with a specific example.
Assume that after step S104 the clustering result maps shown in FIG. 3 and FIG. 4 are obtained; FIG. 3 and FIG. 4 are the clustering result maps of image A and image B respectively, where image A and image B belong to the same visual image pair and the shooting view angle of image A is adjacent to that of image B, i.e., image B is a position-adjacent image of image A. Area a is one clustering region of image A and area b is one clustering region of image B. Superimposing image A and image B as shown in FIG. 5, the connected clustering region of each superimposed clustering region can be obtained; for example, the middle part of FIG. 5 is the intersection between clustering region a and clustering region b, so for clustering region a, region b can be called the connected clustering region of region a. In step S105, the intersection ratio between clustering region a and connected clustering region b can then be calculated.
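The intersection ratio (intersection over union) of the example above can be sketched directly, with each clustering region represented as a set of (row, col) pixel coordinates; the two small regions below are illustrative.

```python
def intersection_ratio(region_a, region_b):
    """IoU of two clustering regions given as sets of pixel coordinates."""
    inter = len(region_a & region_b)
    union = len(region_a | region_b)
    return inter / union if union else 0.0

region_a = {(0, 0), (0, 1), (1, 0), (1, 1)}   # clustering region a
region_b = {(1, 0), (1, 1), (2, 0), (2, 1)}   # its connected clustering region b
iou = intersection_ratio(region_a, region_b)  # 2 shared pixels / 6 total
```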
It should be noted that, in general, the image to which each clustering region belongs has a plurality of position adjacent images. In this case, referring to fig. 6, step S106 may include the following sub-steps:
S1061: for each clustering region, determine the target connected clustering region corresponding to the clustering region in each position adjacent image, according to the intersection ratio between the clustering region and each connected clustering region in that position adjacent image.
S1062: take the set formed by the intersection ratios between the clustering region and the plurality of target connected clustering regions as the intersection ratio sequence corresponding to the clustering region; the intersection ratios in the sequence are arranged in order of their corresponding shooting view angles.
S1063: calculate the abnormality degree value of the clustering region according to the minimum value in the intersection ratio sequence corresponding to the clustering region.
For step S1061, when at least two connected clustering regions correspond to the clustering region in a certain position adjacent image, the intersection ratio between the clustering region and each of those connected clustering regions may be calculated, and the connected clustering region with the largest intersection ratio may be taken as the target connected clustering region corresponding to the clustering region in that position adjacent image. When only one connected clustering region corresponds to the clustering region in a position adjacent image, that connected clustering region can be used directly as the target connected clustering region.
Continuing the above example, if other connected clustering regions corresponding to clustering region a exist on image B, the intersection ratio between clustering region a and each connected clustering region in image B may be calculated, and the connected clustering region with the maximum intersection ratio is taken as the target connected clustering region corresponding to clustering region a.
When the image to which the clustering region belongs has a plurality of position adjacent images, a target connected clustering region is obtained for each adjacent image, and in step S1062 the intersection ratios between the clustering region and each target connected clustering region are collected into a set to obtain the intersection ratio sequence corresponding to the clustering region.
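Sub-steps S1061 and S1062 can be sketched as follows; the per-image candidate regions and intersection ratio values are hypothetical placeholders:

```python
# For one clustering region: each position adjacent image, ordered by its
# shooting view angle, maps candidate connected clustering regions to their
# intersection ratio with the clustering region (all values hypothetical).
ious_per_adjacent_image = [
    {"b1": 0.41, "b2": 0.08},              # adjacent image 1
    {"b3": 0.62},                          # adjacent image 2, single candidate
    {"b4": 0.17, "b5": 0.55, "b6": 0.02},  # adjacent image 3
]

# S1061: in each adjacent image, keep the candidate with the largest ratio.
targets = [max(c, key=c.get) for c in ious_per_adjacent_image]

# S1062: the ratios with those targets, kept in view-angle order,
# form the intersection ratio sequence of the clustering region.
iou_sequence = [c[t] for c, t in zip(ious_per_adjacent_image, targets)]
print(targets)       # ['b1', 'b3', 'b5']
print(iou_sequence)  # [0.41, 0.62, 0.55]
```

Keeping the sequence in view-angle order matters, because the later principal-component analysis of its left and right sides assumes the elements are ordered by shooting position.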
If no crack exists, the clustering results of the derivative images at the same view angle should be similar, i.e. the clustering results of the images within one visual image pair should be similar; a crack causes an abnormally small intersection ratio, so cracks can be detected through the intersection ratio. Therefore, in some embodiments, after the intersection ratio sequence of each clustering region is obtained, the minimum value in the sequence may be used directly as the abnormality degree value of the corresponding clustering region; that is, when the minimum intersection ratio of a certain clustering region is smaller than a preset intersection ratio threshold, the region may be taken as an abnormal region under the corresponding view angle. Alternatively, for each visual image pair, the clustering region corresponding to the minimum of all intersection ratios of that pair may be taken as the abnormal region under the view angle of the pair.
It can be understood that, as the shooting direction moves gradually across adjacent view angles, the crack region of the high-temperature forging also changes gradually: the defect is most obvious in the derivative image of one particular shooting direction and less obvious in the others. The direction in which the defect is most obvious therefore indicates the strongest defect tendency, and in the embodiment of the application the abnormality degree value of the clustering region can be calculated based on this property.
Specifically, in step S1063, the abnormality degree value of the clustering region may be calculated by the formula F = (1 − min(G)) × max(Pl, Pr); the logic of the formula is:
abnormality degree value = (1 − minimum value of the intersection ratio sequence) × the larger of the two side-tilt measures.
The smaller the minimum intersection ratio, the more it indicates that the region does not appear under the other view angles, so a region that appears only under this view angle has a high probability of being a defect. Because the view angles of the intersection ratio sequence shift gradually away from the view angle direction in which the defect is most pronounced, the sequence shows a tilting tendency on each side; if the larger of the two tendencies is small, the probability of a defect is small, and therefore the larger tendency value is used in the formula.
wherein F represents the abnormality degree value of the clustering region, G represents the intersection ratio sequence of the clustering region, and min(G) is the minimum value in the intersection ratio sequence; the smaller this minimum value, the greater the probability that a defect is detected on the derivative image or two-dimensional image of the corresponding shooting direction, and the greater the probability that it is a crack defect. l denotes the first maximum principal component direction of the left-side forming sequence of the intersection ratio sequence, and Pl denotes the degree to which the first maximum principal component direction approaches 90°; the closer it is to 90°, the greater the probability that the corresponding region is a defect region. r denotes the second maximum principal component direction of the right-side forming sequence of the intersection ratio sequence, and Pr denotes the degree to which the second maximum principal component direction approaches −90°; the closer it is to −90°, the greater the probability that the corresponding region is a defect region.
The left-side forming sequence is the sequence formed by the elements to the left of the maximum intersection ratio in the intersection ratio sequence, and the right-side forming sequence is the sequence formed by the elements to its right. The first maximum principal component direction takes the first coordinate data of the left-side forming sequence as input: the first eigenvalues corresponding to the first coordinate data are calculated by a PCA algorithm, and the direction corresponding to the largest first eigenvalue is the first maximum principal component direction. In the first coordinate data, the abscissa is the index of the corresponding element in the left-side forming sequence and the ordinate is its element value. The second maximum principal component direction is obtained in the same way from the second coordinate data of the right-side forming sequence: the second eigenvalues are calculated by a PCA algorithm, and the direction corresponding to the largest second eigenvalue is the second maximum principal component direction; here the abscissa is the index of the corresponding element in the right-side forming sequence and the ordinate is its element value. Since the images in a visual image pair correspond to three-dimensional laser scanner positions arranged in adjacent order, each intersection ratio sequence can be arranged in that order, so the left and right sides are well defined.
The maximum principal component direction represents the direction in which the projection of the data is largest, i.e. the main distribution direction of the data. Because the elements are ordered, a left-side maximum principal component direction closer to 90° represents an increasing trend of the data from the other positions toward this point, while a decreasing trend of the intersection ratios on the right side indicates that the data decrease from this position toward the others. For a crack, this is exactly the nature of the crack position: its apparent degree is largest under this view angle and the intersection ratio difference is largest here, while the other positions, affected by the view angle, show smaller effects or none at all.
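Under the definitions above, the principal-direction computation and the abnormality degree value can be sketched as follows. The sample sequence is hypothetical, and the exact form of the "degree of approach" measure mapping an angle into [0, 1] is an assumption of this sketch (the text does not specify it):

```python
import numpy as np

def principal_angle(values):
    """Angle (degrees, in (-90, 90]) of the maximum principal component
    direction of the points (index, element value), obtained via PCA."""
    pts = np.column_stack([np.arange(len(values), dtype=float),
                           np.asarray(values, dtype=float)])
    pts -= pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))  # ascending eigenvalues
    d = eigvecs[:, -1]                # eigenvector of the largest eigenvalue
    if d[0] < 0:
        d = -d                        # resolve the eigenvector sign ambiguity
    return float(np.degrees(np.arctan2(d[1], d[0])))

def proximity(angle_deg, target_deg):
    """Hypothetical closeness measure in [0, 1]; 1 when angle == target."""
    return max(0.0, 1.0 - abs(target_deg - angle_deg) / 90.0)

# Hypothetical intersection ratio sequence, peaking at index 2
g = [0.30, 0.55, 0.90, 0.50, 0.20]
peak = int(np.argmax(g))
left, right = g[:peak], g[peak + 1:]   # split at the maximum intersection ratio

pl = proximity(principal_angle(left), 90.0)    # left side: rising trend
pr = proximity(principal_angle(right), -90.0)  # right side: falling trend
f = (1.0 - min(g)) * max(pl, pr)               # abnormality degree value
print(round(f, 3))
```

Because the x-axis carries plain indices while the y-axis carries ratios in [0, 1], the fitted angles are usually far from ±90°; any monotone rescaling of the proximity measure would preserve the ordering the method relies on.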
In step S107 of the embodiment of the present application, the abnormal region corresponding to the high temperature forging under each view angle may be determined by any one of the following methods:
mode one: threshold determination method
When the abnormal degree value of a certain clustering region in the image is larger than a preset degree threshold, the clustering region is used as an abnormal region of the high-temperature forging piece under the view angle corresponding to the image.
The preset degree threshold value can be flexibly set by a developer according to actual conditions, for example, can be set to be 0.6, and a clustering area with an abnormality degree value larger than 0.6 is considered to be an abnormal area caused by cracks with high probability. A cluster area having an abnormality degree value of less than or equal to 0.6 is considered to be likely to be caused by a systematic error or an illumination influence of the photographing device.
Mode two: method of determining the maximum value
And taking the clustering region corresponding to the maximum value among the abnormality degree values of each visual image pair as the abnormal region under the view angle corresponding to that visual image pair.
Each visual image pair corresponds to one shooting view angle, each image in the pair has its own clustering regions, and each clustering region has its own abnormality degree value; in this method, the clustering region with the maximum abnormality degree value is taken as the abnormal region under that shooting view angle.
It will be appreciated that in other embodiments the abnormal region may also be determined in other ways. For example, when the abnormality degree value of a clustering region in an image is greater than a preset degree threshold, that clustering region is taken as a possible abnormal region of the high-temperature forging under the view angle corresponding to the image; the intersection of the possible abnormal regions of all images sharing the same view angle is then computed, and the resulting intersection region is taken as the abnormal region under that view angle.
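The intersection-based variant can be sketched on boolean region masks; the three masks below are hypothetical examples of possible abnormal regions from images sharing one view angle:

```python
import numpy as np

# Hypothetical possible-abnormal-region masks from three images that share
# the same view angle; a pixel is kept only if every image flags it.
m1 = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)
m2 = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)
m3 = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)

abnormal = np.logical_and.reduce([m1, m2, m3])
print(abnormal.astype(int))   # [[1 0 0], [0 1 0]]
```

Requiring agreement across all images of a view angle suppresses regions flagged by only one image, e.g. those caused by illumination or systematic error.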
Referring to fig. 7, step S108 in the embodiment of the present application may include the following sub-steps:
S1081: constructing a corresponding TIN triangular network structure based on the abnormal three-dimensional point cloud data of each abnormal region; each node in the TIN triangular network structure is one item of the abnormal three-dimensional point cloud data.
By processing each surface of the high-temperature forging, i.e. each view angle, the abnormal regions of all surfaces can be obtained. Meanwhile, the abnormal three-dimensional point cloud data corresponding to each abnormal region can be obtained during the dimension-reduction process, and a corresponding TIN triangular network structure is constructed on the abnormal three-dimensional point cloud data of each abnormal region by the Delaunay method.
S1082: and calculating a distance reference value of the corresponding abnormal region based on each TIN triangle network structure.
S1083: and calculating a sampling interval based on the distance reference value corresponding to each abnormal region.
In step S1082, for each TIN triangular network structure, the average of the lengths of the edges connecting pairs of abnormal three-dimensional point cloud data may be calculated, and this average may be used as the distance reference value of the corresponding abnormal region. In some embodiments, the shortest edge length of the TIN triangular network structure may instead be used as the distance reference value.
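A sketch of the distance reference value using SciPy's Delaunay triangulation as the TIN; the point set is hypothetical, and projecting the abnormal points to 2D before triangulating is an assumption of this sketch:

```python
import numpy as np
from scipy.spatial import Delaunay

def distance_reference(points_2d: np.ndarray) -> float:
    """Mean length of the unique edges of the Delaunay (TIN) triangulation."""
    tri = Delaunay(points_2d)
    edges = set()
    for s in tri.simplices:                    # each simplex is a triangle
        for i in range(3):
            edges.add(tuple(sorted((s[i], s[(i + 1) % 3]))))
    lengths = [np.linalg.norm(points_2d[a] - points_2d[b]) for a, b in edges]
    return float(np.mean(lengths))

# Hypothetical abnormal points projected to 2D: the corners of a unit square
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(distance_reference(pts))   # 4 unit edges + 1 diagonal -> (4 + sqrt(2)) / 5
```

The de-duplicated edge set matters: each interior edge is shared by two triangles and must be counted once, or the average is biased toward interior edge lengths.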
In step S1083, the minimum value among the distance reference values corresponding to the abnormal regions may be taken as the sampling interval. In this way the sampled data best preserve the characteristics of the crack defect areas, improving the accuracy and reliability of the detection result.
In step S109, the three-dimensional point cloud data from step S101 may be sampled at the sampling interval obtained by the above calculation, so that the three-dimensional point cloud data are converted into a two-dimensional depth image, and the defect area on the surface of the high-temperature forging is obtained by recognition with a VGG-Net recognition network.
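The conversion to a two-dimensional depth image can be sketched as grid rasterization at the computed sampling interval. The point cloud is hypothetical, and keeping the maximum height per cell (with heights assumed non-negative) is an assumption of this sketch; the VGG-Net recognition stage is omitted:

```python
import numpy as np

def to_depth_image(points_xyz: np.ndarray, interval: float) -> np.ndarray:
    """Sample (x, y, z) points onto a grid with cell size `interval`,
    keeping the largest height z per cell; empty cells stay at 0
    (heights are assumed non-negative)."""
    xy = points_xyz[:, :2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / interval).astype(int)
    depth = np.zeros(idx.max(axis=0) + 1, dtype=float)
    # Unbuffered per-cell maximum: handles several points in one cell
    np.maximum.at(depth, (idx[:, 0], idx[:, 1]), points_xyz[:, 2])
    return depth

# Hypothetical cloud: three points sampled at interval 0.5
cloud = np.array([[0.0, 0.0, 1.0], [0.4, 0.0, 2.0], [1.0, 0.0, 3.0]])
img = to_depth_image(cloud, 0.5)
print(img.ravel().tolist())   # cells along x: [2.0, 0.0, 3.0]
```

`np.maximum.at` is used instead of plain indexing because buffered fancy-index assignment would keep only the last point falling in a cell, not the highest one.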
According to the real-time detection method for surface defects of high-temperature forgings based on three-dimensional point cloud data described above, multi-view images can be derived from the two-dimensional images at different positions, and the sampling interval is determined by comparison within the visual image pairs. Because each visual image pair comprises a plurality of images of the same view, errors caused by differing views are avoided; meanwhile, the derivative images contain the information of the images at their corresponding positions, so crack defects that exist under certain view angles but are indistinct under others can still be detected accurately. This ensures the rationality and reliability of the finally obtained sampling interval and improves the detection precision.
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution order of the steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in the above flowcharts may include a plurality of sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and the execution order of these sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with at least a part of the sub-steps or stages of other steps.
It should be noted that the illustrations provided in the present embodiment merely illustrate the basic concept of the application; only the components related to the application are shown in the drawings, which are not drawn according to the number, shape and size of the components in actual implementation, where the form, number and proportion of the components may be changed arbitrarily and the layout may be more complex. The structures, proportions, sizes, etc. shown in the drawings are for illustration purposes only and are not intended to limit the scope of the application, which is defined by the claims. Likewise, terms such as "upper," "lower," "left," "right," "middle," and "a" recited in the specification are merely for descriptive clarity and are not intended to limit the scope of the application; changes or modifications of their relative positions, without material alteration of the technical content, are also considered within the scope in which the application can be practiced.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. The method for detecting the surface defects of the high-temperature forging piece in real time based on the three-dimensional point cloud data is characterized by comprising the following steps of:
acquiring three-dimensional point cloud data of the high-temperature forging under a plurality of view angles;
converting the three-dimensional point cloud data corresponding to each view angle into a two-dimensional image;
respectively deriving and generating images of a plurality of other visual angles for each two-dimensional image to obtain a plurality of visual image pairs; each of said visual image pairs comprising a plurality of images of the same perspective derived from different ones of said two-dimensional images;
clustering each image in the visual image pair to obtain a clustering area of each image;
determining, for each clustering region, a position adjacent image of the image to which the clustering region belongs, determining a connected clustering region having an intersection with the clustering region from the position adjacent image, and calculating the intersection ratio between the clustering region and the connected clustering region; the position adjacent image of an image is the image, in the visual image pair to which the image belongs, whose corresponding shooting view angle is adjacent to the shooting view angle corresponding to the image;
determining an abnormality degree value of each clustering region based on the intersection ratio corresponding to each clustering region;
determining an abnormal region corresponding to the high-temperature forging under each view angle according to the abnormal degree value of each clustering region;
determining abnormal three-dimensional point cloud data corresponding to the abnormal region from the three-dimensional point cloud data, and determining a sampling interval based on the abnormal three-dimensional point cloud data;
and sampling the three-dimensional point cloud data according to the sampling interval, and determining a defect area on the surface of the high-temperature forging piece according to the sampled data.
2. The method for detecting the surface defects of the high-temperature forging based on the three-dimensional point cloud data in real time according to claim 1, wherein the step of obtaining the three-dimensional point cloud data of the high-temperature forging under a plurality of view angles comprises the following steps:
and acquiring three-dimensional point cloud data of the high-temperature forging through the shooting view angles of the rotary three-dimensional laser scanner, so as to obtain three-dimensional point cloud data under a plurality of view angles.
3. The method for detecting surface defects of high-temperature forgings in real time based on three-dimensional point cloud data according to claim 1, wherein the clustering of the images in the visual image pair to obtain a clustering area of each image comprises the following steps:
and clustering each image in the visual image pair through a density clustering algorithm to obtain a clustering region of each image.
4. The method for detecting surface defects of high-temperature forgings based on three-dimensional point cloud data according to claim 3, wherein a plurality of position adjacent images exist for the image to which each of the clustering regions belongs, and the determining the abnormality degree value of each of the clustering regions based on the intersection ratio corresponding to each of the clustering regions comprises:
for each clustering region, determining a target connected clustering region corresponding to the clustering region from each position adjacent image according to the intersection ratio between the clustering region and each connected clustering region in the position adjacent image;
taking a set formed by the intersection ratios between the clustering region and the plurality of target connected clustering regions as an intersection ratio sequence corresponding to the clustering region, the intersection ratios in the intersection ratio sequence being arranged in order of their corresponding shooting view angles;
and calculating the abnormality degree value of the clustering region according to the minimum value in the intersection ratio sequence corresponding to the clustering region.
5. The method for detecting surface defects of high-temperature forgings based on three-dimensional point cloud data according to claim 4, wherein the determining, for each clustering region, a target connected clustering region corresponding to the clustering region from each position adjacent image according to the intersection ratio between the clustering region and each connected clustering region in the position adjacent image comprises:
taking the connected clustering region having the largest intersection ratio with the clustering region in the position adjacent image as the target connected clustering region corresponding to the clustering region in the position adjacent image.
6. The method for real-time detection of surface defects of high-temperature forgings based on three-dimensional point cloud data as set forth in claim 4, wherein the calculating the abnormality degree value of the clustering region according to the minimum value in the intersection ratio sequence corresponding to the clustering region comprises:
calculating the abnormality degree value of the clustering region according to the formula F = (1 − min(G)) × max(Pl, Pr);
wherein F represents the abnormality degree value of the clustering region, G represents the intersection ratio sequence of the clustering region, min(G) represents the minimum value in the intersection ratio sequence, l represents the first maximum principal component direction of the left-side forming sequence of the intersection ratio sequence, Pl represents the degree to which the first maximum principal component direction approaches 90°, r represents the second maximum principal component direction of the right-side forming sequence of the intersection ratio sequence, and Pr represents the degree to which the second maximum principal component direction approaches −90°;
the left-side forming sequence is the sequence formed by the elements to the left of the maximum intersection ratio in the intersection ratio sequence, and the right-side forming sequence is the sequence formed by the elements to the right of the maximum intersection ratio in the intersection ratio sequence;
the first maximum principal component direction takes first coordinate data of the left-side forming sequence as input: first eigenvalues corresponding to the first coordinate data are calculated by a PCA algorithm, and the direction corresponding to the largest first eigenvalue is the first maximum principal component direction; the abscissa of the first coordinate data is the index of the corresponding element in the left-side forming sequence, and the ordinate is the element value of the corresponding element;
the second maximum principal component direction takes second coordinate data of the right-side forming sequence as input: second eigenvalues corresponding to the second coordinate data are calculated by a PCA algorithm, and the direction corresponding to the largest second eigenvalue is the second maximum principal component direction; the abscissa of the second coordinate data is the index of the corresponding element in the right-side forming sequence, and the ordinate is the element value of the corresponding element.
7. The method for detecting surface defects of high-temperature forgings based on three-dimensional point cloud data in real time according to any one of claims 1 to 6, wherein determining the abnormal region corresponding to the high-temperature forgings under each view angle according to the abnormal degree value of each clustered region comprises:
when the abnormal degree value of a certain clustering region in the image is larger than a preset degree threshold, the clustering region is used as an abnormal region of the high-temperature forging piece under the view angle corresponding to the image;
or,
and taking a clustering area corresponding to the minimum value in the abnormality degree values corresponding to the visual image pairs as an abnormal area under the visual angle corresponding to the visual image pairs.
8. The method for real-time detection of surface defects of high-temperature forgings based on three-dimensional point cloud data according to any one of claims 1 to 6, wherein determining abnormal three-dimensional point cloud data corresponding to the abnormal region from the three-dimensional point cloud data and determining a sampling interval based on the abnormal three-dimensional point cloud data comprises:
constructing a corresponding TIN triangular network structure based on the abnormal three-dimensional point cloud data of each abnormal region; each node in the TIN triangle network structure is the abnormal three-dimensional point cloud data;
calculating a distance reference value of the corresponding abnormal region based on each TIN triangular network structure;
and calculating a sampling interval based on the distance reference value corresponding to each abnormal region.
9. The method for detecting surface defects of high-temperature forgings in real time based on three-dimensional point cloud data according to claim 8, wherein the calculating distance reference values of the corresponding abnormal areas based on the TIN triangulation network structures comprises the following steps:
and respectively calculating the average value of the side lengths of each TIN triangular net structure, and taking the average value as a distance reference value of the corresponding abnormal region.
10. The method for real-time detection of surface defects of high-temperature forgings based on three-dimensional point cloud data as set forth in claim 8, wherein the calculating a sampling interval based on the distance reference value corresponding to each abnormal region comprises:
and taking the minimum value in the distance reference values corresponding to the abnormal areas as a sampling interval.
CN202310926708.9A 2023-07-27 2023-07-27 Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data Active CN116645370B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310926708.9A CN116645370B (en) 2023-07-27 2023-07-27 Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data

Publications (2)

Publication Number Publication Date
CN116645370A true CN116645370A (en) 2023-08-25
CN116645370B CN116645370B (en) 2023-10-10

Family

ID=87625161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310926708.9A Active CN116645370B (en) 2023-07-27 2023-07-27 Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data

Country Status (1)

Country Link
CN (1) CN116645370B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102003938A (en) * 2010-10-11 2011-04-06 中国人民解放军信息工程大学 Thermal state on-site detection method for large high-temperature forging
KR20150128300A (en) * 2014-05-09 2015-11-18 한국건설기술연구원 method of making three dimension model and defect analysis using camera and laser scanning
CN109658398A (en) * 2018-12-12 2019-04-19 华中科技大学 A kind of surface defects of parts identification and appraisal procedure based on three-dimensional measurement point cloud
CN112326673A (en) * 2020-11-13 2021-02-05 南京航空航天大学 Injection molding surface defect detection method and device based on machine vision
US11150200B1 (en) * 2020-06-15 2021-10-19 Mitutoyo Corporation Workpiece inspection and defect detection system indicating number of defect images for training
WO2022012796A1 (en) * 2020-07-16 2022-01-20 Siemens Aktiengesellschaft Computer-implemented, adapted anomaly detection method for powder-bed-based additive manufacturing
CN114529679A (en) * 2022-04-19 2022-05-24 清华大学 Method and device for generating computed holographic field based on nerve radiation field
CN115546153A (en) * 2022-10-11 2022-12-30 苏州鼎纳自动化技术有限公司 Product surface defect detection method based on three-dimensional point cloud and two-dimensional image fusion perception
CN115641553A (en) * 2022-12-26 2023-01-24 太原理工大学 Online detection device and method for invaders in heading machine working environment
CN115861274A (en) * 2022-12-25 2023-03-28 北京工业大学 Crack detection method integrating three-dimensional point cloud and two-dimensional image
CN116168351A (en) * 2023-04-26 2023-05-26 佰聆数据股份有限公司 Inspection method and device for power equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JIALE LI ET AL.: "From Voxel to Point: IoU-Guided 3D Object Detection for Point Cloud with Voxel-to-Point Decoder", arXiv, pages 1-10 *
HE RUIHAN ET AL.: "Point Cloud Classification Based on Optimized Multi-View Image Acquisition", Computer Technology and Application, vol. 47, no. 10, pages 82-85 *
ZHOU ZHUO ET AL.: "Asphalt Pavement Crack Identification Method Based on Three-Dimensional Point Cloud Data", Transportation Science & Technology, pages 9-14 *

Also Published As

Publication number Publication date
CN116645370B (en) 2023-10-10

Similar Documents

Publication Publication Date Title
CN108010116B (en) Point cloud feature point detection method and point cloud feature extraction method
EP3618980B1 (en) Methods and systems for improved quality inspection of products using a robot
TWI644099B (en) Pattern measuring device and semiconductor measuring system
Tazir et al. CICP: Cluster Iterative Closest Point for sparse–dense point cloud registration
US7804586B2 (en) Method and system for image processing for profiling with uncoded structured light
US8121415B2 (en) Combining feature boundaries
CA2507174A1 (en) Method of registering and aligning multiple images
US20130058526A1 (en) Device for automated detection of feature for calibration and method thereof
JP5123244B2 (en) Shape defect inspection device, shape modeling device, and shape defect inspection program
CN114972575A (en) Linear fitting algorithm based on contour edge
CN107274446B (en) Method for identifying sharp geometric edge points by using normal consistency
CN113728360A (en) Method and apparatus for pose, size and shape measurement of objects in 3D scene
Zhao et al. Vision-based adaptive stereo measurement of pins on multi-type electrical connectors
CN113962967B (en) Object shot image ellipse detection algorithm based on Markuling theorem constraint
Zhao et al. Defect detection of 3D printing surface based on geometric local domain features
CN116645370B (en) Real-time detection method for surface defects of high-temperature forging based on three-dimensional point cloud data
JP6019924B2 (en) Matching support apparatus, matching support method, and program thereof
JP3460824B2 (en) Shape recognition method and device
TWI807346B (en) Dimension measuring device, semiconductor manufacturing device and semiconductor device manufacturing system
US20220358321A1 (en) Detection Method And Device For Assembly Body Multi-View Change Based On Feature Matching
CN115049687B (en) Point cloud extraction method, device, equipment and medium
WO2008000055A1 (en) Methods for simultaneous multi-set point matching
CN116740101B (en) Plane segmentation method for point cloud object
Tian et al. Automatically pick fiducial markers in electron tomography tilt images
WO2021240610A1 (en) Pattern inspection/measurement device, and pattern inspection/ measurement program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant