CN116277973B - 3D printing detection system - Google Patents


Info

Publication number
CN116277973B
Authority
CN
China
Prior art keywords: pixel points, pixel point, distance, actual, target pixel
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310192167.1A
Other languages
Chinese (zh)
Other versions
CN116277973A
Inventor
辛志远
Current Assignee
Henan University
Original Assignee
Henan University
Application filed by Henan University
Priority to CN202310192167.1A
Publication of CN116277973A
Application granted
Publication of CN116277973B
Legal status: Active


Classifications

    • B29C64/386: Data acquisition or data processing for additive manufacturing (B29C64/00 Additive manufacturing; B29C64/30 Auxiliary operations or equipment)
    • B29C64/393: Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • B33Y50/00: Data acquisition or data processing for additive manufacturing
    • B33Y50/02: Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • Y02P10/25: Process efficiency (technologies related to metal processing)

Landscapes

  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of 3D printing, in particular to a 3D printing detection system. The system comprises an image processing module, a data processing module and a detection module, which are used to acquire a corrected edge image containing actual pixel points and target pixel points. Corner detection is performed on the corrected edge image; each corner point whose position is a target pixel point divides the edge line corresponding to the target pixel points into short edge lines. The distance difference value of each actual pixel point is calculated, and the actual pixel points corresponding to the target pixel points on each short edge line are clustered according to the distance difference value to obtain clusters. The path deviation degree of each short edge line is calculated on the basis of the number of clusters, the deviation degree is obtained from it, and the printing precision is judged according to the deviation degree. The invention can thereby obtain an accurate judgment of the printing precision and realizes the detection of the printing precision.

Description

3D printing detection system
Technical Field
The invention relates to the field of 3D printing, in particular to a 3D printing detection system.
Background
3D printing technology is widely applied in mechanical manufacturing, construction, medicine, aerospace and other fields. According to the printing principle, it is mainly divided into fused deposition modeling, photo-curing forming, three-dimensional powder bonding, selective laser sintering and the like. The fused deposition modeling process has the advantages of low equipment cost, high material utilization, a short development cycle and a wide choice of materials, but its forming precision is low, and problems such as dislocation, overflow and missing filament easily occur; machine vision is therefore usually adopted to check the printing precision of this process.
In the prior art, printing precision is checked directly from the distance difference between the positions of the printed actual contour and the preset model contour: the larger the distance difference, the lower the printing precision, and the smaller the distance difference, the higher the printing precision. However, this checking method does not consider whether the deviation between the actual contour and the preset model contour lies inside or outside the printed object, and assigns the same analysis weight to deviations inside and outside the printed object.
Disclosure of Invention
In order to solve the technical problem that the prior art does not consider whether the deviation between the actual contour and the preset model contour lies inside or outside the printed object, assigns the same analysis weight to deviations at different positions, and therefore cannot accurately determine the printing precision, the invention provides a 3D printing detection system, which adopts the following technical scheme:
The image processing module is used for acquiring an edge image corresponding to the current layer of the printed object while the 3D printer prints, marking the edge pixel points in the edge image as actual pixel points, marking the edge pixel points corresponding to the standard path as first target pixel points, filling edge pixel points into the edge image according to the standard path and marking them as second target pixel points, and taking the first and second target pixel points together as the target pixel points, thereby obtaining a corrected edge image;
the data processing module is used for performing corner detection on the corrected edge image; each corner point whose position is a target pixel point divides the edge line corresponding to the target pixel points into short edge lines. For any actual pixel point, the distances between all target pixel points and that point are calculated, and the distance difference value of the actual pixel point is calculated from the minimum distance and the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum distance;
The detection module is used for calculating, for any target pixel point on any short edge line, the distances from all actual pixel points to that target pixel point and acquiring the actual pixel point corresponding to the minimum distance; doing this for every target pixel point on the short edge line yields a set of actual pixel points, which are clustered according to their distance difference values to obtain clusters. Outliers among the actual pixel points are obtained from the distance difference values, and the path deviation degree of the short edge line is calculated from the number of outliers, the range and standard deviation of the distance difference values, the number of target pixel points on the short edge line, the number of clusters, and the largest number of actual pixel points contained in any cluster. The deviation degree is then obtained from the path deviation degree, and the printing precision is judged on the basis of the deviation degree.
Preferably, calculating the path deviation degree of the short edge line from the number of outliers, the range and standard deviation of the distance difference values, the number of target pixel points on the short edge line, the number of clusters, and the largest number of actual pixel points contained in any cluster includes:
Counting the number of actual pixel points contained in each cluster and obtaining the largest such count; calculating the ratio of this largest count to the number of target pixel points on the short edge line; and taking the ratio of the number of clusters to that ratio as a first characteristic value. The product of the range of the distance difference values, the standard deviation of the distance difference values and the number of outliers is recorded as a second characteristic value, and the product of the first and second characteristic values is taken as the path deviation degree of the short edge line.
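The two characteristic values and their product can be sketched in Python as follows; the function name and argument layout are illustrative assumptions, and the population standard deviation is used where the text only says "standard deviation":

```python
import statistics

def path_deviation_degree(distance_diffs, cluster_sizes, n_outliers, n_targets):
    """Path deviation degree of one short edge line (sketch).

    distance_diffs : distance difference values of the matched actual pixel points
    cluster_sizes  : number of actual pixel points in each cluster
    n_outliers     : number of outliers among the actual pixel points
    n_targets      : number of target pixel points on the short edge line
    """
    # First characteristic value: number of clusters divided by the ratio
    # (largest cluster size / number of target pixel points on the line).
    ratio = max(cluster_sizes) / n_targets
    first = len(cluster_sizes) / ratio
    # Second characteristic value: range * standard deviation * outlier count.
    value_range = max(distance_diffs) - min(distance_diffs)
    std = statistics.pstdev(distance_diffs)
    second = value_range * std * n_outliers
    # Path deviation degree: product of the two characteristic values.
    return first * second
```

With no outliers the second characteristic value, and hence the path deviation degree, is zero, so a short edge line whose matched points all behave regularly contributes no deviation.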
Preferably, before calculating the distance difference value of the actual pixel point, the method further includes: judging whether the actual pixel point is in an area formed by all target pixel points, and if the actual pixel point is in the area, marking the actual pixel point as a first type pixel point; and if the actual pixel point is outside the area, marking the actual pixel point as a second type pixel point.
Preferably, calculating the distance difference value of the actual pixel point according to the minimum distance and the number of target pixel points on the short edge line where the target pixel point corresponding to the minimum distance is located includes:
when the actual pixel point is a first-type pixel point, the distance used is the Manhattan distance; the ratio of the minimum Manhattan distance to the number of target pixel points on the short edge line containing the corresponding target pixel point is recorded as a first ratio, and the product of the minimum Manhattan distance, the first ratio and an adjustment coefficient gives the distance difference value of the actual pixel point;
when the actual pixel point is a second-type pixel point, the distance used is the Euclidean distance; the ratio of the minimum Euclidean distance to the number of target pixel points on the short edge line containing the corresponding target pixel point is recorded as a second ratio, the product of the minimum Euclidean distance, the second ratio and the adjustment coefficient is calculated, and the square root of this product is taken as the distance difference value of the actual pixel point; the adjustment coefficient is greater than 0.
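The two cases can be sketched in Python as follows. The outside branch takes the square root of the product, reading the translated "squaring" as root extraction, which matches the higher tolerance for outward deviation described later; the function name and the default a1 = 100 (the embodiment's value) are otherwise illustrative:

```python
import math

def distance_difference(min_dist, n_targets, inside, a1=100.0):
    """Distance difference value of one actual pixel point (sketch).

    min_dist  : minimum distance to the target pixel points (Manhattan
                distance for an inside point, Euclidean for an outside point)
    n_targets : number of target pixel points on the short edge line that
                contains the nearest target pixel point
    inside    : True for a first-type (inside-the-region) pixel point
    a1        : adjustment coefficient, greater than 0
    """
    # Product of the minimum distance, the ratio (first or second ratio)
    # and the adjustment coefficient.
    product = min_dist * (min_dist / n_targets) * a1
    if inside:
        # First-type point: missing material cannot be polished back in,
        # so the product is used directly (low tolerance).
        return product
    # Second-type point: protruding material can be ground off; taking the
    # root (assumption, see lead-in) lowers the value (high tolerance).
    return math.sqrt(product)
```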
Preferably, the method for obtaining outliers among the actual pixel points from the distance difference values includes: sorting the distance difference values of the actual pixel points in a preset order to obtain a distance difference value sequence, using the MAD (median absolute deviation) algorithm to identify the distance difference values to be removed from the sequence, and taking the actual pixel points corresponding to the removed values as outliers.
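A minimal MAD-based outlier screen might look as follows; the cut-off k = 3 is a common but hypothetical choice, since the text does not fix the rejection rule:

```python
import statistics

def mad_outliers(values, k=3.0):
    """Return the values removed by a MAD screen (sketch).

    A value is removed when its absolute deviation from the median exceeds
    k times the median absolute deviation (MAD) of the sequence.
    """
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    if mad == 0:
        # Degenerate sequence: every value off the median is removed.
        return [v for v in values if v != med]
    return [v for v in values if abs(v - med) / mad > k]
```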
Preferably, the method for obtaining the deviation degree from the path deviation degree is as follows: the normalized value of the path deviation degree is taken as the deviation degree.
Preferably, the method for clustering the obtained actual pixel points according to the distance difference value to obtain the clusters includes: clustering the obtained actual pixel points with the DBSCAN algorithm, using the distance difference value of each actual pixel point as its feature value, to obtain the clusters.
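Because each point is described by a single scalar (its distance difference value), the clustering reduces to DBSCAN in one dimension, which can be sketched without any library; eps and min_pts are hypothetical parameters that the text leaves to the implementer:

```python
def dbscan_1d(values, eps=1.0, min_pts=2):
    """Minimal 1-D DBSCAN over distance difference values (sketch).

    Returns one label per value: 0, 1, ... for clusters, -1 for noise.
    """
    n = len(values)
    labels = [None] * n          # None = unvisited
    cluster = -1

    def neighbours(i):
        # All points within eps of point i (including i itself).
        return [j for j in range(n) if abs(values[j] - values[i]) <= eps]

    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1       # noise; may become a border point later
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border point, do not expand
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:       # core point: expand the cluster
                queue.extend(k for k in nb if labels[k] is None)
    return labels
```

The cluster sizes and noise count read off this labelling feed directly into the path deviation degree described earlier.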
Preferably, the method for judging the printing precision on the basis of the deviation degree includes: arranging the deviation degrees of the short edge lines in ascending order and calculating the difference between each pair of adjacent deviation degrees; if no difference is greater than or equal to a threshold value, the printing precision is judged to be qualified; if any difference is greater than or equal to the threshold value, the printing precision is judged to be unqualified.
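The qualification rule amounts to checking the gaps between adjacent sorted deviation degrees against the threshold; a sketch (names and the threshold value are illustrative):

```python
def print_quality_ok(deviations, threshold):
    """True (qualified) if no gap between adjacent deviation degrees,
    taken in ascending order, reaches the threshold."""
    ordered = sorted(deviations)
    return all(b - a < threshold for a, b in zip(ordered, ordered[1:]))
```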
The embodiment of the invention has at least the following beneficial effects:
According to the invention, the edge image corresponding to the current layer of the printed object is acquired while the 3D printer prints, and edge pixel points are then filled into the edge image according to the standard path to obtain the corrected edge image. The corrected edge image contains target pixel points and actual pixel points; corner detection is performed on it, and each corner point whose position is a target pixel point divides the edge line corresponding to the target pixel points into short edge lines. For any actual pixel point, the distances between all target pixel points and that point are calculated, and its distance difference value is calculated from the minimum distance and the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum distance. The distance difference value reflects the degree to which the actual pixel point deviates from the target pixel points, and different degrees of deviation represent different printing precision at the corresponding printing position; the value therefore provides good data for the subsequent judgment of the printing precision and improves the accuracy of that judgment.
The number of target pixel points on a short edge line represents the degree of change of the standard path there: different numbers represent different degrees of change and lead to different calculated distance difference values. Since the distance difference value reflects the deviation of the actual pixel point from the target pixel points, and different deviations represent different printing precision at the corresponding printing position, the number of target pixel points on the short edge line expresses the different tolerance of the printing precision for different degrees of change, that is, different analysis weights, and so yields a more accurate judgment of the printing precision.

The invention clusters the actual pixel points corresponding to the target pixel points on each short edge line according to their distance difference values to obtain clusters, obtains the outliers, and calculates the path deviation degree of each short edge line on the basis of the number of clusters, the largest number of actual pixel points contained in any cluster, and the number of outliers; the deviation degree is obtained from the path deviation degree, and the printing precision is judged on that basis. When the printing precision differs, the corresponding numbers of outliers and clusters and the largest cluster size differ as well; combining all three in the calculation of the path deviation degree avoids the error a single factor would introduce, makes the calculated path deviation degree more accurate, allows the judgment of the printing precision to be obtained more accurately, and realizes the detection of the printing precision.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings needed in their description are briefly introduced below. The drawings described below show only some embodiments of the invention; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a block diagram of a 3D print detection system embodiment of the present invention;
FIG. 2 is an overall schematic of a 3D printer;
FIG. 3 is a partial schematic view of a 3D printer;
FIG. 4 is a schematic diagram of an edge image;
The reference numerals are: 1. feed cylinder assembly; 2. nozzle assembly; 3. power supply assembly; 4. control screen assembly; 5. X-axis assembly; 6. mainboard box assembly; 7. dual-drive Z-axis assembly; 8. Y-axis assembly; 9. gantry beam; 10. Z-axis guide rod; 11. Z-axis lead screw; 12. Z-axis coupling; 13. Z-axis motor; 14. Z-axis motor.
Detailed Description
In order to further describe the technical means adopted by the invention to achieve its intended purpose and their effects, specific embodiments, structures, features and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment, and the particular features, structures or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Referring to fig. 1, a block diagram of a 3D printing detection system according to an embodiment of the invention includes an image processing module, a data processing module, and a detection module.
The image processing module is used for acquiring the edge image corresponding to the current layer of the printed object while the 3D printer prints, marking the edge pixel points in the edge image as actual pixel points, marking the edge pixel points corresponding to the standard path as first target pixel points, filling edge pixel points into the edge image according to the standard path and marking them as second target pixel points, and taking the first and second target pixel points together as the target pixel points, thereby obtaining the corrected edge image.
When the 3D printer prints, the slicing software controls the heated extrusion head, the heated bed and the motors according to the standard path information, stacking the hot-melt material layer by layer; the next layer is printed after the material solidifies, and the layer patterns accumulate until the three-dimensional object is complete. An overall schematic of the 3D printer is shown in fig. 2, whose reference numerals are: 1. feed cylinder assembly; 2. nozzle assembly; 3. power supply assembly; 4. control screen assembly; 5. X-axis assembly; 6. mainboard box assembly; 7. dual-drive Z-axis assembly; 8. Y-axis assembly. A partial schematic of the 3D printer is shown in fig. 3, whose reference numerals are: 9. gantry beam; 10. Z-axis guide rod; 11. Z-axis lead screw; 12. Z-axis coupling; 13. Z-axis motor; 14. Z-axis motor. Before the 3D printer prints, the slicing software completes the path planning for the print, and printing follows this plan; the path information for the current layer of hot-melt consumable is extracted from the plan and recorded as the standard path. It should be noted that an implementer may acquire the standard path by other means, for example from big data.
In this embodiment, after the stacking of the current layer of material is completed, a CCD camera fixed at the side of the nozzle of the 3D printer captures a top-down image of the current layer of the printed object. It should be noted that after the nozzle finishes printing a layer of consumable, the X-axis, Y-axis and Z-axis assemblies are used to adjust the position of the CCD camera relative to the printed object so that, at every layer, the consumable is at the same distance from the camera and centred in the camera's viewfinder; this ensures that the printing precision judgment obtained for each layer in the subsequent process is not affected by the distance between consumable and camera.
Since the image obtained by the CCD camera is an RGB image, this embodiment converts it to a grayscale image using the weighted average method in order to reduce the amount of computation; the weighted average method is a known technique and is not described in detail. As an alternative embodiment, the practitioner may use the maximum value method, the component method or the like to obtain the grayscale image.
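The weighted average conversion can be sketched as follows; the BT.601 weights 0.299/0.587/0.114 are the usual choice, though the text does not state which weights the embodiment uses:

```python
def to_gray(rgb_image):
    """Weighted-average grayscale conversion (sketch).

    rgb_image is a row-major list of (R, G, B) tuples; the output is the
    corresponding grayscale image with rounded integer intensities.
    """
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]
```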
During image acquisition, the diversity and uncertainty of the environment, external illumination, human factors, interference during signal transmission and the like affect the quality of the acquired image and introduce noise. Therefore, to ensure accurate extraction of information from the image, the image must be denoised. In this embodiment a Gaussian filter is convolved with the grayscale image to denoise it and improve its precision and quality; Gaussian filter denoising is a known technique whose details are not repeated, and in practice an operator may choose another denoising method, for example median filtering. In addition, because the hot-melt consumable extruded by the nozzle is narrow and therefore susceptible to distortion that would make the analysis inaccurate, the grayscale image is calibrated with the Zhang Zhengyou camera calibration method to correct the distortion introduced by the camera; this method is likewise well known and is not described in detail.
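Gaussian filtering is separable, so the 2-D convolution can be realised by running a 1-D kernel over the rows and then over the columns; a stdlib sketch (the radius and sigma values are illustrative):

```python
import math

def gaussian_kernel(radius=1, sigma=1.0):
    """Normalised 1-D Gaussian kernel of length 2*radius + 1."""
    k = [math.exp(-(x * x) / (2 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    total = sum(k)
    return [v / total for v in k]

def smooth_row(row, kernel):
    """Convolve one image row with the kernel, clamping at the borders.
    Applying this to every row and then to every column denoises the image."""
    r = len(kernel) // 2
    n = len(row)
    return [sum(kernel[j + r] * row[min(max(i + j, 0), n - 1)]
                for j in range(-r, r + 1))
            for i in range(n)]
```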
After the grayscale image is obtained, edge detection is performed on it with the Canny operator to obtain the edge image, shown in fig. 4; the edge image is a binary image.
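As a simplified stand-in for the Canny operator (which adds non-maximum suppression and hysteresis thresholding on top of the gradient step), a Sobel gradient magnitude with a single threshold already yields a binary edge image of the kind used here; the function name and threshold are illustrative:

```python
def edge_image(gray, threshold=128):
    """Binary edge image from Sobel gradient magnitude (sketch).

    gray is a row-major list of integer intensities; pixels whose gradient
    magnitude reaches the threshold become 255, everything else 0.
    """
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel responses.
            gx = (gray[y - 1][x + 1] + 2 * gray[y][x + 1] + gray[y + 1][x + 1]
                  - gray[y - 1][x - 1] - 2 * gray[y][x - 1] - gray[y + 1][x - 1])
            gy = (gray[y + 1][x - 1] + 2 * gray[y + 1][x] + gray[y + 1][x + 1]
                  - gray[y - 1][x - 1] - 2 * gray[y - 1][x] - gray[y - 1][x + 1])
            if (gx * gx + gy * gy) ** 0.5 >= threshold:
                out[y][x] = 255
    return out
```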
Because the hot-melt consumable sprayed by the nozzle along the standard path is laid down in closely adjacent passes, the gaps between different tracks are consistent. In the edge image this characteristic appears as a set of mutually parallel straight lines, which correspond to the gaps between the tracks and are therefore uniformly spaced; in the grayscale image it appears as a set of strip-shaped regions of uniform width arranged in order, each strip corresponding to one extruded pass of hot-melt consumable.
The straight lines in the edge image are the edge lines, and together they form the actual path of the 3D printer. The edge pixel points in the edge image are marked as actual pixel points, and the edge pixel points corresponding to the standard path are marked as first target pixel points; edge pixel points are then filled into the edge image according to the standard path and marked as second target pixel points. The first and second target pixel points together are the target pixel points, and the result is the corrected edge image, which thus contains both actual pixel points and target pixel points.
It should be noted that an edge pixel point that was already present before filling can be marked as both an actual pixel point and a target pixel point. Such a pixel point lies on both the actual path and the standard path; there is therefore no deviation between the two at that point, and the printing precision there is high.
The data processing module is used for performing corner detection on the corrected edge image; each corner point whose position is a target pixel point divides the edge line corresponding to the target pixel points into short edge lines. For any actual pixel point, the distances between all target pixel points and that point are calculated, and the distance difference value of the actual pixel point is calculated from the minimum distance and the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum distance.
Specifically, various algorithms exist for corner detection on the corrected edge image, and an operator may choose one according to the actual situation. Since the corrected edge image contains both target pixel points and actual pixel points, the corners obtained by corner detection may fall on either kind of point. Because the target pixel points correspond to the standard path, only the corner points whose positions are target pixel points are analysed, and each such corner point divides the edge line corresponding to the target pixel points into short edge lines. For example, if that edge line bends repeatedly, the corner positions mark the bends, and the corners divide the bending edge line into short edge lines.
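One concrete way to realise this splitting, assuming the edge line is available as an ordered list of target pixel points, is to treat any point where the local direction turns by more than a threshold angle as a corner; the 30-degree threshold and function names are illustrative, since the text leaves the corner detector to the implementer:

```python
import math

def split_at_corners(points, angle_deg=30.0):
    """Split an ordered edge line into short edge lines at corner points.

    A point counts as a corner when the direction change between its
    incoming and outgoing segments exceeds angle_deg.
    """
    def heading(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])

    segments, current = [], [points[0]]
    for i in range(1, len(points) - 1):
        current.append(points[i])
        turn = abs(heading(points[i - 1], points[i])
                   - heading(points[i], points[i + 1]))
        turn = min(turn, 2 * math.pi - turn)   # wrap-around of the angle
        if math.degrees(turn) > angle_deg:
            segments.append(current)           # corner closes one short edge line
            current = [points[i]]              # and starts the next one
    current.append(points[-1])
    segments.append(current)
    return segments
```

The corner point itself both closes one short edge line and opens the next, so adjacent short edge lines share it, mirroring the division described above.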
For any actual pixel point, the distances between all target pixel points and that point are calculated, and the distance difference value of the actual pixel point is calculated from the minimum distance and the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum distance.
After 3D printing is finished, the forming surface of the printed three-dimensional object is ground and polished so that it becomes more regular, smooth and accurate. Consequently, where the hot-melt consumable slightly exceeds the standard path on the outside, the excess is removed by the subsequent grinding and polishing processes, so the printing precision has a higher tolerance for this condition and slight outward deviation is allowed. Where the consumable falls short inside the standard path and leaves a recess, however, the missing part cannot be filled by the subsequent grinding and polishing, so the printing precision has a lower tolerance for such absence.
On this basis, the embodiment classifies the actual pixel points before calculating their distance difference values. Specifically, for any actual pixel point, the positional relationship between the point and the region formed by all target pixel points is determined. If the point lies within the region it is marked as a first-type pixel point, written P(x, y) ∈ A, where P(x, y) is the actual pixel point at coordinates (x, y), A is the set of first-type pixel points, and ∈ denotes set membership; a first-type pixel point indicates that the actual path lies inside the standard path, where the consumable may be missing and cause a recess, a condition for which the printing precision has low tolerance. If the point lies outside the region it is marked as a second-type pixel point, written P(x, y) ∈ B, where B is the set of second-type pixel points; a second-type pixel point indicates that the actual path lies outside the standard path, where the consumable may protrude beyond it, a condition the printing precision tolerates.
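Treating the target pixel points as the ordered boundary of a closed region, the inside/outside classification is a standard point-in-polygon test; a ray-casting sketch (function and variable names are illustrative):

```python
def inside_region(point, polygon):
    """Ray-casting test: is the actual pixel point inside the closed region
    bounded by the target pixel points (given as an ordered polygon)?
    True marks a first-type pixel point, False a second-type one."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                 # edge crosses the scan line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                      # crossing to the right of point
                inside = not inside
    return inside
```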
When an actual pixel point is a first-type pixel point, the Manhattan distances between it and all target pixel points are calculated. The ratio of the minimum Manhattan distance to the number of target pixel points on the short edge line containing the target pixel point corresponding to that minimum distance is recorded as the first ratio, and the product of the minimum Manhattan distance, the first ratio, and the adjustment coefficient gives the distance difference value of the actual pixel point. The Manhattan distance formula is well known and is not described in detail.
When the actual pixel point is a first-type pixel point, its distance difference value is expressed as:

dv = a1 × d_P(x,y) × (d_P(x,y) / n1), for P(x, y) ∈ A

where dv denotes the distance difference value of the actual pixel point when it is a first-type pixel point; d_P(x,y) denotes the minimum Manhattan distance between all target pixel points and the actual pixel point P at coordinates (x, y); n1 denotes the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum Manhattan distance; a1 denotes the adjustment coefficient, which is greater than 0 and scales the distance difference value; its value is 100 in this embodiment and may be adjusted by the operator according to actual conditions; A denotes the set of first-type pixel points and ∈ denotes set membership; d_P(x,y) / n1 is the first ratio.
The distance difference value reflects the degree to which the actual pixel point deviates from the target pixel points: the larger the deviation, the larger the distance difference value and the worse the printing precision at the corresponding printing position; conversely, the smaller the deviation, the smaller the distance difference value and the higher the printing precision.
d_P(x,y) reflects the difference between the actual path and the standard path: the actual pixel point corresponds to the actual path, and the target pixel point corresponding to the minimum Manhattan distance corresponds to the standard path. The larger this difference, the lower the printing precision; the smaller the difference, the higher the precision.
n1 denotes the number of target pixel points on the corresponding short edge line, which reflects how the standard path changes at that position. A larger number of target pixel points indicates that the path there changes little and slowly; printing precision is more tolerant of slowly changing positions, i.e., a fine deviation there has little effect on the overall accuracy of the finished three-dimensional object. A smaller number of target pixel points indicates that the path there changes sharply; printing precision is less tolerant of sharply changing positions, i.e., a fine deviation there has a larger effect on overall accuracy, so deviations at such positions deserve more attention. From this analysis, the number of target pixel points is negatively correlated with the distance difference value, since the distance difference value reflects the printing precision.
The first ratio d_P(x,y) / n1 characterizes the relative magnitude of the minimum Manhattan distance for the actual pixel point: the smaller this relative magnitude, the higher the printing precision at the corresponding position; the larger it is, the lower the precision.
When an actual pixel point is a second-type pixel point, the Euclidean distances between it and all target pixel points are calculated. The ratio of the minimum Euclidean distance to the number of target pixel points on the short edge line containing the target pixel point corresponding to that minimum distance is recorded as the second ratio. The product of the minimum Euclidean distance, the second ratio, and the adjustment coefficient is calculated, and the square root of this product is taken as the distance difference value of the actual pixel point.
When the actual pixel point is a second-type pixel point, its distance difference value is expressed as:

dv = sqrt( a1 × d'_P(x,y) × (d'_P(x,y) / n1') ), for P(x, y) ∈ B

where dv denotes the distance difference value of the actual pixel point when it is a second-type pixel point; d'_P(x,y) denotes the minimum Euclidean distance between all target pixel points and the actual pixel point P at coordinates (x, y); n1' denotes the number of target pixel points on the short edge line containing the target pixel point corresponding to the minimum Euclidean distance; a1 denotes the adjustment coefficient, greater than 0, used to scale the distance difference value; its value is 100 in this embodiment and may be adjusted by the operator according to actual conditions; B denotes the set of second-type pixel points and ∈ denotes set membership; d'_P(x,y) / n1' is the second ratio.
When the actual pixel point is a second-type pixel point, the interpretation of its distance difference value and of each parameter in the calculation is the same as the corresponding interpretation for a first-type pixel point and is not repeated.
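The two distance difference formulas can be sketched together as below. This is an illustrative reading, not the original code: it assumes the adjustment coefficient a1 = 100 from the embodiment, reads the operation on the second-type product as a square root (which keeps the penalty for the more tolerated second type smaller), and the name `distance_difference` is hypothetical.

```python
import math

def distance_difference(d_min, n1, pixel_class, a1=100.0):
    """Distance difference value dv for one actual pixel point.
    d_min: minimum distance to the target pixels (Manhattan for a
    first-type pixel 'A', Euclidean for a second-type pixel 'B');
    n1: number of target pixels on the matched short edge line;
    a1: adjustment coefficient (> 0, 100 in the embodiment)."""
    product = a1 * d_min * (d_min / n1)  # distance x ratio x coefficient
    if pixel_class == "A":               # first type: lower tolerance
        return product
    return math.sqrt(product)            # second type: softened penalty

print(distance_difference(3.0, 12, "A"))  # 75.0
print(distance_difference(3.0, 12, "B"))  # sqrt(75) ~ 8.66
```

Note how, for equal raw distances, the first-type value is much larger, matching the text's statement that the printing precision tolerates first-type deviations less.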
Step 1 of this embodiment notes that an edge pixel point in the unfilled edge image may be marked as both an actual pixel point and a target pixel point. Such an edge pixel point belongs to both the actual path and the standard path, meaning there is no deviation between the two paths at that point and the printing precision there is high; the distance difference value of an edge pixel point marked as both an actual pixel point and a target pixel point is therefore 0.
It should be noted that this embodiment selects different distance measures according to the type of the actual pixel point because the printing precision tolerates the two types differently: the Euclidean distance is used between second-type pixel points (higher tolerance) and their corresponding target pixel points, and the Manhattan distance between first-type pixel points (lower tolerance) and theirs. Since the Manhattan distance between two points is always greater than or equal to the Euclidean distance between them, selecting the distance measure by pixel type assigns the larger raw value to the less tolerated type. This grants different tolerances to different pixel types, provides good data for the subsequent printing-precision judgment, and improves the accuracy of the judgment result.
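The inequality underpinning this choice, namely that the Manhattan distance never falls below the Euclidean distance between the same pair of points, can be checked numerically (a quick illustrative check, not part of the patent):

```python
import math
import random

# For any two points, |dx| + |dy| >= sqrt(dx^2 + dy^2).
# Spot-check the inequality on random integer pixel coordinates.
random.seed(0)
for _ in range(1000):
    p = (random.randint(0, 100), random.randint(0, 100))
    q = (random.randint(0, 100), random.randint(0, 100))
    manhattan = abs(p[0] - q[0]) + abs(p[1] - q[1])
    euclidean = math.hypot(p[0] - q[0], p[1] - q[1])
    assert manhattan >= euclidean
print("Manhattan >= Euclidean held for all 1000 samples")
```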
The detection module calculates, for any target pixel point on any short edge line, the distances between all actual pixel points and that target pixel point and obtains the actual pixel point corresponding to the minimum distance; doing this for all target pixel points on the short edge line yields a set of actual pixel points, which are clustered according to their distance difference values to obtain the clusters. Outliers among the actual pixel points are obtained from the distance difference values, and the path deviation degree of the short edge line is calculated from the number of outliers, the range and standard deviation of the distance difference values, the number of target pixel points on the short edge line, the number of clusters, and the maximum number of actual pixel points contained in a cluster. The deviation degree is then obtained from the path deviation degree, and the printing precision is judged based on the deviation degree.
Preferably, for any target pixel point on any short edge line, the distances between all actual pixel points and that target pixel point are computed with the measure matching each actual pixel point's type: the Manhattan distance when the actual pixel point is a first-type pixel point, and the Euclidean distance when it is a second-type pixel point. The actual pixel point corresponding to the minimum distance is obtained for each target pixel point on the short edge line, and the actual pixel points so obtained are clustered according to their distance difference values.
Specifically, the obtained actual pixel points are clustered with the DBSCAN algorithm, taking each actual pixel point's distance difference value as its feature value, with a clustering radius of 2 and a minimum point count of 4, to obtain the clusters; in a specific operation the implementer may adjust the clustering radius and minimum point count according to the actual situation. The number of clusters obtained is denoted n2, and the numbers of actual pixel points contained in the clusters are denoted c_1, c_2, …, c_n2, where c_1 is the number of actual pixel points in the 1st cluster, c_2 in the 2nd cluster, and c_n2 in the n2-th cluster. The DBSCAN algorithm is well known and its clustering process is not repeated; the implementer may also choose another algorithm, such as k-means, to cluster the actual pixel points.
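Since the feature here is a single scalar per pixel (the distance difference value), the clustering step can be sketched as a minimal one-dimensional DBSCAN with the embodiment's parameters (radius 2, minimum point count 4). The embodiment uses the standard algorithm; `dbscan_1d` below is a hypothetical self-contained helper, not the patented code.

```python
def dbscan_1d(values, eps=2.0, min_pts=4):
    """Minimal 1-D DBSCAN over distance difference values.
    Returns one label per value: cluster ids 0, 1, ... or -1 for noise.
    eps / min_pts default to the embodiment's radius 2 and point count 4."""
    n = len(values)
    labels = [None] * n
    # Precompute eps-neighborhoods (each point is its own neighbor).
    neighbors = [
        [j for j in range(n) if abs(values[i] - values[j]) <= eps]
        for i in range(n)
    ]
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        if len(neighbors[i]) < min_pts:
            labels[i] = -1              # provisionally noise
            continue
        cluster += 1                    # start a new cluster at a core point
        labels[i] = cluster
        seeds = list(neighbors[i])
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors[j]) >= min_pts:
                seeds.extend(neighbors[j])  # expand only through core points
    return labels

# Five consistent values form one cluster; the stray value is noise.
print(dbscan_1d([1.0, 1.1, 1.2, 0.9, 1.05, 50.0]))  # [0, 0, 0, 0, 0, -1]
```

With consistent distance difference values, a single large cluster emerges (large max{c_1, …, c_n2}, small n2), which is exactly the regime the path deviation formula later rewards.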
Outliers among the actual pixel points are obtained from the distance difference values as follows: the distance difference values of the actual pixel points are sorted in a manually set order to obtain a distance difference value sequence, the values to be removed from the sequence are identified with the MAD algorithm, and the actual pixel points corresponding to the removed values are taken as outliers. In this embodiment the values are sorted from large to small; as another embodiment, the practitioner may sort them from small to large. The MAD algorithm is well known and its detailed process is not repeated; its general idea is to judge whether each element is an outlier by checking whether its deviation from the median lies within a reasonable range. In this embodiment, distance difference values outside the reasonable range are removed by the MAD algorithm, with the range set by the practitioner according to the actual situation.
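The MAD (median absolute deviation) idea can be sketched as below. The cutoff `k` stands in for the practitioner-chosen "reasonable range" and is an assumption (k = 3 is a common default), and `mad_outliers` is a hypothetical helper name.

```python
import statistics

def mad_outliers(dvs, k=3.0):
    """Return indices of distance difference values whose deviation from
    the median exceeds k times the median absolute deviation (MAD).
    A sketch of the MAD idea described in the text; k is assumed."""
    med = statistics.median(dvs)
    mad = statistics.median(abs(v - med) for v in dvs)
    if mad == 0:
        # Degenerate case: all values identical except exact mismatches.
        return [i for i, v in enumerate(dvs) if v != med]
    return [i for i, v in enumerate(dvs) if abs(v - med) / mad > k]

print(mad_outliers([1.0, 1.2, 0.9, 1.1, 10.0]))  # [4]
```

Only the stray value 10.0 is flagged; the near-consistent values survive, matching the text's use of outlier count as a variability signal.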
In general, positions where the path direction changes are the most likely to show printing-precision errors; when such an error occurs, the distance difference values of the actual pixel points at that position are larger, the number of outliers increases, and the printing precision of that part is worse.
The path deviation degree of the short edge line is calculated from the number of outliers, the range and standard deviation of the distance difference values, the number of target pixel points on the short edge line, the number of clusters, and the maximum number of actual pixel points contained in a cluster. Specifically, the number of actual pixel points in each cluster is counted and the maximum is obtained; the ratio of this maximum to the number of target pixel points on the short edge line is computed, and the ratio of the number of clusters to that ratio is taken as the first characteristic value. The product of the range of the distance difference values, their standard deviation, and the number of outliers is recorded as the second characteristic value. The product of the first and second characteristic values is the path deviation degree of the short edge line.
The path deviation degree is formulated as:

p = ( n2 / ( max{c_1, c_2, …, c_n2} / n1'' ) ) × ( s × r × n3 )

where p denotes the path deviation degree of the short edge line; n2 denotes the number of clusters; n1'' denotes the number of target pixel points on the short edge line; s denotes the standard deviation of the distance difference values; r denotes the range of the distance difference values; n3 denotes the number of outliers; c_1, c_2, and c_n2 denote the numbers of actual pixel points contained in the 1st, 2nd, and n2-th clusters; max{ } denotes the maximum-value function; n2 / (max{c_1, …, c_n2} / n1'') is the first characteristic value, and (s × r × n3) is the second characteristic value.
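The formula can be transcribed directly; `path_deviation` below is a hypothetical helper, shown only to make the two characteristic values concrete.

```python
def path_deviation(cluster_sizes, n_targets, s, r, n_outliers):
    """Path deviation degree p of one short edge line:
        p = (n2 / (max(c) / n1'')) * (s * r * n3)
    cluster_sizes: [c_1, ..., c_n2]; n_targets: n1''; s, r: standard
    deviation and range of the distance difference values; n_outliers: n3."""
    n2 = len(cluster_sizes)
    first = n2 / (max(cluster_sizes) / n_targets)   # first characteristic value
    second = s * r * n_outliers                     # second characteristic value
    return first * second

# Two clusters of sizes 8 and 2 over a 10-target line, s=0.5, r=2.0, one outlier:
print(path_deviation([8, 2], 10, 0.5, 2.0, 1))  # 2.5
```

A single dominant cluster (max(c) near n1'') and few outliers drive p toward 0, matching the interpretation paragraphs that follow.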
The path deviation degree reflects how far the actual path deviates from the standard path: the greater the deviation, the larger the path deviation degree, i.e., the lower the printing precision of the actual path; conversely, the smaller the deviation, the smaller the path deviation degree, i.e., the higher the printing precision of the actual path.
For all target pixel points on any one short edge line, the actual pixel points corresponding to the minimum distances are obtained. If their distance difference values tend to be consistent, more of them fall into the same cluster during clustering, i.e., max{c_1, c_2, …, c_n2} is larger and closer to the number n1'' of target pixel points on the short edge line; the larger the ratio max{c_1, …, c_n2} / n1'', the smaller the difference between the actual path corresponding to the actual pixel points and the standard path corresponding to the target pixel points, and the higher the printing precision. Likewise, when the distance difference values tend to be consistent, clustering yields fewer clusters, i.e., the number of clusters n2 is smaller, again indicating a smaller difference between the actual and standard paths and a higher printing precision.
The standard deviation s of the distance difference values reflects their dispersion: the higher the dispersion, the less consistent the distance difference values, the larger the difference between the actual path and the standard path, and the lower the printing precision; conversely, the lower the dispersion, the more consistent the values, the smaller the difference between the paths, and the higher the precision.
The range r of the distance difference values reflects the gap between the maximum and minimum distance difference values: the larger the gap, the less consistent the values, the larger the difference between the actual path and the standard path, and the lower the printing precision; conversely, the smaller the gap, the more consistent the values, the smaller the difference between the paths, and the higher the precision.
The number of outliers n3 reflects the variability of the distance difference values: the stronger the variability, the more outliers, the lower the corresponding printing precision, i.e., the larger the path deviation degree; the weaker the variability, the fewer outliers, the higher the precision, i.e., the smaller the path deviation degree.
From the above analysis, n2 is positively correlated with the path deviation degree; the ratio max{c_1, c_2, …, c_n2} / n1'' is negatively correlated with it; and the range r of the distance difference values, their standard deviation s, and the number of outliers n3 are each positively correlated with it. The calculation formula of the path deviation degree is obtained on this basis.
Finally, the deviation degree is obtained from the path deviation degree: the normalized value of the path deviation degree, lying in the interval [0, 1], is taken as the deviation degree, and in this embodiment the path deviation degree is normalized by an arctangent function. In actual operation the practitioner may select a normalization method according to the specific situation; normalization is a well-known technique and is not repeated.
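The arctangent normalization maps any p ≥ 0 into [0, 1), as sketched below; `deviation_degree` is a hypothetical name for this step.

```python
import math

def deviation_degree(p):
    """Normalize the path deviation degree p >= 0 into [0, 1) with
    an arctangent, as in the embodiment: (2 / pi) * atan(p)."""
    return (2.0 / math.pi) * math.atan(p)

print(deviation_degree(0.0))   # 0.0
print(deviation_degree(2.5))   # a value strictly between 0 and 1
```

The mapping is monotonic, so ordering the per-line deviation degrees (as the next step requires) preserves the ordering of the raw path deviation degrees.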
After the deviation degrees are obtained, the printing precision is judged: the deviation degrees corresponding to the short edge lines are arranged in order from small to large, and the difference between each pair of adjacent deviation degrees is calculated. If every such difference is less than the threshold, the printing precision is judged qualified, and printing of the next layer of hot-melt consumable can continue. If a difference is greater than or equal to the threshold, the printing precision is judged unqualified: starting from the larger of the two adjacent deviation degrees, the positions corresponding to all deviation degrees greater than or equal to it are considered to have unqualified printing precision, and printing of the next layer of hot-melt consumable can continue only after the relevant technicians adjust the printer. In this embodiment the threshold is 0.3, and the practitioner can adjust it according to the actual situation.
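The adjacent-gap judgment can be sketched as follows, with the embodiment's threshold 0.3 as the default; `judge_precision` is a hypothetical helper that returns only the qualified/unqualified verdict (locating the offending positions is omitted for brevity).

```python
def judge_precision(deviations, threshold=0.3):
    """Sort the per-line deviation degrees ascending and compare each
    adjacent gap with the threshold (0.3 in the embodiment).
    Returns True when every gap is below the threshold (qualified)."""
    ds = sorted(deviations)
    return all(b - a < threshold for a, b in zip(ds, ds[1:]))

print(judge_precision([0.10, 0.15, 0.20]))  # True  (all gaps < 0.3)
print(judge_precision([0.10, 0.60]))        # False (gap 0.5 >= 0.3)
```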
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application and are intended to be included within the scope of the application.

Claims (6)

1. A 3D printing inspection system, comprising:
The image processing module is used for acquiring an edge image corresponding to a current layer printing object when the 3D printer prints, marking edge pixel points in the edge image as actual pixel points, marking edge pixel points corresponding to a standard path in the edge image as first target pixel points, filling the edge pixel points in the edge image according to the standard path, marking the filled edge pixel points as second target pixel points, and marking the first target pixel points and the second target pixel points as target pixel points to obtain a corrected edge image;
the data processing module is used for detecting angular points of the corrected edge image, and when the position of the angular point is a target pixel point, each angular point divides an edge line corresponding to the target pixel point into each short edge line; for any one actual pixel point, calculating the distance between all the target pixel points and the actual pixel point, and calculating the distance difference value of the actual pixel point according to the minimum distance and the number of the target pixel points on the short edge line where the target pixel point corresponding to the minimum distance is located;
The detection module is used for calculating the distance between all the actual pixel points and any target pixel point on any short edge line, acquiring the actual pixel point corresponding to the minimum distance, further acquiring the actual pixel point corresponding to the minimum distance of all the target pixel points on the short edge line, and clustering the acquired actual pixel points according to the distance difference value to acquire each cluster; obtaining outliers in the actual pixel points according to the distance difference values, and calculating the path deviation degree of the short edge line according to the number of the outliers, the range and standard deviation corresponding to the distance difference values, the number of target pixel points on the short edge line, the number of all clusters and the maximum actual pixel points contained in the clusters; acquiring deviation degree according to the path deviation degree, and judging printing precision based on the deviation degree;
Before calculating the distance difference value of the actual pixel point, the method further comprises: judging whether the actual pixel point is in an area formed by all target pixel points, and if the actual pixel point is in the area, marking the actual pixel point as a first type pixel point; if the actual pixel point is outside the area, the actual pixel point is marked as a second type pixel point;
according to the minimum distance and the number of target pixel points on the short edge line where the target pixel points corresponding to the minimum distance are located, calculating the distance difference value of the actual pixel points includes:
when the actual pixel point is a first type pixel point, the distance is Manhattan distance, the ratio of the minimum Manhattan distance to the number of target pixel points on a short edge line where the target pixel point corresponding to the minimum Manhattan distance is located is calculated and recorded as a first ratio, and the product of the minimum Manhattan distance, the first ratio and the adjustment coefficient is calculated to obtain a distance difference value of the actual pixel point;
when the actual pixel point is the second type pixel point, the distance is the Euclidean distance; the ratio of the minimum Euclidean distance to the number of target pixel points on the short edge line where the target pixel point corresponding to the minimum Euclidean distance is located is calculated and recorded as a second ratio; the product of the minimum Euclidean distance, the second ratio and the adjustment coefficient is calculated, and the square root of the product is taken as the distance difference value of the actual pixel point; wherein the adjustment coefficient is greater than 0.
2. The 3D printing detection system according to claim 1, wherein the calculating the path deviation degree of the short edge line according to the number of outliers, the range and standard deviation corresponding to the distance difference value, the number of target pixel points on the short edge line, the number of all clusters, and the maximum number of actual pixel points contained in the clusters includes:
Counting the number of actual pixel points contained in each cluster, obtaining the maximum number of actual pixel points contained in the cluster, calculating the ratio of the maximum number of actual pixel points contained in the cluster to the number of target pixel points on the short edge line, and taking the ratio of the number of all clusters to the ratio as a first characteristic value; and calculating the product of the range corresponding to the distance difference value, the standard deviation corresponding to the distance difference value and the number of outliers, and recording the product as a second characteristic value, wherein the product of the first characteristic value and the second characteristic value is used as the path deviation degree of the short edge line.
3. The 3D printing detection system according to claim 1, wherein the method for obtaining the outlier in the actual pixel according to the distance difference value comprises: sorting the distance difference values corresponding to the actual pixel points according to the manually set sequence to obtain a distance difference value sequence, acquiring the removed distance difference value in the distance difference value sequence by using an MAD algorithm, and taking the actual pixel points corresponding to the removed distance difference value as outliers.
4. The 3D printing detection system according to claim 1, wherein the method for obtaining the deviation according to the path deviation is as follows: and taking the normalized value of the path deviation degree as the deviation degree.
5. The 3D printing detection system according to claim 1, wherein the method for clustering the obtained actual pixel points according to the distance difference value to obtain each cluster comprises the following steps: and clustering the obtained actual pixel points by using a DBSCAN algorithm to obtain each cluster by taking the distance difference value as the obtained actual pixel point value.
6. The 3D printing detection system according to claim 1, wherein the method for determining printing accuracy based on the degree of deviation is: arranging the deviation degrees corresponding to the short edge lines in order from small to large, and calculating the difference between each pair of adjacent deviation degrees; if the difference is less than a threshold, the printing precision is judged to be qualified; if the difference is greater than or equal to the threshold, the printing precision is judged to be unqualified.
CN202310192167.1A 2023-03-02 2023-03-02 3D prints detecting system Active CN116277973B (en)

Publications (2)

Publication Number Publication Date
CN116277973A CN116277973A (en) 2023-06-23
CN116277973B true CN116277973B (en) 2024-05-28





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant