CN117333486B - UV finish paint performance detection data analysis method, device and storage medium

UV finish paint performance detection data analysis method, device and storage medium

Info

Publication number: CN117333486B (application number CN202311626281.7A)
Authority: CN (China)
Prior art keywords: data, image sample, image, sample data, category
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN117333486A
Inventor: name withheld at the inventor's request
Current Assignee: Qingyuan Oppein Integrated Home Co ltd
Original Assignee: Qingyuan Oppein Integrated Home Co ltd
Application filed by Qingyuan Oppein Integrated Home Co ltd
Priority to CN202311626281.7A
Publication of CN117333486A
Application granted
Publication of CN117333486B

Classifications

    • G06T7/0004 — Image analysis; inspection of images; industrial image inspection
    • G06N20/00 — Machine learning
    • G06Q10/063 — Administration; operations research, analysis or management
    • G06Q50/04 — ICT specially adapted for business processes of manufacturing
    • G06V10/44 — Local feature extraction (edges, contours, corners, connectivity analysis)
    • G06V10/54 — Extraction of image or video features relating to texture
    • G06V10/56 — Extraction of image or video features relating to colour
    • G06V10/764 — Image or video recognition using classification (pattern recognition or machine learning)
    • G06T2207/20081 — Indexing scheme for image analysis: training; learning
    • G06T2207/30108 — Indexing scheme for image analysis: industrial image inspection
    • Y02T90/00 — Enabling technologies with a potential or indirect contribution to GHG emissions mitigation


Abstract

The invention provides a UV finish paint performance detection data analysis method, device and storage medium, relating to the technical field of intelligent data processing. The method comprises the following steps: acquiring and processing image sample data of the UV finish paint through image acquisition equipment; performing finish paint category judgment on the processed image sample data of the UV finish paint based on a UV finish paint category judgment algorithm; acquiring component data of the UV finish paint; analyzing the association relation between the finish paint category and the component data to obtain an association result; and adjusting and optimizing the component data of the UV finish paint according to the obtained association result. The invention can automatically judge the finish paint category from the processed UV finish paint image data, reduce human error, improve the accuracy and consistency of the judgment, and help improve the competitiveness of enterprises.

Description

UV finish paint performance detection data analysis method, device and storage medium
Technical Field
The invention relates to the technical field of intelligent data processing, in particular to a method and a device for analyzing UV finish paint performance detection data and a storage medium.
Background
UV finish paint, i.e. ultraviolet-curable finish paint, is a new type of environment-friendly finish. During curing, ultraviolet irradiation causes the photoinitiator in the finish to generate free radicals, which polymerize the resin containing unsaturated double bonds so that the paint film cures rapidly. UV finish paint has many advantages: it cures quickly, finishing within a few seconds and greatly shortening the production cycle; the cured paint film has high gloss, high hardness, good weather resistance, wear resistance and chemical resistance; and because no solvent evaporates during curing, it is a genuinely environment-friendly coating with zero solvent pollution. UV finish paint is used very widely, including in the furniture, flooring, decorative board, plastic, metal, paper and electronic product industries. In these industries, UV finish paint not only provides a good decorative effect but also protects the substrate, extending its service life.
With the development of society, expectations for the appearance quality of products keep rising, and better decoration, protection and durability are pursued. However, because of factors such as raw materials, formulations and production processes, the performance of UV finish paint varies and product consistency is difficult to guarantee. Scientific performance detection of UV finish paint is therefore particularly necessary.
At present, performance detection of UV finish paint relies mainly on manual physical and chemical index tests, such as tests of colour uniformity, gloss and smoothness. This approach is time-consuming and labour-intensive, has low efficiency, makes it difficult to judge the relationships between different indexes, and is not suited to evaluating the overall performance of the UV finish paint. Moreover, the relationships among the components of the UV finish paint are hard to clarify, and the relationship between the components and the finish performance cannot be determined, which makes performance optimization of the UV finish paint complex and difficult.
For the problems in the related art, no effective solution has been proposed at present.
Disclosure of Invention
In view of the above, the invention provides a UV finish paint performance detection data analysis method, device and storage medium, so as to solve the problems that manual physicochemical index testing is time-consuming and labour-intensive, has low efficiency, and makes it difficult to evaluate the overall performance of the UV finish paint.
In order to solve the problems, the invention adopts the following specific technical scheme:
according to a first aspect of the present invention, there is provided a UV topcoat performance detection data analysis method comprising the steps of:
s1, acquiring and processing image sample data of UV finishing paint through image acquisition equipment;
S2, performing finish paint type judgment on the processed image sample data of the UV finish paint based on a UV finish paint type judgment algorithm;
s3, acquiring component data of the UV finishing paint;
s4, analyzing the association relation between the finishing paint category and the component data to obtain an association result;
and S5, adjusting and optimizing the component data of the UV finishing paint according to the obtained association result.
Preferably, the image sample data of the UV topcoat is acquired by an image acquisition device and processed comprising the steps of:
s11, preparing UV finishing paint detection samples with different components and performances, and performing image acquisition on the UV finishing paint through an image acquisition device according to a standard operation procedure to obtain image sample data;
s12, preprocessing the obtained image sample data, wherein the preprocessing comprises image denoising, image enhancement and color space conversion;
s13, performing outlier detection on the preprocessed image sample data, and deleting the detected outlier data;
s14, carrying out data arrangement and generalization on the residual image sample data.
Preferably, the outlier detection of the preprocessed image sample data and the deletion of the detected outlier data includes the steps of:
S131, randomly selecting image sample data from the preprocessed image sample data, and creating an isolation tree through a random segmentation attribute;
s132, calculating the path length of each image sample data point in the image sample data in each isolation tree;
s133, carrying out normalization processing on the obtained path length, and calculating an outlier score of each image sample data point;
s134, comparing the outlier score of each image sample data point with a preset threshold value, and marking the image sample data points with the outlier scores larger than the preset threshold value as outliers;
s135, deleting the image sample data points marked with the outliers from the image sample data.
Preferably, calculating the path length of each image sample data point in the image sample data in each isolation tree comprises the steps of:
s1321, initializing the path length of each image sample data point to be zero;
s1322, starting from the root node of each isolation tree, finding the child node to which the image sample data point belongs according to the attribute value of the image sample data point, and adding one to the path length;
s1323, setting the current node as the found child node, and repeating the steps S1322-S1323 until the leaf node is reached;
S1324, for each image sample data point, calculating an average value of the path lengths of the image sample data points in all the isolation trees, and taking the average value of the path lengths as the path lengths of the image sample data points in the isolation trees.
Preferably, the performing the finish type judgment on the processed image sample data of the UV finish based on the UV finish type judgment algorithm includes the following steps:
s21, collecting and processing UV finish paint image training data containing different categories;
s22, extracting feature vectors of the processed UV finishing coat image training data, and taking the extracted feature vectors as sample training data;
s23, calculating the hypersphere center of each category in the sample training data through a UV finishing paint category judgment algorithm;
s24, determining the radius of each hypersphere by calculating a weighted Mahalanobis distance;
s25, constructing a hypersphere manifold of each category based on the hypersphere center of each category and the radius of each hypersphere;
s26, inputting the image sample data into the hypersphere manifold, finding the nearest manifold center of the hypersphere manifold, and judging the type of the image sample data according to the nearest manifold center.
Preferably, determining the radius of each hypersphere by calculating the weighted Mahalanobis distance comprises the steps of:
S241, carrying out data standardization processing on each feature in the sample training data by a maximum and minimum value method;
s242, calculating an information entropy value and a feature weight of each feature in the sample training data;
s243, calculating covariance matrixes of each category according to the sample training data;
s244, calculating a weighted Mahalanobis distance from each sample training data to the class center based on the covariance matrix and the feature matrix;
s245, taking the maximum value of the weighted Mahalanobis distances of all the sample training data as the radius of the hypersphere.
Preferably, the calculation formula for the weighted Mahalanobis distance from each sample training data to the category center, based on the covariance matrix and the feature matrix, is expressed with the following notation:
wd represents the weighted Mahalanobis distance from each sample training data to the category center;
f represents the total number of features in the feature vector of the sample training data;
w_t represents the weight of feature t;
σ_t represents the standard deviation of feature t;
x_it represents the value of feature t in sample training data x_i;
x_jt represents the value of feature t in sample training data x_j;
Ω represents the feature weight matrix;
Σ^(-1) represents the inverse of the covariance matrix.
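Under the assumption that Ω is the diagonal matrix whose entries are the feature weights w_t divided by the standard deviations σ_t — one reading consistent with the symbols above, not a form fixed by the text — the weighted Mahalanobis distance can be written as:

$$
wd(x_i, x_j)
= \sqrt{\sum_{t=1}^{f}\sum_{s=1}^{f}
\frac{w_t\,(x_{it}-x_{jt})}{\sigma_t}\,\bigl[\Sigma^{-1}\bigr]_{ts}\,
\frac{w_s\,(x_{is}-x_{js})}{\sigma_s}}
= \sqrt{(x_i-x_j)^{\mathrm{T}}\,\Omega\,\Sigma^{-1}\,\Omega\,(x_i-x_j)}
$$

Taking x_j to be the category center gives the distance used in step S244.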
Preferably, the analysis of the association relationship between the top paint category and the component data to obtain the association result includes the following steps:
S41, calculating a correlation coefficient between the finishing paint category and the component data through the Pearson correlation coefficient;
s42, carrying out correlation judgment according to the obtained correlation coefficient;
s43, determining the relation between the finishing paint category and the component data according to the correlation judgment result to obtain a correlation result.
According to a second aspect of the present invention, there is provided a UV topcoat performance detection data analysis device comprising: the system comprises a data acquisition processing module, a finish paint category judging module, a component data acquisition module, an association relation analysis module and an attribute adjustment optimization module;
the data acquisition processing module is used for acquiring and processing image sample data of the UV finishing paint through the image acquisition equipment;
the finishing paint type judging module is used for judging the finishing paint type of the processed image sample data of the UV finishing paint based on a UV finishing paint type judging algorithm;
the component data acquisition module is used for acquiring the component data of the UV finish paint;
the association relation analysis module is used for analyzing the association relation between the finishing paint category and the component data to obtain an association result;
and the attribute adjustment optimization module is used for adjusting and optimizing the component data of the UV finish paint according to the obtained association result.
According to a third aspect of the present invention, there is provided a computer readable storage medium comprising computer program instructions which, when run on a computer, cause the computer to perform the UV finish performance detection data analysis method of the first aspect.
The beneficial effects of the invention are as follows:
1. According to the invention, the image sample data of the UV finish paint is acquired through the image acquisition equipment and processed, so that the finish paint information can be accurately obtained. The finish paint category judgment can be carried out automatically on the processed UV finish paint image data based on the UV finish paint category judgment algorithm, which reduces human error and improves the accuracy and consistency of the judgment. The component data of the UV finish paint is acquired, and by analyzing the association relation between the finish paint category and the component data, the component data can be adjusted and optimized, so that the production process is optimized, product quality is improved and production cost is reduced. Through the association result, enterprises can better understand the relation between finish paint categories and components, which provides data support for production decisions and improves the competitiveness of enterprises.
2. According to the invention, UV finish paint detection samples with different components and performances are prepared and images are acquired by the image acquisition equipment according to a standard operating procedure, ensuring the acquisition quality and consistency of the images. Preprocessing by image denoising, image enhancement and color space conversion improves image data quality and analysis accuracy; outlier detection and deletion with an isolation forest algorithm handles outliers effectively and improves data reliability; and data arrangement and generalization improve data usability, providing high-quality data support for subsequent UV finish paint performance detection and data analysis.
3. According to the invention, converting the UV finish paint image data into feature vectors allows information useful for classification to be extracted from the images while unnecessary information is removed, improving classification accuracy. The Mahalanobis distance takes the distribution and correlation of the data into account, and different features can be given different importance through the weight setting, further improving classification accuracy. The classification boundary can easily be adjusted by changing the center and radius of the hypersphere, and effective classification and identification provide important reference information in links such as production and quality inspection, improving working efficiency and product quality.
4. According to the invention, correlation judgment based on the obtained correlation coefficients makes it possible to determine which finish paint categories are significantly associated with specific component data, and the relationship between finish paint categories and component data is determined from the correlation judgment result to obtain the association result, which is of great value for optimizing product performance and improving production efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. In the drawings:
FIG. 1 is a flow chart of a method of analyzing UV topcoat performance detection data according to an embodiment of the present invention;
fig. 2 is a functional block diagram of a UV topcoat performance detection data analysis device according to an embodiment of the present invention.
In the figure:
1. a data acquisition processing module; 2. a finishing paint category judging module; 3. a component data acquisition module; 4. the association relation analysis module; 5. and the attribute adjustment optimization module.
Detailed Description
In order to make the technical solutions in the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, based on the embodiments herein, which would be apparent to one of ordinary skill in the art without undue burden are intended to be within the scope of the present application.
According to the embodiment of the invention, a method and a device for analyzing UV finish paint performance detection data and a storage medium are provided.
The invention will now be further described with reference to the drawings and the detailed description.
As shown in fig. 1, according to a first embodiment of the present invention, there is provided a UV topcoat performance detection data analysis method including the steps of:
s1, acquiring and processing image sample data of the UV finishing paint through an image acquisition device.
As a preferred embodiment, acquiring image sample data of the UV topcoat by the image acquisition device and processing comprises the steps of:
s11, preparing UV finishing paint detection samples with different components and performances, and performing image acquisition on the UV finishing paint through an image acquisition device according to a standard operation procedure to obtain image sample data.
It should be noted that various UV topcoats required in actual production, including UV topcoats of different compositions and having different properties, were prepared as a series of UV topcoats test samples.
For example, the components of the UV topcoat mainly include UV curing agents, resins, pigments, additives, etc., and the properties of the UV topcoat mainly refer to color uniformity, glossiness, smoothness, bubble rate, etc.
The image acquisition device, such as a high-definition camera or a scanner, is used for carrying out image acquisition on the UV finish paint samples according to standard operation procedures, wherein the operation procedures comprise factors such as acquisition distance, light conditions, acquisition angles and the like so as to ensure that the acquired images are clear and accurately reflect the real conditions of the UV finish paint.
S12, preprocessing the obtained image sample data, wherein the preprocessing comprises image denoising, image enhancement, color space conversion and the like.
Image denoising refers to removing noise from the image with filters such as mean filtering or median filtering; denoising improves the reliability of subsequent analysis.
Image enhancement refers to increasing image contrast through histogram equalization, which highlights surface details.
Color space conversion refers to converting the RGB color image into other color spaces, such as the HSV space, to separate information such as hue and saturation; converting the color space makes it easier to analyze the color characteristics of the image.
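Purely as an illustration, the three preprocessing operations above can be sketched with OpenCV; the file path, kernel size and colour conversions below are placeholder choices consistent with, but not prescribed by, the description.

```python
import cv2

# Load a captured UV finish paint image (the path is a placeholder).
img = cv2.imread("uv_topcoat_sample.png")

# Image denoising: a median filter suppresses impulse noise
# (a mean/box filter via cv2.blur would be the other option mentioned above).
denoised = cv2.medianBlur(img, 5)

# Image enhancement: histogram equalization on the luminance channel
# raises contrast and highlights surface details without shifting colour.
ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
enhanced = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)

# Color space conversion: move to HSV so hue and saturation
# can be analysed separately from brightness.
hsv = cv2.cvtColor(enhanced, cv2.COLOR_BGR2HSV)
```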
S13, performing outlier detection on the preprocessed image sample data, and deleting the detected outlier data.
As a preferred embodiment, performing outlier detection on the preprocessed image sample data and deleting the detected outlier data includes the steps of:
s131, randomly selecting image sample data from the preprocessed image sample data, and creating an isolation tree through the random segmentation attribute.
Specifically, N images are randomly selected from the preprocessed image sample data set as the sample set for creating an isolation tree. For each selected image sample, an L-dimensional feature attribute (covering features such as the colour, texture and shape of the image) is extracted as the attributes of the sample. Before each isolation tree is built, one feature dimension is randomly chosen as the segmentation attribute and a segmentation point is randomly generated on it; the sample set is then split at that point into a left subset and a right subset, which become the left and right child nodes. Random selection of segmentation attributes and generation of segmentation points are applied recursively to build the isolation tree.
S132, calculating the path length of each image sample data point in the image sample data in each isolation tree.
As a preferred embodiment, calculating the path length of each image sample data point in the image sample data in each isolation tree comprises the steps of:
s1321, for each image sample data point, initializing its path length to zero.
It should be noted that initializing the path length means that, before the path length of each sample is calculated, the path lengths of all samples are set to 0 as the starting point of the calculation; clearing any previous path length ensures that each calculation starts from 0 and yields the correct path length of the sample in the current isolation tree.
S1322, starting from the root node of each isolation tree, finding the child node to which the image sample data point belongs according to the attribute value of the image sample data point, and adding one to the path length.
It should be noted that the starting node of a tree is called the root node; in a tree data structure the root node has no parent node, only child nodes, and a node one level below another node is called its child node. When traversing each level of the tree, the child node is determined according to the attributes of the sample, and the path length is accumulated.
S1323, setting the current node as the found child node, and repeating the steps S1322-S1323 until the leaf node is reached.
For example, repeating steps S1322-S1323 means selecting a child node of the current node, updating the current node to this newly found child node, and repeating these two steps until a leaf node is reached. A leaf node is a node without child nodes, so when a leaf node is reached step S1322 can no longer be performed because there is no child node to select; at that point the traversal from the root node to the leaf node is complete.
S1324, for each image sample data point, calculating an average value of the path lengths of the image sample data points in all the isolation trees, and taking the average value of the path lengths as the path lengths of the image sample data points in the isolation trees.
Specifically, the path length refers to the number of edges that pass from the root node to the image sample data point, and since the isolation forest is composed of a plurality of isolation trees, the path lengths in each tree are added and then divided by the number of trees to obtain an average value of the path lengths, and the average value is used as the path length of the image sample in the isolation tree.
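A minimal sketch of steps S1321-S1324, assuming the isolation trees have already been built as simple node objects; the IsoNode class below is illustrative and not part of the method.

```python
import numpy as np

class IsoNode:
    """One node of an isolation tree: either an internal split or a leaf."""
    def __init__(self, split_dim=None, split_value=None, left=None, right=None):
        self.split_dim = split_dim      # feature index used for the split
        self.split_value = split_value  # split point on that feature
        self.left = left                # child for values below the split
        self.right = right              # child for values at or above the split

    def is_leaf(self):
        return self.left is None and self.right is None

def path_length(x, root):
    """Edges traversed from the root to the leaf that x falls into (S1321-S1323)."""
    length, node = 0, root            # S1321: start from zero at the root
    while not node.is_leaf():
        node = node.left if x[node.split_dim] < node.split_value else node.right
        length += 1                   # S1322: one edge per level descended
    return length

def average_path_length(x, forest):
    """S1324: average the path length of x over all isolation trees."""
    return float(np.mean([path_length(x, tree) for tree in forest]))
```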
And S133, normalizing the obtained path length to be within the range of [0,1], and calculating an outlier score of each image sample data point.
In particular, the purpose of normalization is to make the path lengths of all data points within the same comparison range, and then calculate outlier scores by measuring the similarity or distance between each data point and the preset reference data.
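The scoring rule is not spelled out above; the standard isolation-forest score, which already lies in [0, 1] and uses the averaged path length E[h(x)] from step S1324, is one consistent choice:

$$
s(x,n) = 2^{-\frac{E[h(x)]}{c(n)}}, \qquad
c(n) = 2H(n-1) - \frac{2(n-1)}{n}, \qquad
H(m) \approx \ln m + 0.5772
$$

where n is the sub-sample size used to build each tree and H(m) is the m-th harmonic number; scores close to 1 indicate likely outliers, while scores well below 0.5 indicate normal points.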
S134, comparing the outlier score of each image sample data point with a preset threshold value, and marking the image sample data points with the outlier scores larger than the preset threshold value as outliers.
For example, if the outlier score of a data point is greater than a preset threshold, it is marked as an outlier, and if the outlier score of the data point is less than or equal to the preset threshold, it will be considered a normal point.
S135, deleting the image sample data points marked with the outliers from the image sample data.
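For comparison, steps S131-S135 can also be approximated with scikit-learn's off-the-shelf IsolationForest; the synthetic feature matrix, forest size and threshold quantile below are placeholders rather than values prescribed by the method.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# features: one row of L-dimensional image features per preprocessed sample
# (colour, texture, shape descriptors, etc.); random data stands in here.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 16))

forest = IsolationForest(n_estimators=100, max_samples=64, random_state=0)
forest.fit(features)                       # S131-S132: build the isolation trees

scores = -forest.score_samples(features)   # higher value = more anomalous
threshold = np.quantile(scores, 0.95)      # S134: preset threshold (placeholder)
keep_mask = scores <= threshold            # S135: drop points above the threshold
clean_features = features[keep_mask]
```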
S14, carrying out data arrangement and generalization on the residual image sample data.
Specifically, UV finish paint detection samples with different components and performances are prepared, image acquisition is carried out through image acquisition equipment according to a standard operation program, the acquisition quality and consistency of images are ensured, the quality and analysis accuracy of image data are improved through image denoising, image enhancement and color space conversion pretreatment, outlier detection and deletion are carried out by adopting an isolated forest algorithm, outliers are effectively processed, the reliability of data is improved, and the usability of the data is improved through data arrangement and induction, so that high-quality data support is provided for subsequent UV finish paint performance detection and data analysis.
S2, performing finish paint type judgment on the processed image sample data of the UV finish paint based on a UV finish paint type judgment algorithm.
The judgment of the type of the finish paint is mainly to judge the color uniformity, glossiness, smoothness and bubble rate of the finish paint.
As a preferred embodiment, the performing the topcoat category judgment on the processed image sample data of the UV topcoat based on the UV topcoat category judgment algorithm includes the steps of:
s21, collecting and processing UV finishing coat image training data containing different types.
It should be noted that a large number of images containing different types of UV finishes are collected, and then the image processing in step S1 is performed.
S22, extracting feature vectors of the processed UV finishing coat image training data, and taking the extracted feature vectors as sample training data.
Specifically, for each piece of UV finish paint image training data, a feature extraction algorithm is used to extract a feature vector of the image, giving a numeric representation of the image; these image feature vectors are then used as the sample training data.
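The feature extraction algorithm is not fixed here; as one hedged example, a normalized HSV colour histogram concatenated with simple grey-level statistics yields the kind of numeric feature vector described (the bin count and the chosen statistics are placeholders).

```python
import cv2
import numpy as np

def image_feature_vector(bgr_img, bins=16):
    """Illustrative feature vector: HSV colour histograms + grey-level statistics."""
    hsv = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2HSV)
    ranges = [(0, 180), (0, 256), (0, 256)]        # OpenCV hue runs 0-179
    hists = [cv2.calcHist([hsv], [c], None, [bins], list(r)).ravel()
             for c, r in enumerate(ranges)]
    colour = np.concatenate(hists)
    colour = colour / (colour.sum() + 1e-12)       # normalise to a distribution
    gray = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2GRAY).astype(np.float64)
    stats = np.array([gray.mean(), gray.std()])    # crude gloss/smoothness proxies
    return np.concatenate([colour, stats])
```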
S23, calculating the hypersphere center of each category in the sample training data through a UV finishing paint category judging algorithm.
Specifically, the UV finish paint category judgment algorithm in the application is mainly based on the classification idea of an iterative self-organizing clustering algorithm, and calculating the hypersphere center of each category in the sample training data comprises the following steps:
Step one, selecting K initial clustering center points, and distributing sample training data to the closest clustering center points to form K clustering clusters.
And step two, calculating a new cluster center point of each cluster, which is usually the average value of all samples in the cluster.
And thirdly, finding out two closest clustering center points, calculating the midpoints of the two closest clustering center points as new clustering center points, and combining the two new clustering center points into one cluster.
And step four, repeating the step two to the step four until a certain termination condition is met, wherein the center of the super sphere of each category is composed of K-1 clustering center points.
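A compact sketch of steps one to four, using scikit-learn's KMeans for the initial clustering step; K, the merge loop's termination condition and the random seed are placeholder choices rather than values fixed by the text.

```python
import numpy as np
from sklearn.cluster import KMeans

def hypersphere_centers(samples, k=4, min_clusters=2):
    """Steps 1-4: k-means centres, then repeatedly merge the two closest centres."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(samples)
    centers = list(km.cluster_centers_)          # steps one and two
    while len(centers) > min_clusters:           # step four: simple termination rule
        # Step three: find the two closest centres and replace them by their midpoint.
        d = [(np.linalg.norm(centers[i] - centers[j]), i, j)
             for i in range(len(centers)) for j in range(i + 1, len(centers))]
        _, i, j = min(d)
        merged = (centers[i] + centers[j]) / 2.0
        centers = [c for t, c in enumerate(centers) if t not in (i, j)] + [merged]
    return np.array(centers)
```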
S24, determining the radius of each hypersphere by calculating the weighted Mahalanobis distance.
As a preferred embodiment, determining the radius of each hypersphere by calculating the weighted Mahalanobis distance comprises the steps of:
s241, carrying out data standardization processing on each feature in the sample training data through a maximum and minimum value method.
S242, calculating the information entropy value and the feature weight of each feature in the sample training data.
In particular, the information entropy value of each feature is calculated to evaluate its importance for distinguishing between the different categories, information entropy is typically used for feature selection, and based on the information entropy values, the weights of the features can be calculated, wherein features with lower information entropy are typically given higher weights, as they have a greater impact on classification.
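One standard way to compute these quantities (an assumption, as the text does not fix the exact formulas) is the entropy-weight method, which matches the statement above that low-entropy features receive higher weights; for n training samples and the min-max standardized values x_it from step S241:

$$
p_{it} = \frac{x_{it}}{\sum_{i=1}^{n} x_{it}}, \qquad
e_t = -\frac{1}{\ln n}\sum_{i=1}^{n} p_{it}\ln p_{it}, \qquad
w_t = \frac{1 - e_t}{\sum_{s=1}^{f}\left(1 - e_s\right)}
$$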
S243, calculating covariance matrixes of each category according to the sample training data.
S244, calculating the weighted Mahalanobis distance from each sample training data to the class center based on the covariance matrix and the feature matrix.
As a preferred embodiment, the calculation formula for the weighted Mahalanobis distance from each sample training data to the category center, based on the covariance matrix and the feature matrix, is expressed with the following notation:
where wd represents the weighted Mahalanobis distance from each sample training data to the category center, f represents the total number of features in the feature vector of the sample training data, w_t represents the weight of feature t, σ_t represents the standard deviation of feature t, x_it represents the value of feature t in sample training data x_i, x_jt represents the value of feature t in sample training data x_j, Ω represents the feature weight matrix, and Σ^(-1) represents the inverse of the covariance matrix.
S245, taking the maximum value of the weighted Mahalanobis distances of all the sample training data as the radius of the hypersphere.
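A numpy sketch of steps S243-S245, under the diagonal-weight reading of Ω (feature weights divided by standard deviations) used in the reconstruction given earlier; the entropy-derived weights are assumed to have been computed in S242, and the min-max standardization of S241 is assumed already applied.

```python
import numpy as np

def class_radius(samples, center, weights):
    """S243-S245: weighted Mahalanobis distances to the class centre; radius = max."""
    cov = np.cov(samples, rowvar=False)             # S243: covariance matrix of the class
    cov_inv = np.linalg.pinv(cov)                   # pseudo-inverse guards against singularity
    stds = samples.std(axis=0) + 1e-12              # avoid division by zero
    omega = np.diag(weights / stds)                 # assumed form of the weight matrix
    metric = omega @ cov_inv @ omega                # weighted metric of the distance
    diffs = samples - center                        # one row per training sample
    wd = np.sqrt(np.einsum("ij,jk,ik->i", diffs, metric, diffs))   # S244: distances
    return wd.max(), metric                         # S245: radius, plus the metric for reuse
```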
S25, constructing a hypersphere manifold of each category based on the hypersphere center of each category and the radius of each hypersphere.
It should be noted that, the hypersphere manifold is a method for representing data distribution in a multidimensional space; the main idea is to consider the data as a set of points in a high dimensional space and assume that each data class can be represented by a hypersphere. In this approach, the center of the hypersphere represents the center or mean of the class, while the radius of the hypersphere can be expressed as the data distribution range or variance of the class.
Specifically, taking the hypersphere center as the center coordinate and the hypersphere radius as the coverage radius, a hypersphere manifold mathematical model centered on that point and covering that radius is constructed, finally yielding the hypersphere model corresponding to each category.
S26, inputting the image sample data into the hypersphere manifold, finding the nearest manifold center of the hypersphere manifold, and judging the type of the image sample data according to the nearest manifold center.
Specifically, the distance between the image sample data and the center of each category's hypersphere is calculated, and the category is determined from the nearest hypersphere center.
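Continuing the sketches above, step S26 then reduces to a nearest-centre test; the dictionary layout (label, center, metric, radius) is an assumed data structure, with the per-class metric and radius coming from a helper such as class_radius above, not a structure prescribed by the method.

```python
import numpy as np

def classify(feature_vec, hyperspheres):
    """hyperspheres: list of dicts with 'label', 'center', 'metric', 'radius' per class."""
    best, best_dist = None, np.inf
    for sphere in hyperspheres:
        d = feature_vec - sphere["center"]
        dist = float(np.sqrt(d @ sphere["metric"] @ d))  # weighted Mahalanobis distance
        if dist < best_dist:
            best, best_dist = sphere, dist
    inside = best_dist <= best["radius"]   # does the sample fall within the nearest hypersphere?
    return best["label"], inside
```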
Specifically, converting the UV finish paint image data into feature vectors allows information useful for classification to be extracted from the images while unnecessary information is discarded, improving classification accuracy; the Mahalanobis distance takes the distribution and correlation of the data into account, and different features can be given different importance through the weight setting, further improving classification accuracy; the classification boundary can easily be adjusted by changing the center and radius of the hypersphere; and effective classification and identification provide important reference information in links such as production and quality inspection, improving working efficiency and product quality.
And S3, acquiring component data of the UV finishing paint.
Before the component data of the UV finish is obtained, a repository of the UV finish product image with the category label and the component data corresponding to the UV finish product image needs to be established.
Then, for each judged category, the data record corresponding to the image is found, and the component data of the UV finish paint to which the image belongs is extracted from the corresponding data record.
And S4, analyzing the association relation between the finishing paint category and the component data to obtain an association result.
As a preferred embodiment, analyzing the association relationship between the top coat category and the component data to obtain an association result includes the steps of:
s41, calculating the correlation coefficient between the finishing paint category and the component data through the Pearson correlation coefficient.
It should be noted that the Pearson correlation coefficient, also called Pearson's correlation, is a statistical method for measuring the strength and direction of a linear relationship between two numerical variables; it is generally used to determine the degree of correlation between two variables, and the coefficient ranges from -1 to 1.
S42, carrying out correlation judgment according to the obtained correlation coefficient.
Specifically, the relationship between the class of the top coat and the component data is judged according to the value of the correlation coefficient, and in general, the value of the correlation coefficient is between-1 and 1, and the specific judgment standard is as follows:
If the correlation coefficient is close to 1, positive correlation is indicated, i.e., there is a strong positive correlation between the topcoat class and the component data.
If the correlation coefficient is close to-1, a negative correlation is indicated, i.e., there is a strong negative correlation between the topcoat class and the component data.
If the correlation coefficient is close to 0, no linear correlation is indicated, i.e., no obvious linear relationship between the topcoat class and the composition data.
S43, determining the relation between the finishing paint category and the component data according to the correlation judgment result to obtain a correlation result.
Specifically, positive correlation: if the correlation coefficient is positive, approaching 1, it indicates that the component data and the topcoat class have positive correlation, i.e., the topcoat class also has a similar trend as the component data changes. This means that the choice of different composition data may have a significant effect on the type of lacquer.
Negative correlation: if the correlation coefficient is negative, approaching-1, it indicates that the composition data and the topcoat category have a negative correlation, i.e., as the composition data changes, the topcoat category has an opposite trend, meaning that different composition data have opposite effects on the topcoat category.
No correlation exists: if the correlation coefficient is close to 0, it indicates that there is no linear correlation between the topcoat category and the component data, and selecting different topcoat components has no significant effect on the topcoat category.
Specifically, the correlation judgment is performed according to the obtained correlation coefficient, which finish paint categories are obviously associated with specific component data can be determined, and the relationship between the finish paint categories and the component data is determined according to the correlation judgment result, so that the correlation result is obtained, and the method has important values in the aspects of optimizing the product performance, improving the production efficiency and the like.
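A short sketch of steps S41-S42 using scipy; the numeric encoding of the category, the component columns and the ±0.7 cut-offs are illustrative assumptions, since the text only speaks of coefficients "close to" 1, -1 or 0.

```python
import numpy as np
from scipy import stats

# Placeholder data: a numeric category score per sample and two component variables.
category = np.array([1, 1, 2, 2, 3, 3, 3, 2])
components = {
    "photoinitiator_pct": np.array([2.0, 2.1, 3.0, 3.2, 4.1, 4.0, 4.3, 3.1]),
    "resin_pct":          np.array([60.0, 59.5, 55.0, 54.0, 50.0, 51.0, 49.5, 54.5]),
}

for name, values in components.items():
    r, p_value = stats.pearsonr(category, values)   # S41: Pearson correlation coefficient
    if r > 0.7:
        verdict = "strong positive correlation"
    elif r < -0.7:
        verdict = "strong negative correlation"
    else:
        verdict = "weak or no linear correlation"
    print(f"{name}: r={r:.2f} (p={p_value:.3f}) -> {verdict}")   # S42: correlation judgment
```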
And S5, adjusting and optimizing the component data of the UV finishing paint according to the obtained association result.
In particular, if the correlation results indicate that a particular topcoat category is associated with certain component data, adjustments to the formulation and production process of the topcoat may be attempted to improve such component data or attributes, including, for example, but not limited to, adjusting the composition, concentration, changing curing conditions, and the like.
As shown in fig. 2, according to a second embodiment of the present invention, there is provided a UV topcoat performance detection data analysis device including: the system comprises a data acquisition processing module 1, a finish paint category judging module 2, a component data acquisition module 3, an association relation analysis module 4 and an attribute adjustment optimizing module 5.
The data acquisition processing module 1 is used for acquiring and processing image sample data of the UV finishing paint through the image acquisition equipment.
The finishing paint type judging module 2 is used for judging the finishing paint type of the processed image sample data of the UV finishing paint based on a UV finishing paint type judging algorithm.
And the component data acquisition module 3 is used for acquiring the component data of the UV finish paint.
And the association relation analysis module 4 is used for analyzing the association relation between the finishing paint category and the component data to obtain an association result.
And the attribute adjustment optimization module 5 is used for adjusting and optimizing the component data of the UV finish paint according to the obtained association result.
The UV finish performance detection data analysis device can execute the UV finish performance detection data analysis method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method. Technical details not described in detail in this embodiment can be seen in the analysis method for UV topcoat performance detection data provided in any embodiment of the present invention.
Since the above-described UV topcoat performance detection data analysis device is a device capable of executing the data analysis method in the embodiment of the present invention, based on the data analysis method described in the embodiment of the present invention, those skilled in the art will be able to understand the specific implementation of the UV topcoat performance detection data analysis device in the embodiment of the present invention and various modifications thereof, so how the UV topcoat performance detection data analysis device implements the UV topcoat performance detection data analysis method in the embodiment of the present invention will not be described in detail herein. The device adopted by the method for analyzing the UV finish paint performance detection data in the embodiment of the invention belongs to the scope of protection required by the application.
According to a third embodiment of the present invention, there is provided a computer-readable storage medium comprising computer program instructions which, when run on a computer, cause the computer to perform the UV topcoat performance detection data analysis method of the first embodiment.
The computer-readable storage medium embodiments of the present invention are substantially the same as the embodiments of the method and apparatus described above, and are not described here in any way.
It should be noted that, the foregoing reference numerals of the embodiments of the present invention are merely for describing the embodiments, and do not represent the advantages and disadvantages of the embodiments. And the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, apparatus, article or method that comprises the element.
In summary, by means of the above technical scheme, the image sample data of the UV finish paint is obtained through the image acquisition equipment and processed, so that the finish paint information can be accurately obtained; finish paint category judgment can be carried out automatically on the processed UV finish paint image data based on the UV finish paint category judgment algorithm, which reduces human error and improves the accuracy and consistency of the judgment; the component data of the UV finish paint is obtained, and by analyzing the association relation between the finish paint category and the component data, the component data can be adjusted and optimized, so that the production process is optimized, product quality is improved and production cost is reduced; and the association result helps enterprises better understand the relation between finish paint categories and components, provides data support for production decisions, and helps improve the competitiveness of enterprises. By preparing UV finish paint detection samples with different components and performances and acquiring images with the image acquisition equipment according to a standard operating procedure, the acquisition quality and consistency of the images are ensured; preprocessing by image denoising, image enhancement and color space conversion improves image data quality and analysis accuracy; outlier detection and deletion with an isolation forest algorithm handles outliers effectively and improves data reliability; and data arrangement and generalization improve data usability, providing high-quality data support for subsequent UV finish paint performance detection and data analysis. By converting the UV finish paint image data into feature vectors, information useful for classification can be extracted from the images while unnecessary information is removed, improving classification accuracy; the Mahalanobis distance takes the distribution and correlation of the data into account, and different features can be given different importance through the weight setting, further improving classification accuracy; the classification boundary can easily be adjusted by changing the center and radius of the hypersphere; and effective classification and identification provide important reference information in links such as production and quality inspection, improving working efficiency and product quality. Finally, correlation judgment based on the obtained correlation coefficients makes it possible to determine which finish paint categories are significantly associated with specific component data, and the relationship between finish paint categories and component data is determined from the correlation judgment result to obtain the association result, which is of great value for optimizing product performance and improving production efficiency.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an erasable programmable Read-Only Memory ((Erasable Programmable Read Only Memory, EPROM) or flash Memory), an optical fiber, a portable compact disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The foregoing description of the embodiments has been provided to illustrate the general principles of the invention and is not intended to limit the invention to the particular embodiments or to limit the scope of the invention; any modifications, equivalents, improvements and the like that fall within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (7)

  1. The analysis method for the UV finishing coat performance detection data is characterized by comprising the following steps of:
    s1, acquiring and processing image sample data of UV finishing paint through image acquisition equipment;
    s2, performing finish paint type judgment on the processed image sample data of the UV finish paint based on a UV finish paint type judgment algorithm;
    s3, acquiring component data of the UV finishing paint;
    s4, analyzing the association relation between the finishing paint category and the component data to obtain an association result;
    s5, adjusting and optimizing the component data of the UV finishing paint according to the obtained association result;
    the method for judging the types of the finish paint based on the UV finish paint type judgment algorithm comprises the following steps of:
    S21, collecting and processing UV finish paint image training data containing different categories;
    s22, extracting feature vectors of the processed UV finishing coat image training data, and taking the extracted feature vectors as sample training data;
    s23, calculating the hypersphere center of each category in the sample training data through a UV finishing paint category judgment algorithm;
    s24, determining the radius of each hypersphere by calculating a weighted Mahalanobis distance;
    s25, constructing a hypersphere manifold of each category based on the hypersphere center of each category and the radius of each hypersphere;
    s26, inputting the image sample data into the hypersphere manifold, finding the nearest manifold center of the hypersphere manifold, and judging the type of the image sample data according to the nearest manifold center;
    said determining the radius of each hypersphere by calculating the weighted Mahalanobis distance comprises the steps of:
    s241, carrying out data standardization processing on each feature in the sample training data by a maximum and minimum value method;
    s242, calculating an information entropy value and a feature weight of each feature in the sample training data;
    s243, calculating covariance matrixes of each category according to the sample training data;
    s244, calculating a weighted Mahalanobis distance from each sample training data to the class center based on the covariance matrix and the feature matrix;
    S245, taking the maximum value of the weighted Mahalanobis distances of all sample training data as the radius of the hypersphere;
    the analysis of the association relation between the finishing paint category and the component data to obtain an association result comprises the following steps:
    s41, calculating a correlation coefficient between the finishing paint category and the component data through the Pearson correlation coefficient;
    s42, carrying out correlation judgment according to the obtained correlation coefficient;
    the value of the correlation coefficient is between-1 and 1, and the specific judgment standard is as follows:
    if the correlation coefficient is close to 1, positive correlation is indicated, namely, a strong positive correlation exists between the finishing paint category and the component data;
    if the correlation coefficient is close to-1, negative correlation is indicated, namely, a strong negative correlation exists between the finishing paint category and the component data;
    if the correlation coefficient is close to 0, no linear correlation is shown, namely no obvious linear relation exists between the finishing paint category and the component data;
    s43, determining the relation between the finishing paint category and the component data according to the correlation judgment result to obtain a correlation result.
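For illustration only, the category judgment of claim 1 (steps S21-S26 together with S241-S245) can be sketched in Python. This is not the patented implementation: it assumes numeric feature vectors, min-max standardization, entropy-method feature weights, one full covariance matrix per category, and a hypersphere radius equal to the largest weighted Mahalanobis distance of that category's training points from its center. All function names are illustrative.

```python
import numpy as np

def entropy_weights(X, eps=1e-12):
    """S241-S242 (sketch): min-max standardization, then entropy-method feature weights."""
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + eps)
    P = Xn / (Xn.sum(axis=0) + eps)                           # column-wise proportions
    H = -(P * np.log(P + eps)).sum(axis=0) / np.log(len(X))   # information entropy per feature
    d = 1.0 - H                                               # divergence of each feature
    return d / (d.sum() + eps)                                # normalized feature weights

def fit_hyperspheres(X, y):
    """S23-S25 (sketch): per-category center, inverse covariance and hypersphere radius."""
    Omega = np.diag(entropy_weights(X))                       # feature weight matrix
    spheres = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)                                  # S23: hypersphere center
        Sinv = np.linalg.pinv(np.cov(Xc, rowvar=False))       # S243: covariance matrix (pseudo-inverse)
        diffs = Xc - mu
        q = np.einsum("ij,jk,ik->i", diffs @ Omega, Sinv, diffs)
        wd = np.sqrt(np.clip(q, 0.0, None))                   # S244: weighted Mahalanobis distances
        spheres[c] = (mu, Sinv, wd.max())                     # S245: radius = maximum distance
    return spheres, Omega

def classify(x, spheres, Omega):
    """S26 (sketch): assign a sample to the category with the nearest hypersphere center."""
    def wdist(c):
        mu, Sinv, _radius = spheres[c]
        d = x - mu
        return np.sqrt(max(float(d @ Omega @ Sinv @ d), 0.0))
    return min(spheres, key=wdist)
```

The stored radius is not needed for the nearest-center decision of S26, but it delimits the hypersphere manifold and could additionally be used to flag samples that fall outside every category.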
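Steps S41-S43 reduce to a Pearson correlation between the topcoat category and each component column. The sketch below assumes the category has been label-encoded as integers and uses an illustrative 0.7 cut-off for a "strong" correlation; neither choice is fixed by the claim, and the function name is hypothetical.

```python
import numpy as np

def correlate_category_with_components(categories, components, names, strong=0.7):
    """S41-S43 (sketch): Pearson correlation of a label-encoded category with each component."""
    results = {}
    for j, name in enumerate(names):
        r = np.corrcoef(categories, components[:, j])[0, 1]   # S41: Pearson correlation coefficient
        if r >= strong:
            verdict = "strong positive correlation"           # close to 1
        elif r <= -strong:
            verdict = "strong negative correlation"           # close to -1
        else:
            verdict = "no obvious linear relation"            # close to 0
        results[name] = (float(r), verdict)                   # S42-S43: judgment and association result
    return results
```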
  2. The method of analyzing UV topcoat performance detection data according to claim 1, wherein the acquiring and processing of image sample data of the UV topcoat through the image acquisition device comprises the steps of:
    S11, preparing UV topcoat detection samples with different components and performances, and performing image acquisition on the UV topcoat through the image acquisition device according to a standard operating procedure to obtain image sample data;
    S12, preprocessing the obtained image sample data, the preprocessing comprising image denoising, image enhancement and color space conversion;
    S13, performing outlier detection on the preprocessed image sample data, and deleting the detected outlier data;
    S14, performing data arrangement and generalization on the remaining image sample data.
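Step S12 names denoising, enhancement and color space conversion without fixing the operators. A minimal sketch, assuming OpenCV is available and choosing non-local-means denoising, CLAHE enhancement of the lightness channel and a final HSV conversion purely for illustration:

```python
import cv2

def preprocess_sample(bgr_image):
    """S12 (sketch): denoise, enhance and convert the color space of one image sample."""
    # Image denoising (non-local means on the color image)
    denoised = cv2.fastNlMeansDenoisingColored(bgr_image, None, 10, 10, 7, 21)

    # Image enhancement: CLAHE applied to the lightness channel in LAB space
    l, a, b = cv2.split(cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB))
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

    # Color space conversion for downstream feature extraction
    return cv2.cvtColor(enhanced, cv2.COLOR_BGR2HSV)
```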
  3. The method of analyzing UV topcoat performance detection data according to claim 2, wherein the performing outlier detection on the preprocessed image sample data and deleting the detected outlier data comprises the steps of:
    S131, randomly selecting image sample data from the preprocessed image sample data, and creating isolation trees by randomly selecting segmentation attributes;
    S132, calculating the path length of each image sample data point in the image sample data in each isolation tree;
    S133, normalizing the obtained path lengths, and calculating an outlier score for each image sample data point;
    S134, comparing the outlier score of each image sample data point with a preset threshold value, and marking image sample data points whose outlier scores are greater than the preset threshold value as outliers;
    S135, deleting the image sample data points marked as outliers from the image sample data.
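Steps S131-S135 follow the isolation-forest scheme. A compact sketch using scikit-learn's IsolationForest, assuming each image sample has already been reduced to a feature vector; the tree count and the 0.6 score threshold are illustrative values, not taken from the claims:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

def remove_outliers(features, threshold=0.6, n_trees=100, seed=0):
    """S131-S135 (sketch): drop samples whose isolation-forest outlier score exceeds a threshold."""
    forest = IsolationForest(n_estimators=n_trees, random_state=seed).fit(features)  # S131-S132
    scores = -forest.score_samples(features)  # S133: normalized anomaly score, higher = more isolated
    outliers = scores > threshold             # S134: compare with the preset threshold
    return features[~outliers], np.where(outliers)[0]  # S135: keep inliers, report outlier indices
```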
  4. The method of analyzing UV topcoat performance detection data according to claim 3, wherein the calculating of the path length of each image sample data point in the image sample data in each isolation tree comprises the steps of:
    S1321, initializing the path length of each image sample data point to zero;
    S1322, starting from the root node of each isolation tree, finding the child node to which the image sample data point belongs according to the attribute value of the image sample data point, and adding one to the path length;
    S1323, setting the current node to the found child node, and repeating steps S1322 to S1323 until a leaf node is reached;
    S1324, for each image sample data point, calculating the average of its path lengths over all isolation trees, and taking this average as the path length of the image sample data point in the isolation trees.
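Steps S1321-S1324 are a plain root-to-leaf traversal per tree followed by averaging. A minimal sketch with a hypothetical node structure (tree construction, step S131, is not shown):

```python
import numpy as np

class IsoNode:
    """One node of an isolation tree: a leaf, or a split on one attribute (hypothetical structure)."""
    def __init__(self, attr=None, value=None, left=None, right=None):
        self.attr, self.value, self.left, self.right = attr, value, left, right

    def is_leaf(self):
        return self.left is None and self.right is None

def path_length(point, root):
    """S1321-S1323 (sketch): walk from the root to a leaf, adding one per traversed edge."""
    length, node = 0, root                   # S1321: initialize the path length to zero
    while not node.is_leaf():
        node = node.left if point[node.attr] < node.value else node.right  # S1322: follow the split
        length += 1                          # add one to the path length
    return length                            # S1323: stop once a leaf node is reached

def average_path_length(point, trees):
    """S1324 (sketch): average the point's path length over all isolation trees."""
    return float(np.mean([path_length(point, t) for t in trees]))
```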
  5. The method of analyzing UV topcoat performance detection data according to claim 1, wherein the weighted Mahalanobis distance from each sample training data point to the category center, calculated based on the covariance matrix and the feature weight matrix, is:

    wd = \sqrt{\left(x_i - x_j\right)^{\mathrm{T}}\,\Omega\,\Sigma^{-1}\left(x_i - x_j\right)}

    where:
    wd represents the weighted Mahalanobis distance from each sample training data point to the category center;
    f represents the total number of features of the sample training data;
    w_t represents the weight of feature t;
    σ_t represents the standard deviation of feature t;
    x_it represents the value of feature t of sample training data x_i;
    x_jt represents the value of feature t of sample training data x_j;
    Ω represents the feature weight matrix;
    Σ^{-1} represents the inverse of the covariance matrix.
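A direct transcription of the weighted Mahalanobis distance of claim 5 for a single pair of feature vectors; the helper name and the numbers in the usage comment are illustrative:

```python
import numpy as np

def weighted_mahalanobis(x_i, x_j, weights, cov):
    """wd = sqrt((x_i - x_j)^T * Omega * Sigma^-1 * (x_i - x_j)) with Omega = diag(weights)."""
    d = np.asarray(x_i, float) - np.asarray(x_j, float)
    Omega = np.diag(weights)                        # feature weight matrix
    Sinv = np.linalg.pinv(np.asarray(cov, float))   # inverse of the covariance matrix
    return float(np.sqrt(max(d @ Omega @ Sinv @ d, 0.0)))  # guard against round-off

# Illustrative usage with made-up numbers:
# wd = weighted_mahalanobis([0.8, 0.3], [0.5, 0.1], [0.6, 0.4], [[0.04, 0.0], [0.0, 0.09]])
```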
  6. A UV topcoat performance detection data analysis device for implementing the UV topcoat performance detection data analysis method of any one of claims 1 to 5, characterized in that the UV topcoat performance detection data analysis device comprises: a data acquisition and processing module, a topcoat category judging module, a component data acquisition module, an association relation analysis module and an attribute adjustment and optimization module;
    the data acquisition and processing module is used for acquiring and processing image sample data of the UV topcoat through an image acquisition device;
    the topcoat category judging module is used for performing topcoat category judgment on the processed image sample data of the UV topcoat based on a UV topcoat category judgment algorithm;
    the component data acquisition module is used for acquiring the component data of the UV topcoat;
    the association relation analysis module is used for analyzing the association relation between the topcoat category and the component data to obtain an association result;
    the attribute adjustment and optimization module is used for adjusting and optimizing the component data of the UV topcoat according to the obtained association result.
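The module split of claim 6 maps onto a small orchestrating class. The sketch below is illustrative wiring only; the callables it accepts are hypothetical stand-ins for the claimed modules (for example, the classification, correlation and outlier-removal helpers sketched after claims 1-3):

```python
class UVTopcoatAnalysisDevice:
    """Illustrative wiring of the five modules named in claim 6 (all callables are hypothetical)."""

    def __init__(self, acquire_fn, classify_fn, components_fn, correlate_fn, optimize_fn):
        self.acquire = acquire_fn        # data acquisition and processing module
        self.classify = classify_fn      # topcoat category judging module
        self.components = components_fn  # component data acquisition module
        self.correlate = correlate_fn    # association relation analysis module
        self.optimize = optimize_fn      # attribute adjustment and optimization module

    def run(self, batch_id):
        images = self.acquire(batch_id)                      # S1: acquire and process image samples
        category = self.classify(images)                     # S2: judge the topcoat category
        components = self.components(batch_id)               # S3: acquire component data
        association = self.correlate(category, components)   # S4: analyze the association relation
        return self.optimize(components, association)        # S5: adjust and optimize component data
```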
  7. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises computer program instructions which, when run on a computer, cause the computer to perform the UV topcoat performance detection data analysis method according to any one of claims 1 to 5.
CN202311626281.7A 2023-11-30 2023-11-30 UV finish paint performance detection data analysis method, device and storage medium Active CN117333486B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311626281.7A CN117333486B (en) 2023-11-30 2023-11-30 UV finish paint performance detection data analysis method, device and storage medium

Publications (2)

Publication Number Publication Date
CN117333486A (en) 2024-01-02
CN117333486B (en) 2024-03-22

Family

ID=89279603

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597968A (en) * 2018-12-29 2019-04-09 西安电子科技大学 Paste solder printing Performance Influence Factor analysis method based on SMT big data
CN111080088A (en) * 2019-11-28 2020-04-28 东莞市三姆森光电科技有限公司 Method for quickly judging product quality based on clustered hypersphere model
CN111813618A (en) * 2020-05-28 2020-10-23 平安科技(深圳)有限公司 Data anomaly detection method, device, equipment and storage medium
CN115393645A (en) * 2022-08-27 2022-11-25 宁波华东核工业工程勘察院 Automatic soil classification and naming method and system, storage medium and intelligent terminal
CN115510043A (en) * 2022-09-29 2022-12-23 甘肃新泉风力发电有限公司 Wind power curve abnormal data eliminating method
WO2023000653A1 (en) * 2021-07-19 2023-01-26 湖南大学 Method for implementing hyperspectral medical component analysis by using graph convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant