CN114742834B - Method for judging abrasion of machining cutter of complex structural part - Google Patents

Method for judging abrasion of machining cutter of complex structural part

Info

Publication number
CN114742834B
CN114742834B (application CN202210659535.4A)
Authority
CN
China
Prior art keywords
image
cutter
historical
tool
reconstructed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210659535.4A
Other languages
Chinese (zh)
Other versions
CN114742834A (en)
Inventor
杨之乐
朱俊丞
刘祥飞
余发国
谭勇
魏国君
李政
马庆丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongke Hangmai CNC Software Shenzhen Co Ltd
Original Assignee
Zhongke Hangmai CNC Software Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongke Hangmai CNC Software Shenzhen Co Ltd filed Critical Zhongke Hangmai CNC Software Shenzhen Co Ltd
Priority to CN202210659535.4A priority Critical patent/CN114742834B/en
Publication of CN114742834A publication Critical patent/CN114742834A/en
Application granted granted Critical
Publication of CN114742834B publication Critical patent/CN114742834B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/09Arrangements for observing, indicating or measuring on machine tools for indicating or measuring cutting pressure or for determining cutting-tool condition, e.g. cutting ability, load on tool
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Mechanical Engineering (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

The invention discloses a method for judging wear of a machining cutter of a complex structural member. The method generates the training data of the target model through data reconstruction, which reduces the time spent acquiring and labeling training data. This solves the problem that prior-art methods for predicting the cutter wear grade with a model require a large amount of training data to be collected in advance to complete model training, which entails considerable labor cost for collecting and labeling data before training.

Description

Method for judging abrasion of machining cutter of complex structural part
Technical Field
The invention relates to the field of signal processing, in particular to a method for judging abrasion of a machining cutter of a complex structural part.
Background
When machining complex structural parts, the tool is especially prone to wear, and tool wear generally progresses through three stages: initial wear, normal wear, and severe wear. To guarantee machining precision, the tool wear level must be monitored in real time, and severely worn tools must be replaced promptly. In the prior art, methods that predict the tool wear level with a model usually require a large amount of training data to be collected in advance before model training can be completed, so considerable labor cost is spent collecting and labeling data before training.
Thus, there is still a need for improvement and development of the prior art.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a method for judging wear of a machining cutter of a complex structural member, aiming to solve the problem that prior-art model-based methods for predicting the cutter wear grade usually require a large amount of training data to be collected in advance to complete model training, so considerable labor cost is spent collecting and labeling data before training.
The technical scheme adopted by the invention for solving the problems is as follows:
in a first aspect, an embodiment of the present invention provides a method for determining wear of a machining tool for a complex structural member, where the method includes:
acquiring a cutter image corresponding to a target cutter, and inputting the cutter image into a pre-trained target model;
acquiring an actual cutter wear grade output by the target model based on the cutter image;
the training process of the target model comprises the following steps:
acquiring a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, wherein each reconstructed tool image set comprises a plurality of reconstructed tool images, and each reconstructed tool image is generated based on a preset number of historical tool images in the historical tool image set corresponding to the reconstructed tool image set;
generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
and performing model training on the target model according to the training tool image set to obtain the trained target model.
In one embodiment, the acquiring a plurality of sets of historical tool images includes:
acquiring a plurality of historical cutter images and cutter current signals corresponding to the historical cutter images respectively, wherein the image acquisition time of each historical cutter image is the same as the acquisition time of the cutter current signal corresponding to the historical cutter image;
determining cutter wear grades corresponding to a plurality of historical cutter images according to cutter current signals corresponding to the plurality of historical cutter images respectively;
and determining a plurality of historical tool image sets according to tool wear grades respectively corresponding to the plurality of historical tool images, wherein the tool wear grades respectively corresponding to the historical tool images in each historical tool image set are the same.
In one embodiment, the determining, according to the tool current signals corresponding to the plurality of historical tool images, tool wear levels corresponding to the plurality of historical tool images includes:
determining signal waveform data corresponding to a plurality of historical cutter images according to cutter current signals corresponding to the plurality of historical cutter images respectively;
and matching signal waveform data respectively corresponding to a plurality of historical cutter images with a preset signal waveform database to obtain cutter wear grades respectively corresponding to the plurality of historical cutter images, wherein the signal waveform database comprises a plurality of standard signal waveform data, and the plurality of standard signal waveform data respectively correspond to different cutter wear grades.
In one embodiment, the generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets according to the plurality of historical tool image sets includes:
determining a plurality of reconstruction combinations corresponding to each historical tool image set according to each historical tool image set, wherein the plurality of reconstruction combinations comprise all image combinations determined based on the preset number;
acquiring reconstructed cutter images respectively corresponding to a plurality of reconstructed combinations, wherein each reconstructed cutter image comprises a foreground image and a background image, the foreground image is determined based on the foreground image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image, and the background image is determined based on the background image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image;
and generating a reconstructed tool image set corresponding to the historical tool image set according to the reconstructed tool images respectively corresponding to the plurality of reconstructed combinations.
In one embodiment, the obtaining of the reconstructed tool images corresponding to a plurality of the reconstruction combinations respectively includes:
acquiring a first fusion weight and a second fusion weight which respectively correspond to each historical tool image in each reconstruction combination;
according to the first fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the foreground images of the historical cutter images in the reconstruction combination to obtain a fusion foreground image corresponding to the reconstruction combination;
according to the second fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the background image of each historical cutter image in the reconstruction combination to obtain a fusion background image corresponding to the reconstruction combination;
and determining the reconstructed cutter image corresponding to the reconstructed combination according to the fused foreground image and the fused background image corresponding to the reconstructed combination.
In an embodiment, the obtaining a first fusion weight and a second fusion weight corresponding to each historical tool image in each of the reconstruction combinations includes:
acquiring image proportions corresponding to the historical cutter images in each reconstruction combination, wherein the image proportions are used for reflecting the area ratio of foreground images in the historical cutter images;
and determining a first fusion weight and a second fusion weight which respectively correspond to each historical tool image in the reconstruction combination according to the image proportion, wherein the first fusion weight is in a direct proportion relation with the image proportion, and the second fusion weight is in an inverse proportion relation with the image proportion.
In one embodiment, the target model is a width learning model.
In a second aspect, an embodiment of the present invention further provides a device for determining wear of a machining tool for a complex structural member, where the device includes:
the image acquisition module is used for acquiring a cutter image corresponding to a target cutter and inputting the cutter image into a pre-trained target model;
the state judgment module is used for acquiring the actual cutter abrasion grade output by the target model based on the cutter image;
the training process of the target model comprises the following steps:
acquiring a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
generating a plurality of reconstructed cutter image sets corresponding to the historical cutter image sets respectively according to the plurality of historical cutter image sets, wherein each reconstructed cutter image set comprises a plurality of reconstructed cutter images, and each reconstructed cutter image is generated based on a preset number of historical cutter images in the historical cutter image set corresponding to the reconstructed cutter image set;
generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
and performing model training on a target model according to the training tool image set to obtain the trained target model.
In a third aspect, an embodiment of the present invention further provides a terminal, where the terminal includes a memory and one or more processors; the memory stores one or more programs; the program comprises instructions for executing the method for judging the abrasion of the machining cutter of the complex structural part; the processor is configured to execute the program.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a plurality of instructions are stored, where the instructions are adapted to be loaded and executed by a processor to implement the steps of any one of the above methods for determining wear of a machining tool for a complex structural component.
The invention has the following beneficial effects: the embodiment of the invention generates the training data of the target model through data reconstruction, which reduces the time spent acquiring and labeling training data. This solves the problem that prior-art model-based methods for predicting the cutter wear grade require a large amount of training data to be collected in advance to complete model training, which entails considerable labor cost for collecting and labeling data before training.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a method for determining wear of a machining tool for a complex structural member according to an embodiment of the present invention.
Fig. 2 is a schematic flowchart of a training process of a target model according to an embodiment of the present invention.
Fig. 3 is a schematic block diagram of a device for determining wear of a tool for machining a complex structural member according to an embodiment of the present invention.
Fig. 4 is a schematic block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The invention discloses a method for judging wear of a machining cutter of a complex structural part. To make the purpose, technical solution, and effect of the invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
When machining complex structural parts, the tool is especially prone to wear, and tool wear generally progresses through three stages: initial wear, normal wear, and severe wear. To guarantee machining precision, the tool wear level must be monitored in real time, and severely worn tools must be replaced promptly. In the prior art, methods that predict the tool wear level with a model usually require a large amount of training data to be collected in advance before model training can be completed, so considerable labor cost is spent collecting and labeling data before training.
To address the above shortcomings of the prior art, the invention provides a method for judging wear of a machining cutter of a complex structural part. The method comprises: acquiring a cutter image corresponding to a target cutter, and inputting the cutter image into a pre-trained target model; and acquiring an actual cutter wear grade output by the target model based on the cutter image. The training process of the target model comprises: acquiring a plurality of historical cutter image sets, wherein the historical cutter image sets respectively correspond to different cutter wear levels; generating, from the historical cutter image sets, a plurality of reconstructed cutter image sets respectively corresponding to them, wherein each reconstructed cutter image set comprises a plurality of reconstructed cutter images and each reconstructed cutter image is generated based on a preset number of historical cutter images in the historical cutter image set corresponding to that reconstructed cutter image set; generating a training cutter image set from the historical cutter image sets and their respectively corresponding reconstructed cutter image sets; and performing model training on the target model with the training cutter image set to obtain the trained target model. Because the method generates the training data of the target model through data reconstruction, it reduces the time spent acquiring and labeling training data, and thereby solves the problem that prior-art model-based methods for predicting the cutter wear grade require a large amount of training data to be collected in advance, which entails considerable labor cost for collecting and labeling data before training.
As shown in fig. 1, the method comprises the steps of:
and S100, acquiring a cutter image corresponding to a target cutter, and inputting the cutter image into a pre-trained target model.
Specifically, the target tool in this embodiment may be any tool that is used to machine complex structural parts and whose wear level needs to be monitored. The target model of this embodiment has been trained in advance and has learned the external shape and structural features of the target tool at different wear levels; therefore, this embodiment inputs the current tool image of the target tool into the target model and determines the current tool wear level of the target tool through the target model.
As shown in fig. 1, the method further comprises the steps of:
and S200, acquiring an actual tool wear grade output by the target model based on the tool image.
Specifically, after the target model receives the current tool image of the target tool, it analyzes the tool image based on the previously learned external shape and structural features of the target tool at different wear levels, thereby classifying the image, and outputs the actual tool wear level corresponding to the tool image according to the classification result. A minimal inference sketch is given below.
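For illustration only, the following Python sketch shows how a single tool image could be fed to a pre-trained classifier. The WEAR_LEVELS label order, the preprocessing, and the predict interface are assumptions of this example, not part of the patent.

```python
import cv2          # assumed dependency for reading the tool image
import numpy as np

WEAR_LEVELS = ["initial wear", "normal wear", "severe wear"]  # assumed label order

def predict_wear_level(model, image_path: str) -> str:
    """Classify one tool image with a pre-trained target model.

    `model` is any classifier exposing predict(features); the resizing and
    flattening below are placeholder preprocessing for this sketch.
    """
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    image = cv2.resize(image, (128, 128)).astype(np.float32) / 255.0
    features = image.reshape(1, -1)              # one flattened feature vector
    class_index = int(model.predict(features)[0])
    return WEAR_LEVELS[class_index]
```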
In one implementation, as shown in fig. 2, the training process of the target model includes:
s10, obtaining a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
step S20, generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, wherein each reconstructed tool image set comprises a plurality of reconstructed tool images, and each reconstructed tool image is generated based on a preset number of historical tool images in the historical tool image set corresponding to the reconstructed tool image set;
step S30, generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
and S40, performing model training on the target model according to the training tool image set to obtain the trained target model.
To train the target model, this embodiment first needs to generate its training data. Specifically, historical tool images of tools of the same type as the target tool are first acquired at different tool wear levels, yielding a plurality of historical tool image sets, where each historical tool image set corresponds to a different tool wear level. To obtain a sufficient amount of training data, for each historical tool image set this embodiment reconstructs the images in the set to obtain a corresponding reconstructed tool image set, thereby increasing the number of training images for the tool wear level of that set. It can be understood that, for each historical tool image set, every image in the set and in its corresponding reconstructed tool image set carries the same label, determined by the tool wear level corresponding to that historical tool image set. Finally, a training tool image set is generated from all historical tool image sets and all reconstructed tool image sets, and model training of the target model is completed with this training tool image set. This embodiment thus augments the training data automatically through data reconstruction; the reconstructed data needs no manual labeling, which saves a large amount of data acquisition and labeling time. A sketch of this pipeline is given below.
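The following Python sketch outlines the training-data pipeline under stated assumptions: label_fn stands for the automatic current-signal labeling of step S10 and reconstruct_fn for the image reconstruction of step S20 (both sketched under those steps below); neither is defined by the patent in this form.

```python
from typing import Callable, Dict, List, Sequence
import numpy as np

def build_training_set(
    images: Sequence[np.ndarray],
    signals: Sequence[np.ndarray],
    label_fn: Callable[[np.ndarray], int],
    reconstruct_fn: Callable[[List[np.ndarray]], List[np.ndarray]],
) -> Dict[int, List[np.ndarray]]:
    """Group historical tool images by wear level, then augment each group."""
    image_sets: Dict[int, List[np.ndarray]] = {}
    for image, signal in zip(images, signals):
        level = label_fn(signal)                    # automatic labeling from the current signal
        image_sets.setdefault(level, []).append(image)

    training_set: Dict[int, List[np.ndarray]] = {}
    for level, group in image_sets.items():
        reconstructed = reconstruct_fn(group)       # data reconstruction, no manual labels needed
        training_set[level] = group + reconstructed # originals plus reconstructed images
    return training_set
```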
In one implementation, the step S10 specifically includes the following steps:
s101, obtaining a plurality of historical cutter images and cutter current signals corresponding to the historical cutter images respectively, wherein the image acquisition time of each historical cutter image is the same as the acquisition time of the cutter current signal corresponding to the historical cutter image;
s102, determining cutter wear grades respectively corresponding to a plurality of historical cutter images according to cutter current signals respectively corresponding to the plurality of historical cutter images;
step S103, determining a plurality of historical tool image sets according to tool wear levels respectively corresponding to the plurality of historical tool images, wherein the tool wear levels respectively corresponding to the historical tool images in each historical tool image set are the same.
Because manually labeling data requires a large amount of labor and time, this embodiment provides an automatic data labeling method. Specifically, a plurality of historical tool images and the tool current signals respectively corresponding to them are acquired from the historical machining records of the CNC machine tool and the recordings of the camera device that monitors the tool. Because the current signals generated by the target tool have different signal characteristics at different wear levels, the wear level of the target tool in each historical tool image can be judged automatically from the tool current signal of that image. Finally, all historical tool images are classified by tool wear level, and historical tool images with the same tool wear level are placed in one set, yielding a plurality of historical tool image sets.
In one implementation, the step S102 specifically includes the following steps:
step S1021, determining signal waveform data corresponding to a plurality of historical tool images according to tool current signals corresponding to the plurality of historical tool images;
step S1022, matching the signal waveform data respectively corresponding to the plurality of historical tool images with a preset signal waveform database to obtain tool wear levels respectively corresponding to the plurality of historical tool images, where the signal waveform database includes a plurality of standard signal waveform data, and the plurality of standard signal waveform data respectively correspond to different tool wear levels.
Specifically, for each historical tool image, the waveform characteristics of the tool current signal corresponding to that image are extracted to obtain its signal waveform data. In this embodiment, the waveform characteristics of the current signals generated by the target tool at different wear levels are obtained in advance, and a signal waveform database is built from them; the database contains a plurality of standard signal waveform data, and each standard signal waveform data carries label data reflecting the tool wear level it corresponds to. For each historical tool image, its signal waveform data is compared against the signal waveform database, and the tool wear level of the standard waveform data with the highest similarity is taken as the tool wear level of that historical tool image. This completes the labeling of every historical tool image. A sketch of such matching follows.
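A minimal matching sketch, assuming normalized cross-correlation as the similarity measure (the patent does not specify a metric) and a dictionary mapping each wear level to one standard waveform:

```python
from typing import Dict
import numpy as np

def _normalize(x: np.ndarray) -> np.ndarray:
    x = x.astype(np.float64) - x.mean()
    return x / (np.linalg.norm(x) + 1e-12)

def label_by_current_signal(signal: np.ndarray, waveform_db: Dict[int, np.ndarray]) -> int:
    """Return the wear level whose standard waveform is most similar to `signal`."""
    best_level, best_score = -1, -np.inf
    for level, standard in waveform_db.items():
        n = min(len(signal), len(standard))          # align lengths for the comparison
        score = float(np.dot(_normalize(signal[:n]), _normalize(standard[:n])))
        if score > best_score:
            best_level, best_score = level, score
    return best_level
```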
In one implementation, the step S20 specifically includes the following steps:
step S201, determining a plurality of reconstruction combinations corresponding to each historical tool image set according to each historical tool image set, wherein the plurality of reconstruction combinations comprise all image combinations determined based on the preset number;
step S202, acquiring reconstructed cutter images respectively corresponding to a plurality of reconstructed combinations, wherein each reconstructed cutter image comprises a foreground image and a background image, the foreground image is determined based on the foreground image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image, and the background image is determined based on the background image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image;
and S203, generating a reconstructed tool image set corresponding to the historical tool image set according to the reconstructed tool images respectively corresponding to the plurality of reconstructed combinations.
Specifically, assuming the preset number is 2, for each historical tool image set all image combinations that can be formed from two historical tool images in that set are obtained, and each image combination is one reconstruction combination. For each reconstruction combination, the foreground images and background images of the two historical tool images in the combination are fused with each other to obtain the reconstructed tool image corresponding to that combination. Finally, the reconstructed tool image set corresponding to the historical tool image set is obtained from the reconstructed tool images of all its reconstruction combinations. It can be understood that the total number of reconstructed tool images corresponding to a historical tool image set equals the number of its reconstruction combinations, so different numbers of reconstructed tool images can be obtained by adjusting the preset number. A sketch of enumerating the reconstruction combinations is given below.
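A minimal sketch of enumerating the reconstruction combinations, assuming the fusion itself is delegated to a fuse_fn callable (one possible form of which is sketched under step S2024 below):

```python
from itertools import combinations
from typing import Callable, List, Sequence
import numpy as np

def reconstruct_image_set(
    images: Sequence[np.ndarray],
    preset_number: int,
    fuse_fn: Callable[[List[np.ndarray]], np.ndarray],
) -> List[np.ndarray]:
    """Fuse every combination of `preset_number` images from one wear-level set."""
    reconstructed: List[np.ndarray] = []
    for combo in combinations(images, preset_number):   # all reconstruction combinations
        reconstructed.append(fuse_fn(list(combo)))       # one reconstructed image per combination
    return reconstructed
```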
In an implementation manner, the step S202 specifically includes the following steps:
step S2021, acquiring a first fusion weight and a second fusion weight corresponding to each historical tool image in each reconstruction combination;
step S2022, performing weighted fusion on the foreground images of the historical tool images in the reconstruction combination according to the first fusion weights corresponding to the historical tool images in the reconstruction combination respectively to obtain fusion foreground images corresponding to the reconstruction combination;
step S2023, performing weighted fusion on the background images of the historical tool images in the reconstruction combination according to the second fusion weights corresponding to the historical tool images in the reconstruction combination respectively to obtain fusion background images corresponding to the reconstruction combination;
step S2024, determining the reconstructed tool image corresponding to the reconstruction combination according to the fusion foreground image and the fusion background image corresponding to the reconstruction combination.
Specifically, for each reconstruction combination, each historical tool image in the combination carries two weights: a first fusion weight indicating the importance of its foreground image, and a second fusion weight indicating the importance of its background image. When the foreground images of the historical tool images in the reconstruction combination are fused, weighted fusion is performed with the first fusion weights of the respective images to obtain the fused foreground image. When the background images of the historical tool images in the reconstruction combination are fused, weighted fusion is performed with the second fusion weights of the respective images to obtain the fused background image. Finally, the reconstructed tool image corresponding to the reconstruction combination is generated from the fused foreground image and the fused background image; a sketch follows.
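A minimal fusion sketch, assuming each historical image has already been separated into a foreground image, a background image, and a binary foreground mask (the separation method and the final composition rule are not fixed by the patent and are assumptions here):

```python
from typing import Sequence
import numpy as np

def fuse_group(
    foregrounds: Sequence[np.ndarray],
    backgrounds: Sequence[np.ndarray],
    w_fg: Sequence[float],
    w_bg: Sequence[float],
    fg_mask: np.ndarray,
) -> np.ndarray:
    """Weighted fusion of foregrounds and backgrounds into one reconstructed image."""
    fused_fg = sum(w * f.astype(np.float64) for w, f in zip(w_fg, foregrounds))
    fused_bg = sum(w * b.astype(np.float64) for w, b in zip(w_bg, backgrounds))
    mask = fg_mask.astype(bool)
    if fused_fg.ndim == 3:                       # broadcast the mask over color channels
        mask = mask[..., None]
    out = np.where(mask, fused_fg, fused_bg)     # foreground pixels from the fused foreground
    return np.clip(out, 0, 255).astype(np.uint8)
```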
In one implementation, the step S2021 specifically includes the following steps:
step S20211, obtaining image proportions corresponding to the historical cutter images in each reconstruction combination, wherein the image proportions are used for reflecting the area ratio of foreground images in the historical cutter images;
step S20212, determining a first fusion weight and a second fusion weight respectively corresponding to each historical tool image in the reconstruction combination according to the image proportion, wherein the first fusion weight is in a direct proportion relation with the image proportion, and the second fusion weight is in an inverse proportion relation with the image proportion.
Specifically, for each historical tool image, this embodiment determines the corresponding first fusion weight and second fusion weight from the area ratio of the foreground image in that historical tool image. When the area ratio of the foreground image is larger, the foreground occupies more of the historical tool image than the background, so the first fusion weight of that image is higher and the second fusion weight is lower; conversely, when the area ratio of the foreground image is smaller, the foreground occupies less of the image than the background, so the first fusion weight is lower and the second fusion weight is higher. One way such weights could be computed is sketched below.
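A minimal weight-computation sketch. The patent only requires the first weight to be proportional to the foreground area ratio and the second weight to be inversely related to it; normalizing the weights across the images of one reconstruction combination is an assumption of this example.

```python
from typing import List, Sequence, Tuple
import numpy as np

def fusion_weights(fg_masks: Sequence[np.ndarray]) -> Tuple[List[float], List[float]]:
    """Return (first fusion weights, second fusion weights) for one combination."""
    ratios = [float(m.astype(bool).mean()) for m in fg_masks]  # foreground area ratio per image
    inverse = [1.0 - r for r in ratios]                        # larger ratio -> smaller second weight
    w_fg = [r / (sum(ratios) + 1e-12) for r in ratios]         # proportional to the ratio, sums to 1
    w_bg = [v / (sum(inverse) + 1e-12) for v in inverse]       # inversely related, sums to 1
    return w_fg, w_bg
```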
In one implementation, for each reconstruction combination, the image resolution levels respectively corresponding to the historical tool images in that combination are also obtained. The first fusion weight of a historical tool image is then determined from both the image resolution level and the area ratio corresponding to that image, and its second fusion weight is determined from the difference between the highest image resolution level and the first fusion weight. In short, because the image resolution of a historical tool image affects the quality of the reconstructed tool image, especially its foreground part, the image resolution of each historical tool image needs to be taken into account when setting the first fusion weight.
In one implementation, the target model is a width learning model.
Specifically, broad learning (also called width learning) is a neural-network structure that does not rely on a deep architecture. Because the broad learning model incorporates an incremental learning algorithm, the network weights can be updated at low computational cost when new nodes are added to the network, so compared with deep learning models it is better suited to application scenarios with few data features and high requirements on prediction latency. Therefore, to further reduce the amount of training data required when training the target model, this embodiment adopts the broad learning model as the target model, which mitigates the problems of insufficient training data and costly data labeling. A heavily simplified sketch of such a model is given below.
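For illustration, a heavily simplified broad learning system is sketched below: random feature nodes and enhancement nodes are generated with fixed random projections, and the output weights are solved by ridge regression. The node counts, activation, and regularization constant are illustrative choices, not values taken from the patent, and the incremental-learning update is omitted.

```python
import numpy as np

class TinyBLS:
    """Minimal broad learning system for classification with one-hot labels."""

    def __init__(self, n_feature_nodes=100, n_enhance_nodes=200, reg=1e-3, seed=0):
        self.nf, self.ne, self.reg = n_feature_nodes, n_enhance_nodes, reg
        self.rng = np.random.default_rng(seed)
        self.Wf = self.bf = self.We = self.be = self.Wout = None

    def _nodes(self, X: np.ndarray) -> np.ndarray:
        Z = np.tanh(X @ self.Wf + self.bf)           # feature nodes
        H = np.tanh(Z @ self.We + self.be)           # enhancement nodes
        return np.hstack([Z, H])

    def fit(self, X: np.ndarray, Y_onehot: np.ndarray) -> "TinyBLS":
        d = X.shape[1]
        self.Wf = self.rng.standard_normal((d, self.nf)) / np.sqrt(d)
        self.bf = self.rng.standard_normal(self.nf)
        self.We = self.rng.standard_normal((self.nf, self.ne)) / np.sqrt(self.nf)
        self.be = self.rng.standard_normal(self.ne)
        A = self._nodes(X)
        # Ridge-regression (regularized pseudoinverse) solution for the output weights.
        self.Wout = np.linalg.solve(A.T @ A + self.reg * np.eye(A.shape[1]), A.T @ Y_onehot)
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        return np.argmax(self._nodes(X) @ self.Wout, axis=1)
```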
Based on the above embodiment, the present invention further provides a device for determining wear of a machining tool for a complex structural member, as shown in fig. 3, the device includes:
the image acquisition module 01 is used for acquiring a cutter image corresponding to a target cutter and inputting the cutter image into a pre-trained target model;
the state judgment module 02 is used for acquiring the actual cutter wear grade output by the target model based on the cutter image;
the training process of the target model comprises the following steps:
acquiring a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, wherein each reconstructed tool image set comprises a plurality of reconstructed tool images, and each reconstructed tool image is generated based on a preset number of historical tool images in the historical tool image set corresponding to the reconstructed tool image set;
generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
and performing model training on a target model according to the training cutter image set to obtain the trained target model.
Based on the above embodiments, the present invention further provides a terminal, and a schematic block diagram thereof may be as shown in fig. 4. The terminal comprises a processor, a memory, a network interface and a display screen which are connected through a system bus. Wherein the processor of the terminal is configured to provide computing and control capabilities. The memory of the terminal comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the terminal is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement the method for determining wear of the machining tool for the complex structural member. The display screen of the terminal can be a liquid crystal display screen or an electronic ink display screen.
It will be understood by those skilled in the art that the block diagram of fig. 4 is a block diagram of only a portion of the structure associated with the inventive arrangements and is not intended to limit the terminals to which the inventive arrangements may be applied, and that a particular terminal may include more or less components than those shown, or may have some components combined, or may have a different arrangement of components.
In one implementation, one or more programs are stored in the memory of the terminal and are configured to be executed by one or more processors; the one or more programs include instructions for performing the method for judging wear of a machining cutter of a complex structural part.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by instructing the relevant hardware through a computer program, which can be stored in a non-volatile computer-readable storage medium and which, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
In summary, the invention discloses a method for judging wear of a machining cutter of a complex structural part. The method comprises: acquiring a cutter image corresponding to a target cutter, and inputting the cutter image into a pre-trained target model; and acquiring an actual cutter wear grade output by the target model based on the cutter image. The training process of the target model comprises: acquiring a plurality of historical cutter image sets, wherein the historical cutter image sets respectively correspond to different cutter wear levels; generating, from the historical cutter image sets, a plurality of reconstructed cutter image sets respectively corresponding to them, wherein each reconstructed cutter image set comprises a plurality of reconstructed cutter images and each reconstructed cutter image is generated based on a preset number of historical cutter images in the historical cutter image set corresponding to that reconstructed cutter image set; generating a training cutter image set from the historical cutter image sets and their respectively corresponding reconstructed cutter image sets; and performing model training on the target model with the training cutter image set to obtain the trained target model. Because the method generates the training data of the target model through data reconstruction, it reduces the time spent acquiring and labeling training data, and thereby solves the problem that prior-art model-based methods for predicting the cutter wear grade require a large amount of training data to be collected in advance, which entails considerable labor cost for collecting and labeling data before training.
It is to be understood that the invention is not limited to the examples described above, but that modifications and variations may be effected thereto by those of ordinary skill in the art in light of the foregoing description, and that all such modifications and variations are intended to be within the scope of the invention as defined by the appended claims.

Claims (7)

1. A method for judging the abrasion of a machining cutter of a complex structural part is characterized by comprising the following steps:
acquiring a cutter image corresponding to a target cutter, and inputting the cutter image into a pre-trained target model;
acquiring an actual cutter wear grade output by the target model based on the cutter image;
the training process of the target model comprises the following steps:
acquiring a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
generating a plurality of reconstructed cutter image sets corresponding to the historical cutter image sets respectively according to the plurality of historical cutter image sets, wherein each reconstructed cutter image set comprises a plurality of reconstructed cutter images, and each reconstructed cutter image is generated based on a preset number of historical cutter images in the historical cutter image set corresponding to the reconstructed cutter image set;
generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
performing model training on the target model according to the training tool image set to obtain the trained target model;
generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, including:
determining a plurality of reconstruction combinations corresponding to each historical tool image set according to each historical tool image set, wherein the plurality of reconstruction combinations comprise all image combinations determined based on the preset number;
acquiring reconstructed cutter images respectively corresponding to a plurality of reconstructed combinations, wherein each reconstructed cutter image comprises a foreground image and a background image, the foreground image is determined based on the foreground image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image, and the background image is determined based on the background image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image;
generating a reconstructed cutter image set corresponding to the historical cutter image set according to the reconstructed cutter images respectively corresponding to the plurality of reconstructed combinations;
the obtaining of the reconstructed cutter images corresponding to the plurality of reconstructed combinations respectively comprises:
acquiring a first fusion weight and a second fusion weight which respectively correspond to each historical tool image in each reconstruction combination, wherein the first fusion weight is used for reflecting the importance degree of a foreground image, and the second fusion weight is used for reflecting the importance degree of a background image;
according to the first fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the foreground images of the historical cutter images in the reconstruction combination to obtain a fusion foreground image corresponding to the reconstruction combination;
according to the second fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the background image of each historical cutter image in the reconstruction combination to obtain a fusion background image corresponding to the reconstruction combination;
determining the reconstructed cutter image corresponding to the reconstruction combination according to the fusion foreground image and the fusion background image corresponding to the reconstruction combination;
the obtaining of the first fusion weight and the second fusion weight respectively corresponding to each historical tool image in each reconstruction combination includes:
acquiring image proportions corresponding to the historical cutter images in each reconstruction combination, wherein the image proportions are used for reflecting the area ratio of foreground images in the historical cutter images;
determining a first fusion weight and a second fusion weight respectively corresponding to each historical tool image in the reconstruction combination according to the image proportion, wherein the first fusion weight is in a direct proportion relation with the image proportion, the second fusion weight is in an inverse proportion relation with the image proportion, when the area ratio of the foreground image is larger, the foreground part in the historical tool image is more than the background part, the first fusion weight corresponding to the historical tool image is higher, and the second fusion weight is lower; when the area ratio of the foreground image is smaller, the foreground part in the historical tool image is less than the background part, the first fusion weight corresponding to the historical tool image is lower, and the second fusion weight is higher;
the method further comprises the following steps: aiming at each reconstruction combination, acquiring image resolution levels corresponding to all historical cutter images in the reconstruction combination; and determining a first fusion weight corresponding to the historical tool image according to the image resolution grade and the area ratio corresponding to the historical tool image and determining a second fusion weight corresponding to the historical tool image according to the difference value between the highest grade of the image resolution grade and the first fusion weight.
2. The method for determining wear of a machining tool for a complex structural member according to claim 1, wherein the obtaining of the plurality of sets of historical tool images includes:
acquiring a plurality of historical cutter images and cutter current signals respectively corresponding to the historical cutter images, wherein the image acquisition time of each historical cutter image is the same as the acquisition time of the cutter current signal corresponding to the historical cutter image;
determining cutter wear grades respectively corresponding to a plurality of historical cutter images according to cutter current signals respectively corresponding to the plurality of historical cutter images;
and determining a plurality of historical tool image sets according to tool wear grades respectively corresponding to the plurality of historical tool images, wherein the tool wear grades respectively corresponding to the historical tool images in each historical tool image set are the same.
3. The method for determining the wear of the tool for machining the complex structural member according to claim 2, wherein the determining the wear levels of the tool corresponding to the plurality of historical tool images according to the tool current signals corresponding to the plurality of historical tool images respectively comprises:
determining signal waveform data respectively corresponding to a plurality of historical cutter images according to cutter current signals respectively corresponding to the plurality of historical cutter images;
and matching signal waveform data respectively corresponding to a plurality of historical cutter images with a preset signal waveform database to obtain cutter wear grades respectively corresponding to the plurality of historical cutter images, wherein the signal waveform database comprises a plurality of standard signal waveform data, and the plurality of standard signal waveform data respectively correspond to different cutter wear grades.
4. The method for determining wear of a machining tool for a complex structural member according to claim 1, wherein the target model is a width learning model.
5. A complicated structural member processing cutter wear judgment device is characterized in that the device comprises:
the image acquisition module is used for acquiring a cutter image corresponding to a target cutter and inputting the cutter image into a pre-trained target model;
the state judgment module is used for acquiring the actual cutter abrasion grade output by the target model based on the cutter image;
the training process of the target model comprises the following steps:
acquiring a plurality of historical cutter image sets, wherein the plurality of historical cutter image sets respectively correspond to different cutter wear levels;
generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, wherein each reconstructed tool image set comprises a plurality of reconstructed tool images, and each reconstructed tool image is generated based on a preset number of historical tool images in the historical tool image set corresponding to the reconstructed tool image set;
generating a training tool image set according to a plurality of historical tool image sets and a plurality of reconstructed tool image sets corresponding to the historical tool image sets respectively;
performing model training on a target model according to the training tool image set to obtain the trained target model;
generating a plurality of reconstructed tool image sets corresponding to the plurality of historical tool image sets respectively according to the plurality of historical tool image sets, including:
determining a plurality of reconstruction combinations corresponding to each historical tool image set according to each historical tool image set, wherein the plurality of reconstruction combinations comprise all image combinations determined based on the preset number;
acquiring reconstructed cutter images respectively corresponding to a plurality of reconstructed combinations, wherein each reconstructed cutter image comprises a foreground image and a background image, the foreground image is determined based on the foreground image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image, and the background image is determined based on the background image of each historical cutter image in the reconstructed combination corresponding to the reconstructed cutter image;
generating a reconstructed cutter image set corresponding to the historical cutter image set according to the reconstructed cutter images respectively corresponding to the plurality of reconstructed combinations;
the obtaining of the reconstructed cutter images corresponding to the plurality of reconstructed combinations respectively comprises:
acquiring a first fusion weight and a second fusion weight which respectively correspond to each historical tool image in each reconstruction combination, wherein the first fusion weight is used for reflecting the importance degree of a foreground image, and the second fusion weight is used for reflecting the importance degree of a background image;
according to the first fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the foreground images of the historical cutter images in the reconstruction combination to obtain a fusion foreground image corresponding to the reconstruction combination;
according to the second fusion weight corresponding to each historical cutter image in the reconstruction combination, performing weighted fusion on the background image of each historical cutter image in the reconstruction combination to obtain a fusion background image corresponding to the reconstruction combination;
determining the reconstructed cutter image corresponding to the reconstruction combination according to the fusion foreground image and the fusion background image corresponding to the reconstruction combination;
the obtaining of the first fusion weight and the second fusion weight respectively corresponding to each historical tool image in each reconstruction combination includes:
acquiring an image proportion corresponding to each historical tool image in each reconstruction combination, wherein the image proportion reflects the area ratio of the foreground image within that historical tool image;
determining the first fusion weight and the second fusion weight respectively corresponding to each historical tool image in the reconstruction combination according to the image proportion, wherein the first fusion weight is directly proportional to the image proportion and the second fusion weight is inversely proportional to it: when the area ratio of the foreground image is larger, the foreground occupies more of the historical tool image than the background, so the first fusion weight for that image is higher and the second fusion weight is lower; when the area ratio is smaller, the foreground occupies less of the image than the background, so the first fusion weight is lower and the second fusion weight is higher (a sketch of this weighting is given below);
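The relations above fix only the direction of the dependence (the first weight grows with the foreground area ratio, the second shrinks). The sketch below uses the simplest choice consistent with that behaviour; the exact formulas are assumptions, not the claimed rule.

```python
# Sketch of area-ratio-based fusion weights (one simple reading of the relations above).
import numpy as np

def fusion_weights(foreground_masks):
    """foreground_masks: boolean arrays marking tool pixels in each historical image."""
    ratios = [float(np.count_nonzero(m)) / m.size for m in foreground_masks]
    first_weights = ratios                      # larger foreground ratio -> higher first weight
    second_weights = [1.0 - r for r in ratios]  # larger foreground ratio -> lower second weight
    return first_weights, second_weights
```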
the method further comprises: for each reconstruction combination, acquiring the image resolution grade corresponding to each historical tool image in the reconstruction combination; determining the first fusion weight corresponding to the historical tool image according to the image resolution grade and the area ratio corresponding to that image, and determining the second fusion weight corresponding to the historical tool image according to the difference between the highest image resolution grade and the first fusion weight (one possible reading is sketched below).
6. A terminal, characterized in that the terminal comprises a memory and one or more processors; the memory stores one or more programs; the one or more programs include instructions for executing the method for judging abrasion of a machining cutter of a complex structural part according to any one of claims 1 to 4; and the processor is configured to execute the one or more programs.
7. A computer-readable storage medium having stored thereon a plurality of instructions, wherein the instructions are adapted to be loaded and executed by a processor to perform the steps of the method for judging abrasion of a machining cutter of a complex structural part according to any one of claims 1 to 4.
CN202210659535.4A 2022-06-13 2022-06-13 Method for judging abrasion of machining cutter of complex structural part Active CN114742834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210659535.4A CN114742834B (en) 2022-06-13 2022-06-13 Method for judging abrasion of machining cutter of complex structural part

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210659535.4A CN114742834B (en) 2022-06-13 2022-06-13 Method for judging abrasion of machining cutter of complex structural part

Publications (2)

Publication Number Publication Date
CN114742834A CN114742834A (en) 2022-07-12
CN114742834B true CN114742834B (en) 2022-09-13

Family

ID=82286983

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210659535.4A Active CN114742834B (en) 2022-06-13 2022-06-13 Method for judging abrasion of machining cutter of complex structural part

Country Status (1)

Country Link
CN (1) CN114742834B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111242905A (en) * 2020-01-06 2020-06-05 科大讯飞(苏州)科技有限公司 Method and equipment for generating X-ray sample image and storage device
CN111300144A (en) * 2019-11-25 2020-06-19 上海大学 Automatic detection method for tool wear state based on image processing
JP2021070114A (en) * 2019-10-31 2021-05-06 株式会社ジェイテクト Tool wear prediction system
CN114273977A (en) * 2021-12-15 2022-04-05 重庆文高科技有限公司 MES-based cutter wear detection method and system
CN114346761A (en) * 2022-01-06 2022-04-15 中国科学技术大学 Cutter wear condition detection method for generating countermeasure network based on improved conditions

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6487475B2 (en) * 2017-02-24 2019-03-20 ファナック株式会社 Tool state estimation device and machine tool
US20210197335A1 (en) * 2019-12-26 2021-07-01 Dalian University Of Technology Data Augmentation Method Based On Generative Adversarial Networks In Tool Condition Monitoring
CN111122587B (en) * 2020-01-19 2022-06-28 南京理工大学 Cutter damage detection method based on visual feature extraction
CN112712063B (en) * 2021-01-18 2022-04-26 贵州大学 Tool wear value monitoring method, electronic device and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Prediction method for wear state of high-speed milling tools based on deep learning; Lin Yang et al.; 《机械与电子》 (Machinery & Electronics); 2017-07-24 (No. 07); pp. 14-19 *

Also Published As

Publication number Publication date
CN114742834A (en) 2022-07-12

Similar Documents

Publication Publication Date Title
CN110991649A (en) Deep learning model building method, device, equipment and storage medium
CN104504460A (en) Method and device for predicating user loss of car calling platform
CN110210625B (en) Modeling method and device based on transfer learning, computer equipment and storage medium
KR102109583B1 (en) Method and Apparatus for pricing based on machine learning
US20210398674A1 (en) Method for providing diagnostic system using semi-supervised learning, and diagnostic system using same
CN115829297B (en) Work package generation method, device, terminal and storage medium for assembly type building
DE202018102632U1 (en) Device for creating a model function for a physical system
CN115310562B (en) Fault prediction model generation method suitable for energy storage equipment in extreme state
CN110751175A (en) Method and device for optimizing loss function, computer equipment and storage medium
CN114862776A (en) Product surface defect detection method and device, computer equipment and medium
CN110969600A (en) Product defect detection method and device, electronic equipment and storage medium
CN116089870A (en) Industrial equipment fault prediction method and device based on meta-learning under small sample condition
CN114966413B (en) Method for predicting state of charge of energy storage battery pack
CN114660993B (en) Numerical control machine tool fault prediction method based on multi-source heterogeneous data feature dimension reduction
CN114997744A (en) Equipment health assessment method and device, computer equipment and medium
CN114742834B (en) Method for judging abrasion of machining cutter of complex structural part
CN112379913B (en) Software optimization method, device, equipment and storage medium based on risk identification
CN113222014A (en) Image classification model training method and device, computer equipment and storage medium
CN110276802B (en) Method, device and equipment for positioning pathological tissue in medical image
CN115292964B (en) Visual service life management method, system, terminal and storage medium for processing parts
CN115018842B (en) Defect detection method and device based on machine vision, terminal and storage medium
CN110597874B (en) Data analysis model creation method and device, computer equipment and storage medium
CN115856694A (en) Battery life prediction method and device, computer equipment and storage medium
CN116051880A (en) Result prediction method based on uncertainty evaluation under label noise
CN111899263B (en) Image segmentation method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant