CN117218476A - Model training method, using method, device, equipment and storage medium - Google Patents

Model training method, using method, device, equipment and storage medium

Info

Publication number
CN117218476A
Authority
CN
China
Prior art keywords
type
image
prediction
sample
article
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310489070.7A
Other languages
Chinese (zh)
Inventor
王昌安
张博深
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202310489070.7A priority Critical patent/CN117218476A/en
Publication of CN117218476A publication Critical patent/CN117218476A/en


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Abstract

The application discloses a model training method, a using method, a device, equipment and a storage medium, and belongs to the technical field of artificial intelligence. The training method of the object quality inspection model comprises the following steps: acquiring a sample image and a defect label of the sample image, wherein the sample image is used for describing image information of a sample article, the defect label comprises a grouping label and a type label, the grouping label is used for indicating whether the sample article has defects, and the type label is used for indicating the article characteristic type corresponding to the sample article under the grouping label; invoking a grouping prediction network in the article quality inspection model to predict the sample image to obtain a prediction grouping of the sample image; invoking a type prediction network in the article quality inspection model to predict the sample image to obtain a predicted type of the sample image; and training the article quality inspection model according to the grouping errors between the grouping labels and the prediction groups and the type errors between the type labels and the prediction types to obtain a trained article quality inspection model.

Description

Model training method, using method, device, equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to a model training method, a using method, a device, equipment, and a storage medium.
Background
The industrial quality inspection technology is based on machine vision and is used for carrying out automatic defect detection on industrial products.
In the related art, specific defect types of the industrial product are obtained through analysis of image characteristics such as image flatness, image consistency and the like of scanned images of the industrial product.
However, the degree of similarity between defect types in industrial products is high, and how to improve the accuracy of analyzing the scanned images remains a problem to be solved.
Disclosure of Invention
The application provides a model training method, a using method, a device, equipment and a storage medium, wherein the technical scheme is as follows:
according to an aspect of the present application, there is provided a method of training a quality inspection model of an article, the method comprising:
acquiring a sample image and a defect label of the sample image, wherein the sample image is used for describing image information of a sample article, the defect label comprises a grouping label and a type label, the grouping label is used for indicating whether the sample article has a defect, and the type label is used for indicating an article characteristic type corresponding to the sample article under the grouping label;
Invoking a grouping prediction network in the article quality inspection model to predict the sample image to obtain a prediction grouping of the sample image, wherein the prediction grouping is a prediction result of whether the sample article has defects;
invoking a type prediction network in the article quality inspection model to predict the sample image to obtain a prediction type of the sample image, wherein the prediction type is a prediction result of an article characteristic type of the sample article;
and training the article quality inspection model according to the grouping errors between the grouping labels and the prediction groups and the type errors between the type labels and the prediction types to obtain a trained article quality inspection model.
According to another aspect of the present application, there is provided a method of using an item quality inspection model comprising a packet prediction network and a type prediction network, the method comprising:
acquiring an input image, wherein the input image is an image whose defect type is to be predicted;
invoking the grouping prediction network in the article quality inspection model to predict the input image to obtain quality inspection grouping of the input image;
And calling the type prediction network in the article quality inspection model to perform prediction processing on the input image to obtain the quality inspection type of the input image.
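The two-step use method above can be sketched as plain Python. The predictors below are hypothetical stand-ins for the grouping prediction network and the type prediction network (the patent does not fix an architecture), and the group and type names are illustrative only:

```python
# Minimal sketch of the two-stage inference flow: first predict the
# quality-inspection grouping, then predict the quality-inspection type
# conditioned on that grouping. All predictors are hypothetical stand-ins.

def predict_group(image_features):
    # Stand-in grouping prediction network: returns "defect" or "sound".
    return "defect" if sum(image_features) > 1.0 else "sound"

def predict_type(image_features, group):
    # Stand-in type prediction network, conditioned on the predicted group.
    defect_types = ["scratch", "crush"]
    sound_types = ["pass", "oil_stain"]
    candidates = defect_types if group == "defect" else sound_types
    # Pick the candidate whose index matches a trivial feature statistic.
    return candidates[int(sum(image_features) * 10) % len(candidates)]

def inspect(image_features):
    group = predict_group(image_features)        # quality-inspection grouping
    dtype = predict_type(image_features, group)  # quality-inspection type
    return group, dtype
```

The point of the sketch is only the control flow: the type prediction receives the grouping result, so the final output pairs a coarse defect/no-defect decision with a fine-grained type.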
According to another aspect of the present application, there is provided a training apparatus for a quality inspection model of an article, the apparatus comprising:
the system comprises an acquisition module, a detection module and a storage module, wherein the acquisition module is used for acquiring a sample image and a defect label of the sample image, the sample image is used for describing image information of a sample article, the defect label comprises a grouping label and a type label, the grouping label is used for indicating whether the sample article has a defect, and the type label is used for indicating an article characteristic type corresponding to the sample article under the grouping label;
the processing module is used for calling a grouping prediction network in the article quality inspection model to conduct prediction processing on the sample image to obtain a prediction grouping of the sample image, wherein the prediction grouping is a prediction result of whether the sample article has defects or not;
the processing module is further used for calling a type prediction network in the article quality inspection model to conduct prediction processing on the sample image to obtain a prediction type of the sample image, wherein the prediction type is a prediction result of article characteristic types of the sample article;
And the training module is used for training the article quality inspection model according to the grouping errors between the grouping labels and the prediction groups and the type errors between the type labels and the prediction types to obtain a trained article quality inspection model.
In an alternative design of the application, the processing module is further configured to:
invoking the type prediction network, and performing prediction processing on the sample image based on the prediction group to obtain the prediction type of the sample image belonging to the prediction group, wherein the prediction type is used for indicating the article characteristic type corresponding to the sample article in the prediction group;
wherein the prediction type is used for indicating a defect type of the sample article when the prediction group indicates that the sample article is defective, and is used for indicating a cause type of the sample article that is not defective when the prediction group indicates that the sample article is not defective.
In an alternative design of the present application, the packet prediction network includes a feature extraction sub-network and a label prediction sub-network;
the processing module is further configured to:
Invoking the feature extraction sub-network to perform feature extraction processing on the sample image to obtain image feature information of the sample image;
invoking the label prediction sub-network to perform prediction processing on the image characteristic information to obtain the prediction group of the sample image;
and calling the type prediction network, and performing prediction processing on the image characteristic information based on the prediction packet to obtain the prediction type of the prediction packet to which the sample image belongs.
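The shared-feature design above (one feature extraction sub-network feeding both the label prediction sub-network and the type prediction network) can be illustrated with hypothetical stand-in components; the feature statistics and thresholds below are assumptions for illustration only:

```python
# Sketch of the shared pipeline: the feature extraction sub-network runs
# once, and both prediction heads consume the same image feature
# information. All three components are hypothetical stand-ins.

def extract_features(pixels):
    # Stand-in feature extraction sub-network: mean and max of pixel values.
    return [sum(pixels) / len(pixels), max(pixels)]

def predict_group_from_features(features):
    # Stand-in label prediction sub-network.
    return "defect" if features[1] > 0.5 else "sound"

def predict_type_from_features(features, group):
    # Stand-in type prediction network, conditioned on the predicted group.
    if group == "defect":
        return "scratch" if features[0] > 0.3 else "crush"
    return "pass"

def forward(pixels):
    features = extract_features(pixels)           # computed once, shared
    group = predict_group_from_features(features)
    dtype = predict_type_from_features(features, group)
    return group, dtype
```

Computing the image feature information once and sharing it avoids running feature extraction twice for the two prediction tasks.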
In an alternative design of the application, the training module is further configured to:
configuring a first cost weight for the difference between the grouping label and the predicted grouping to determine the grouping error, and configuring a second cost weight for the difference between the type label and the predicted type to determine the type error, wherein the first cost weight and the second cost weight are different;
and training the article quality inspection model according to the grouping error and the type error to obtain the trained article quality inspection model.
In an alternative design of the application, the training module is further configured to:
calculating a grouping difference between the grouping label and the predicted grouping, and calculating a type difference between the type label and the predicted type;
determining the product of the grouping difference and the first cost weight as the grouping error;
and determining the product of the type difference and the second cost weight as the type error.
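The cost-weighted errors described above reduce to simple products. The weight values in this sketch are illustrative assumptions, not values from the patent:

```python
# Each error is the product of a difference term and its cost weight; the
# total training loss combines both. Default weights are illustrative only.

def weighted_errors(group_diff, type_diff, w_group=1.0, w_type=0.5):
    group_error = group_diff * w_group  # first cost weight applied
    type_error = type_diff * w_type     # second cost weight applied
    return group_error, type_error, group_error + type_error
```

Because the two weights differ, the model can be pushed to prioritize getting the defect/no-defect grouping right over the finer-grained type.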
In an alternative design of the application, the processing module is further configured to:
determining the second cost weight according to the number of first type labels in the group of the sample image, wherein the second cost weight and the number of the first type labels are in an inverse relation;
the grouping of the sample images is indicated by the grouping label or the prediction grouping of the sample images.
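One way to realize the inverse relation above is a simple reciprocal: the second cost weight shrinks as the number of type labels in the sample image's group grows. The base constant is an assumption for illustration:

```python
# Second cost weight in inverse relation to the number of first type
# labels in the group; more type labels in a group means each individual
# type mistake is penalized less.

def second_cost_weight(num_type_labels_in_group, base=1.0):
    if num_type_labels_in_group <= 0:
        raise ValueError("group must contain at least one type label")
    return base / num_type_labels_in_group
```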
In an alternative design of the application, the processing module is further configured to:
determining the second cost weight according to the correlation information between the type label and the prediction type under the condition that the type label and the prediction type are different;
the correlation information is used for indicating the similarity degree between the type label corresponding to the sample image and the prediction type, and the second cost weight and the similarity degree are in positive correlation.
In an alternative design of the application, the processing module is further configured to:
And adding supplementary weight for the second cost weight under the condition that the packet label and the attribution packet corresponding to the prediction type are different.
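The supplementary-weight rule above can be sketched as follows; the home-group mapping is passed in, and the supplement value is an illustrative assumption:

```python
# When the predicted type belongs to a different group than the grouping
# label (a cross-group mistake), the second cost weight is increased by a
# supplementary weight. The default supplement is illustrative only.

def adjusted_type_weight(base_weight, group_label, predicted_type,
                         home_group, supplement=0.5):
    # home_group maps each type label to the group label it belongs to
    if home_group[predicted_type] != group_label:
        return base_weight + supplement  # cross-group mistakes cost more
    return base_weight
```

This makes confusing, say, a defect type with a no-defect cause type more expensive than confusing two defect types with each other.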
In an alternative design of the application, the first cost weight is greater than the second cost weight.
In an optional design of the application, the acquiring module is further configured to acquire a global image, where the global image and the sample image form a sample image group, and the sample image is a region of interest marked in the global image;
the processing module is also used for calling a region prediction network in the article quality inspection model, and performing prediction processing on the global image to obtain a predicted interest region in the global image;
wherein the distinction between the predicted region of interest and the sample image is used to train the item quality inspection model.
In an optional design of the present application, the obtaining module is further configured to obtain a sample size of the type tag of the sample image in a sample library;
the processing module is further configured to, if the sample size does not exceed a reference number, perform augmentation processing on a first sample information set corresponding to the type tag in the sample library, and increase the number of the first sample information set to the reference number;
And/or, in the case that the sample size exceeds the reference number, sampling the first sample information group corresponding to the type tag in the sample library, and reducing the number of the first sample information group to the reference number;
wherein the sample image and the defect label of the sample image are obtained from the sample library.
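The balancing step above (augment up to, or sample down to, a common reference number per type label) can be sketched in plain Python. Here "augmentation" simply repeats existing samples; a real system would apply image transformations:

```python
import random

# Bring one type label's sample group to a common reference number:
# augment (here: repeat random existing samples) when below it, and
# randomly down-sample when above it.

def balance(sample_group, reference_number, rng=None):
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    if len(sample_group) < reference_number:
        extra = [rng.choice(sample_group)
                 for _ in range(reference_number - len(sample_group))]
        return sample_group + extra
    if len(sample_group) > reference_number:
        return rng.sample(sample_group, reference_number)
    return sample_group
```

Applying this per type label equalizes class frequencies before the sample image and defect label are drawn from the sample library.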
According to another aspect of the present application, there is provided an apparatus for using an item quality inspection model including a packet prediction network and a type prediction network, the apparatus comprising:
the acquisition module is used for acquiring an input image, wherein the input image is an image to be predicted, and defect type prediction needs to be performed on the image to be predicted;
the processing module is used for calling the grouping prediction network in the article quality inspection model to conduct prediction processing on the input image so as to obtain quality inspection grouping of the input image;
and the processing module is also used for calling the type prediction network in the article quality inspection model to perform prediction processing on the input image so as to obtain the quality inspection type of the input image.
In an optional design of the present application, the obtaining module is further configured to obtain a historical defect probability corresponding to a historical image, where an object to be inspected corresponding to the input image and a historical object corresponding to the historical image have the same object type;
the processing module is further configured to correct the quality inspection grouping of the input image to a grouping indicating that the input image has no defect, in a case where the confidence indicated by the defect label is less than the historical defect probability.
In an alternative design of the application, the acquisition module is further configured to:
acquiring the history image corresponding to the input image, wherein the history image comprises a history defect image with defects and a history sound image without defects;
determining the ratio of the historical defect image in the historical image as the historical defect probability;
the input image is a region of interest in an image to be inspected, the position of the input image on the article to be inspected and the position of the historical image on the historical article have overlapping regions, and the image to be inspected is used for describing global image information of the article to be inspected.
In an alternative design of the application, the processing module is further configured to:
searching an associated image of a history defect image in the history sound image, wherein the associated image and the history defect image are image information of the same object presented from different angles, and an overlapping area exists between the associated image and the history defect image;
And updating label information of the associated image, wherein a historical quality inspection group corresponding to the associated image indicates that the historical article has defects.
In an alternative design of the present application, a first defect probability of the historical defect probabilities is used to indicate a historical defect probability of a first pixel point in the input image; the processing module is further configured to:
smoothing the first defect probability according to the adjacent defect probability of the adjacent pixel points at the periphery of the first pixel point so as to update the first defect probability;
the adjacent defect probability is used for indicating the historical defect probability of the adjacent pixel point.
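The smoothing step above can be sketched as mixing a pixel's historical defect probability with the mean of its neighbours; the 4-neighbourhood and the mixing factor are illustrative assumptions:

```python
# Smooth the first defect probability at (x, y) toward the average of its
# in-bounds 4-neighbours' adjacent defect probabilities.

def smooth(prob_map, x, y, alpha=0.5):
    h, w = len(prob_map), len(prob_map[0])
    neighbours = [prob_map[j][i]
                  for i, j in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                  if 0 <= i < w and 0 <= j < h]
    neighbour_mean = sum(neighbours) / len(neighbours)
    return (1 - alpha) * prob_map[y][x] + alpha * neighbour_mean
```

A pixel whose probability differs sharply from its surroundings is pulled toward the local average, reducing isolated noise in the probability map.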
According to another aspect of the present application, there is provided a computer device comprising a processor and a memory having stored therein at least one instruction, at least one program, code set or instruction set loaded and executed by the processor to implement a method of training an article quality inspection model as described in the above aspect, and/or a method of using an article quality inspection model.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes or a set of instructions, the at least one instruction, the at least one program, the set of codes or the set of instructions being loaded and executed by a processor to implement a method of training an article quality inspection model and/or a method of using an article quality inspection model as described in the above aspects.
According to another aspect of the present application there is provided a computer program product comprising computer instructions stored in a computer readable storage medium, the computer instructions being read from the computer readable storage medium and executed by a processor to implement the method of training a quality inspection model of an item and/or the method of using a quality inspection model of an item as described in the above aspects.
The technical scheme provided by the application has the beneficial effects that at least:
the sample image is processed by the grouping prediction network and the type prediction network in the article quality inspection model, so that the sample image is predicted along two dimensions: whether the sample article has a defect, and the article characteristic type of the sample article. This expands the dimensions along which the sample image is processed, so the information in the sample image is more fully mined and utilized. Compared with a technical scheme that performs only grouping prediction or only type prediction, the method improves the prediction of the article characteristic type while preserving good prediction of whether a defect exists, thereby improving the accuracy of quality inspection of the article.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a computer system provided in accordance with an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of a line scan camera provided in an exemplary embodiment of the application;
FIG. 5 is a flowchart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 6 is a flowchart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a residual unit provided by an exemplary embodiment of the present application;
FIG. 8 is a flowchart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of a sample article having a defect provided by an exemplary embodiment of the present application;
FIG. 10 is a flowchart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 11 is a flowchart of a method of training an item quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 12 is a schematic illustration of an item quality inspection model provided by an exemplary embodiment of the present application;
FIG. 13 is a flowchart of a method of using an item quality inspection model provided by an exemplary embodiment of the present application;
FIG. 14 is a flowchart of a method of using an item quality inspection model provided by an exemplary embodiment of the present application;
FIG. 15 is a schematic representation of a historical defect image and associated image provided by an exemplary embodiment of the present application;
FIG. 16 is a block diagram of a training device for an article quality inspection model provided in accordance with an exemplary embodiment of the present application;
FIG. 17 is a block diagram of an apparatus for using an article quality inspection model according to an exemplary embodiment of the present application;
fig. 18 is a block diagram of a server according to an exemplary embodiment of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region. For example, the information such as the sample image and the input image in the present application is acquired under the condition of sufficient authorization.
It should be understood that, although the terms first, second, etc. may be used in this disclosure to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first parameter may also be referred to as a second parameter, and similarly, a second parameter may also be referred to as a first parameter, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination", depending on the context.
FIG. 1 shows a schematic diagram of a computer system provided by one embodiment of the application. The computer system may serve as the system architecture for the training method and/or the using method of the article quality inspection model. The computer system may include: a terminal 100 and a server 200.
The terminal 100 may be an electronic device such as a mobile phone, a tablet computer, a vehicle-mounted terminal, a wearable device, or a PC (Personal Computer). The terminal 100 may run a client of a target application, which may be an application for training and/or using the article quality inspection model, or may be another application that provides a function for training and/or using the article quality inspection model, which is not limited in the present application. The form of the target application program is likewise not limited; it may be an App (Application) installed in the terminal 100, an applet, a web page, or the like.
The server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services. The server 200 may be a background server of the target application program, and is configured to provide a background service for a client of the target application program.
For the training method and/or the using method of the article quality inspection model provided by the application, the execution subject of each step may be a computer device, which refers to an electronic device with data computation, processing, and storage capabilities. Taking the implementation environment shown in fig. 1 as an example, the training method and/or the using method of the article quality inspection model may be executed by the terminal 100 (for example, by a client of the target application program installed and running on the terminal 100), may be executed by the server 200, or may be executed by the terminal 100 and the server 200 in interactive cooperation, which is not limited in the present application.
In addition, the technical scheme of the application can be combined with blockchain technology. For example, in the disclosed training method and/or using method of the article quality inspection model, some of the data involved (such as sample images and defect labels) can be stored on a blockchain. Communication between the terminal 100 and the server 200 may be performed through a network, such as a wired or wireless network.
Next, an article quality inspection model in the present application will be described:
FIG. 2 illustrates a schematic diagram of training an item quality inspection model provided in accordance with one embodiment of the present application.
In this embodiment, the item quality inspection model 300 includes a packet prediction network 310, a type prediction network 320; next, a specific description will be given of the training process of the article quality inspection model 300:
the sample image 302 and the defect label 304 of the sample image 302 are acquired, the defect label 304 includes a group label 304a and a type label 304b, the group label 304a is used for indicating whether the sample image 302 has a defect, for example, the partition label includes a defect group and a sound group, the defect group is used for indicating that the sample image 302 has a defect, and the sound group is used for indicating that the sample image 302 has no defect. The type label 304b is used for indicating the corresponding article characteristic type of the sample image under the grouping label 304 a; taking the defect group as an example, the article characteristic type includes at least one of scratch, mar, and crush. Illustratively, the health group is used to indicate that the performance of the sample article is acceptable, and in particular, the type of article characteristics attributed to the health group may include pass, and/or that the appearance is insufficient due to at least one of oil, dirt, etc., but the actual performance is acceptable.
Invoking a feature extraction sub-network 312 in the packet prediction network 310 to perform feature extraction processing on the sample image 302 to obtain image feature information 332 of the sample image 302; image feature information 332 is used to indicate hidden layer features of sample image 302;
invoking the label prediction subnetwork 314 to perform prediction processing on the image feature information 332 to obtain a prediction packet 334 of the sample image; the prediction group 334 is a prediction result of whether or not the sample image 302 has a defect.
The type prediction network 320 is invoked to predict the image feature information 332 based on the prediction packet 334 to obtain a prediction type 336 of the sample image 302 assigned to the prediction packet 334, where the prediction type 336 is used to indicate the corresponding item feature type of the sample image 302 in the prediction packet 334.
calculating a grouping difference 334a between the grouping label 304a and the predicted grouping 334, and calculating a type difference 336a between the type label 304b and the predicted type 336;
determining the second cost weight 336b in inverse relation to the number of first type tags 306a in the sample tag library 306; the first type tags 306a in the sample tag library 306 are used to indicate the specific defect types of the corresponding images under the predicted grouping 334. The first cost weight 334b is greater than the second cost weight 336b; specifically, the first cost weight 334b is a preset value greater than the second cost weight 336b.
Determining the product of the grouping difference 334a and the first cost weight 334b as a grouping error 342, and the product of the type difference 336a and the second cost weight 336b as a type error 344; the item quality inspection model 300 is trained based on the grouping error 342 and the type error 344 to obtain a trained item quality inspection model.
In order to improve the prediction accuracy of the quality inspection model of the article, the quality inspection model of the article needs to be trained, and a training method of the quality inspection model of the article will be described through the following embodiments.
FIG. 3 illustrates a flowchart of a method of training an item quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. The method comprises the following steps:
step 510: acquiring a sample image and a defect label of the sample image;
the sample image is illustratively used to describe image information of the sample object, which is illustratively recorded by at least one of photographing, scanning, perspective imaging, and the like.
Further, the sample object is usually a three-dimensional object, and the sample image can be obtained by stitching sub-images captured at different focal lengths. Taking scanning as an example, the scanning directions of different sample images may be the same or different. FIG. 4 illustrates a schematic diagram of a line scan camera provided by an exemplary embodiment of the present application. The sample object is scanned by the line scan camera 702 to obtain a sample image 704 composed of a plurality of frame images 704a; the arrangement direction of the frame images 704a is perpendicular to the scanning direction of the line scan camera 702, and the scanning direction of the line scan camera 702 is also referred to as the moving direction 706 of the sample object. Each frame image 704a consists of a single row of pixels; in one example, the frame image 704a is arranged as {1024-16386} × 1 pixels, that is, a row of 1024 to 16386 pixels.
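The frame-stacking step above can be sketched as follows. This is a minimal toy illustration, not the patent's implementation: the function name and the tiny 4-pixel rows are invented, and a real line-scan sample image would use rows of 1024 to 16386 pixels as the text states.

```python
# Hypothetical sketch: assembling a sample image from line-scan frames.
# Each frame image 704a is a single 1 x W row of pixels; frames are stacked
# perpendicular to the scan direction to form an H x W sample image 704.

def assemble_sample_image(frame_images):
    """Stack 1 x W frame rows into an H x W sample image (list of rows)."""
    if not frame_images:
        raise ValueError("no frame images captured")
    width = len(frame_images[0])
    if any(len(row) != width for row in frame_images):
        raise ValueError("all frame images must have the same pixel width")
    return [list(row) for row in frame_images]

# Example: 3 scan steps of a toy 4-pixel-wide line scan camera.
frames = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]
image = assemble_sample_image(frames)
# image is 3 rows high (scan steps) and 4 pixels wide (one frame row each)
```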
Illustratively, the defect labels include a grouping label and a type label: the grouping label indicates whether the sample article is defective, and the type label indicates the article feature type of the sample article under that grouping label. By way of example, the grouping label may be represented directly or indirectly:
the grouping label can be acquired directly; alternatively, the type label can be acquired and the grouping label indicated indirectly through the grouping to which the type label belongs. Illustratively, there is an attribution relationship between grouping labels and type labels, with each grouping label corresponding to at least two type labels. The attribution relationship between grouping labels and type labels is typically preset. Looking up a type label against this attribution relationship yields the corresponding grouping label; that is, the grouping label is indicated indirectly by the type label.
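The indirect lookup described above can be sketched with a preset attribution table. This is a minimal sketch; the label names and the dictionary representation are invented for illustration and are not part of the patent.

```python
# Hypothetical sketch of the preset attribution relationship: each grouping
# label corresponds to at least two type labels, so a grouping label can be
# derived indirectly from a type label. Label names are invented.

ATTRIBUTION = {
    "defective": ["scratch", "crush", "extrusion"],
    "non_defective": ["pass", "oil_stain", "dirt"],
}

def grouping_of(type_label):
    """Indirectly derive the grouping label from a type label."""
    for grouping, types in ATTRIBUTION.items():
        if type_label in types:
            return grouping
    raise KeyError("unknown type label: " + type_label)
```

With this table, acquiring only the type label (e.g. "scratch") is enough to recover the grouping label ("defective") without storing it separately.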
Step 520: invoking a grouping prediction network in the article quality inspection model to predict the sample image to obtain a prediction grouping of the sample image;
illustratively, the prediction group is a prediction of whether the sample item is defective; the prediction processing in this step may be directly processing the sample image, or may be indirectly processing the sample image based on the feature information extracted from the sample image.
Illustratively, the prediction grouping is a result of predicting the sample image based on network parameters in the grouping prediction network, that is, the grouping prediction network's prediction of whether the sample article has a defect. Exemplary grouping prediction networks include, but are not limited to, at least one of a convolutional neural network (Convolutional Neural Network, CNN), a recurrent neural network (Recurrent Neural Network, RNN), a deep residual network (Deep Residual Network, ResNet), and a Transformer network; the present embodiment does not limit the network structure of the grouping prediction network.
Step 530: invoking a type prediction network in the article quality inspection model to predict the sample image to obtain a predicted type of the sample image;
illustratively, the prediction type is a prediction of an item feature type of the sample item; the prediction processing in this step may be directly or indirectly processing the sample image, for example, performing prediction processing on the feature information extracted from the sample image, so as to implement indirect processing on the sample image. Illustratively, the prediction type is obtained by predicting the sample image based on network parameters in a type prediction network; the present embodiment does not limit the network structure of the type prediction network.
In one example, the prediction processes of the grouping prediction network and the type prediction network in this embodiment are independent of each other; for example, the prediction grouping obtained by the grouping prediction network does not affect the prediction process of the type prediction network, and the prediction type obtained by the type prediction network does not affect the prediction process of the grouping prediction network. It should be noted that even when the two prediction processes are independent of each other, data exchange between the two prediction networks is not excluded; for example, the type prediction network may use intermediate information generated by the grouping prediction network. Independence only means that the prediction grouping imposes no constraint on the prediction process of the type prediction network and/or that the prediction type imposes no constraint on the prediction process of the grouping prediction network. In another example, the prediction processes of the grouping prediction network and the type prediction network are interrelated; for example, the prediction processing of the type prediction network is performed based on the prediction grouping obtained by the grouping prediction network.
Step 540: training the article quality inspection model according to the grouping error between the grouping label and the prediction grouping and the type error between the type label and the prediction type, to obtain a trained article quality inspection model;
Illustratively, the grouping error is used to indicate the distinction between the grouping label and the prediction grouping, and the type error is used to indicate the distinction between the type label and the prediction type. Illustratively, some or all of the model parameters in the article quality inspection model are adjusted according to the grouping error and the type error, so as to achieve backward propagation training of the article quality inspection model.
By way of example, performing model training with the grouping error and the type error simultaneously expands the dimensions along which the sample image is processed. The grouping error reflects the prediction performance of the article quality inspection model from the perspective of whether a defect exists, while the type error reflects its prediction performance from the perspective of the article feature type. Because the article quality inspection model is trained with the grouping error and the type error together, the grouping error provides a constraint for the model, so that the prediction performance on the article feature type is improved on the basis of guaranteeing good prediction performance on whether a defect exists. This avoids the serious error in which the type prediction network in the article quality inspection model misclassifies a defective sample image into a type corresponding to no defect, or misclassifies a defect-free sample image into a type corresponding to a defect.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. This expands the dimensions along which the sample image is processed and fully mines and utilizes the information in the sample image; compared with a technical solution that performs only grouping prediction or only type prediction, it improves the prediction performance on the article feature type on the basis of guaranteeing good prediction performance on whether a defect exists, thereby improving the accuracy of article quality inspection.
FIG. 5 illustrates a flowchart of a method of training an item quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, in the embodiment shown in fig. 3, step 530 may be implemented as step 532:
step 532: invoking the type prediction network, and performing prediction processing on the sample image based on the prediction grouping to obtain the prediction type of the sample image within the prediction grouping;
illustratively, the prediction type is used to indicate the article feature type of the sample article within the prediction grouping; in this embodiment, the prediction grouping serves as a constraint condition for the prediction processing, and what is obtained is the prediction type to which the sample image is assigned within the prediction grouping.
In one implementation, the type prediction network includes a single multi-classifier covering the types of both defective and defect-free sample articles; with the prediction grouping as a constraint, the type that has the highest confidence in the classification result and belongs to the prediction grouping is determined as the prediction type. In another implementation, the type prediction network includes two multi-classifiers: the first multi-classifier is invoked for prediction processing in the case where the prediction grouping indicates that the sample article is defective, and the second multi-classifier is invoked in the case where the prediction grouping indicates that the sample article is not defective.
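The first implementation (one multi-classifier constrained by the prediction grouping) can be sketched as follows. The type names, the group membership table, and the confidence values are invented for illustration; a real type prediction network would produce the confidences itself.

```python
# Hypothetical sketch: a single multi-classifier scores every type, and the
# prediction grouping acts as a constraint. Among the types belonging to the
# predicted grouping, the one with the highest confidence is the prediction
# type. Names and scores are invented.

GROUP_TYPES = {
    "defective": ["scratch", "crush"],
    "non_defective": ["pass", "oil_stain"],
}

def constrained_type_prediction(confidences, predicted_grouping):
    """Pick the highest-confidence type within the predicted grouping."""
    allowed = GROUP_TYPES[predicted_grouping]
    return max(allowed, key=lambda t: confidences[t])

scores = {"scratch": 0.30, "crush": 0.10, "pass": 0.45, "oil_stain": 0.15}
# Even though "pass" has the globally highest confidence, a "defective"
# prediction grouping constrains the prediction type to the defect types.
```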
In various embodiments of the present application, in the case where the predictive packet indicates that the sample item is defective, the predictive type is used to indicate a type of defect for the sample item, and in the case where the predictive packet indicates that the sample item is not defective, the predictive type is used to indicate a type of cause for the sample item not being defective.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. The type prediction network predicts with the prediction grouping as a constraint condition to obtain the prediction type, which expands the dimensions along which the sample image is processed, fully mines and utilizes the information in the sample image, and improves the accuracy of article quality inspection.
FIG. 6 illustrates a flowchart of a method for training an item quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, in the embodiment shown in fig. 5, step 520 may be implemented as steps 522, 524; step 532 may be implemented as step 532a:
step 522: invoking a feature extraction sub-network to perform feature extraction processing on the sample image to obtain image feature information of the sample image;
illustratively, the grouping prediction network includes a feature extraction sub-network and a label prediction sub-network; the feature extraction sub-network is used for performing feature extraction processing on the sample image. Illustratively, the image feature information of the sample image is the hidden layer features of the sample image.
In one example, the feature extraction sub-network is a deep residual network; in one implementation, the feature extraction sub-network is ResNet50, which consists of n residual blocks (Blocks), n being an integer greater than 1; in a preferred embodiment, n is 5. Each residual block is illustratively composed of residual units connected one after another. Each residual unit performs channel downsampling, then feature transformation using a 3x3 convolution, then channel upsampling back to the original channel size, and finally obtains the output feature through a residual connection with the input. FIG. 7 is a schematic diagram of the residual unit provided by an exemplary embodiment of the present application; illustratively, the weight layers represent convolution operations, the output of the residual unit is obtained by summing the feature map F(x) produced by the convolution operations with the input x of the residual unit, and the output of each weight layer is processed by an activation function, illustratively a rectified linear unit (Rectified Linear Unit, ReLU). In an alternative example, the residual blocks are specifically convolution blocks (ConvBlock), and spatial downsampling between convolution blocks is achieved through max pooling (Maxpool) or convolution with a stride of 2, which increases the receptive field and the local translation invariance of the feature extraction sub-network. The number of channels of each residual block also grows as the network deepens, so that richer semantic information can be extracted.
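The F(x) + x structure of the residual unit in FIG. 7 can be sketched numerically. This is a toy 1-D illustration only: the two "weight layers" stand in for convolutions as elementwise scalings with hand-picked weights, whereas a real ResNet50 unit operates on multi-channel feature maps.

```python
# Toy sketch of the residual unit: two weight layers (stand-ins for
# convolutions) with a ReLU in between; the output is F(x) + x through the
# residual connection, followed by a final ReLU. Weights are invented.

def relu(v):
    return [max(0.0, x) for x in v]

def weight_layer(v, w):
    """Stand-in for a convolution: elementwise scaling by w."""
    return [w * x for x in v]

def residual_unit(x, w1=0.5, w2=2.0):
    fx = weight_layer(relu(weight_layer(x, w1)), w2)  # F(x)
    return relu([f + xi for f, xi in zip(fx, x)])     # F(x) + x, then ReLU

out = residual_unit([1.0, -2.0, 3.0])
# With these weights, F(x) = x for positive inputs (0.5 * 2.0 = 1), so the
# positive entries double; the negative entry is zeroed by the inner ReLU,
# giving F(x) = 0 there, and the final ReLU clips the sum -2.0 to 0.0.
```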
Step 524: calling a label prediction sub-network to predict the image characteristic information to obtain a prediction group of the sample image;
the label prediction sub-network is used to obtain the prediction grouping by performing prediction processing on the image feature information.
Step 532a: invoking the type prediction network, and performing prediction processing on the image feature information based on the prediction grouping to obtain the prediction type of the sample image within the prediction grouping;
in this embodiment, the type prediction network obtains the prediction type by performing prediction processing on the image feature information produced by the feature extraction sub-network. In this embodiment, the prediction processes of the grouping prediction network and the type prediction network are interrelated: the prediction processing of the type prediction network is performed with the prediction grouping obtained by the grouping prediction network as a constraint condition.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. The type prediction network performs prediction processing, with the prediction grouping as a constraint condition, on the image feature information obtained by feature extraction from the sample image to obtain the prediction type; this expands the dimensions along which the sample image is processed, fully mines and utilizes the information in the sample image, and improves the accuracy of article quality inspection.
FIG. 8 illustrates a flowchart of a method for training an item quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, in the embodiment shown in fig. 3, step 540 may be implemented as steps 542, 544:
step 542: configuring a first cost weight for the distinction between the grouping label and the prediction grouping to determine the grouping error, and configuring a second cost weight for the distinction between the type label and the prediction type to determine the type error;
illustratively, the distinction between the grouping label and the prediction grouping is used to indicate whether there is a difference between them; in the case where the grouping label and the prediction grouping are the same, the distinction is 0. Similarly, the distinction between the type label and the prediction type is used to indicate whether there is a difference between them.
In one implementation, the first cost weight and the second cost weight are different; by configuring different cost weights for the grouping distinction between the grouping label and the prediction grouping and for the type distinction between the type label and the prediction type, the distinction carrying the larger weight exerts a greater influence on the article quality inspection model, thereby realizing targeted training.
Illustratively, the grouping label is used to indicate whether the sample article has a defect. A defective sample article has unqualified use performance, for example due to at least one of scratching, crushing, extrusion, and the like; a defect-free sample article has qualified use performance, and its specific article feature types may include passing inspection outright, and/or having an imperfect appearance due to at least one of oil stains, dirt, and the like while the actual use performance remains qualified. FIG. 9 illustrates a schematic diagram of defective sample articles provided by an exemplary embodiment of the present application. The first image 712 shows one defect type of the sample article, such as a crush injury; the second image 714 shows another defect type, such as a scratch. The specific locations of the defects are boxed in the first image 712 and the second image 714; for clarity of display, a black box marks the defect location in the first image 712 and a white box marks the defect location in the second image 714.
In one example, the article quality inspection model's ability to discern whether the use performance of a sample article is qualified needs to be trained with emphasis, so that articles with unqualified quality are rejected during inspection. In this case, a high cost weight is configured for the grouping distinction and a low cost weight for the type distinction; the first cost weight is greater than the second cost weight. In another example, the model's ability to discern the article feature type of a sample article needs to be trained with emphasis, providing a data reference for optimizing the article production flow. In this case, a low cost weight is configured for the grouping distinction and a high cost weight for the type distinction. In an optional example, the model's ability to discern whether the use performance of the sample article is qualified is trained with emphasis first, so that the grouping prediction network achieves accurate grouping prediction capability; the training mode is then adjusted to train with emphasis the model's ability to discern the article feature type of the sample article, configuring a high cost weight for the type distinction. On the basis of the accurate grouping prediction capability provided by the grouping prediction network, the prediction capability of the type prediction network on article feature types is improved.
After model training of the article quality inspection model is completed, the trained type prediction network can be used alone as the trained article quality inspection model, which saves network parameters: article quality inspection is performed only through the trained type prediction network, no grouping prediction network needs to be constructed during use of the model, and accurate article feature type prediction is achieved with a small number of network parameters. Meanwhile, because of the attribution relationship between grouping labels and type labels, the prediction of whether a defect exists is realized as well.
In an alternative implementation, this step may be implemented as the following steps:
calculating the grouping distinction between the grouping label and the prediction grouping, and calculating the type distinction between the type label and the prediction type;
determining the product of the grouping distinction and the first cost weight as the grouping error;
determining the product of the type distinction and the second cost weight as the type error;
the present embodiment does not limit how the grouping distinction and the type distinction are determined; the two distinctions may be measured by a cross-entropy (Cross-Entropy) loss or another loss. In one example, the first cost weight and the second cost weight are both positive numbers greater than 0. As can be seen from the above description, the first cost weight and the second cost weight may be determined independently or in association with each other. In one example, the first cost weight is preset, for example preset to 1.
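The weighted-error computation of steps 542 can be sketched as follows. The cross-entropy form is one option the text names; the function names and the weight values (1.0 and 0.5) are invented for illustration.

```python
import math

# Hypothetical sketch: the grouping and type distinctions (measured here by
# cross-entropy, one loss named in the text) are multiplied by their cost
# weights and summed into the training error. Weight values are invented.

def cross_entropy(p_true_class):
    """Cross-entropy loss given the predicted probability of the true class."""
    return -math.log(p_true_class)

def weighted_training_error(p_grouping, p_type, w_group=1.0, w_type=0.5):
    grouping_error = w_group * cross_entropy(p_grouping)  # distinction x first cost weight
    type_error = w_type * cross_entropy(p_type)           # distinction x second cost weight
    return grouping_error + type_error

total = weighted_training_error(0.5, 1.0)
# Here the type prediction is perfect, so only the grouping term contributes.
```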
Step 544: training the article quality inspection model according to the grouping error and the type error to obtain a trained article quality inspection model;
illustratively, some or all of the model parameters in the quality inspection model are adjusted according to the grouping error and the type error to achieve backward propagation training of the quality inspection model.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. The first cost weight is configured for the grouping distinction to obtain the grouping error, and the second cost weight is configured for the type distinction to obtain the type error, so that different training weights are configured for the grouping error and the type error during training; this expands the dimensions along which the sample image is processed, fully mines and utilizes the information in the sample image, and improves the accuracy of article quality inspection.
Next, the first cost weight and the second cost weight are described.
In an alternative implementation, as described above, the article quality inspection model's ability to discern whether the use performance of the sample article is qualified is trained with emphasis first, so that the grouping prediction network achieves accurate grouping prediction capability; the training mode is then adjusted to train with emphasis the model's ability to discern the article feature type of the sample article, configuring a high cost weight for the type distinction. On the basis of the accurate grouping prediction capability provided by the grouping prediction network, the prediction capability of the type prediction network on article feature types is improved.
In this embodiment, the first cost weight and the second cost weight are obtained according to the number of training iterations of the article quality inspection model. Further, the first cost weight is negatively correlated with the number of training iterations, and the second cost weight is positively correlated with it. As the number of training iterations increases, the emphasis of model training gradually shifts from training the model's ability to discern whether the use performance of the sample article is qualified to training its ability to discern the article feature type of the sample article.
Further, the sum of the first cost weight and the second cost weight is unchanged; the sum may be a preset fixed value, such as 1.
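The iteration-dependent weights above can be sketched with a simple schedule. The linear form is an assumption for illustration; the text only requires that the first cost weight decrease, the second increase, and their sum stay at a preset fixed value (1 here).

```python
# Hypothetical schedule: the first cost weight is negatively correlated with
# the training iteration, the second positively, and their sum is fixed.
# The linear interpolation is an assumption, not the patent's formula.

def cost_weights(iteration, total_iterations, weight_sum=1.0):
    frac = min(max(iteration / total_iterations, 0.0), 1.0)
    first = weight_sum * (1.0 - frac)   # shrinks as training proceeds
    second = weight_sum * frac          # grows as training proceeds
    return first, second
```

Early iterations thus emphasize the grouping distinction (accurate grouping prediction first), and later iterations shift the emphasis to the type distinction.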
Further, the first cost error is corrected based on the historical grouping distinction. The historical grouping distinction is the distinction, during historical training of the article quality inspection model on a historical sample image, between the historical prediction grouping corresponding to that historical sample image and its grouping label. A grouping correction amount is determined according to the historical grouping distinction; the grouping correction amount is positively correlated with the historical grouping distinction and is a positive number, and the corrected first cost error is the sum of the grouping correction amount and the first cost error.
In the present embodiment, the grouping correction amount is used to indicate, in the case where the article quality inspection model's ability to discern whether the use performance of the sample article is qualified is insufficient, the degree of that insufficiency according to the historical grouping distinction; the first cost error is corrected according to the grouping correction amount, ensuring that the grouping prediction network in the article quality inspection model provides accurate grouping prediction capability. Further, the first cost error is corrected based on the historical grouping distinction in the case where the historical grouping distinction is greater than a grouping error threshold. The grouping error threshold is preset.
Similarly, the second cost error is corrected based on the historical type distinction. The historical type distinction is the distinction, during historical training of the article quality inspection model on a historical sample image, between the historical prediction type corresponding to that historical sample image and its type label. A type correction amount is determined according to the historical type distinction; the type correction amount is positively correlated with the historical type distinction and is a positive number, and the corrected second cost error is the sum of the type correction amount and the second cost error. Further, the second cost error is corrected according to the historical type distinction in the case where the historical type distinction is greater than a type error threshold. The type error threshold is preset. Further, in the case where the historical type distinction is greater than the type error threshold and the historical grouping distinction is greater than the grouping error threshold, only the first cost error is corrected, based on the historical grouping distinction. That is, the model's ability to discern whether the use performance of the sample article is qualified is trained preferentially, and only once the model can make that distinction is its ability to discern the article feature type of the sample article trained with emphasis.
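The correction scheme above can be sketched as follows. The thresholds and the linear correction coefficient `k` are invented for illustration; the text only requires positive correction amounts, per-side thresholds, and grouping-side priority when both thresholds are exceeded.

```python
# Hypothetical sketch: a positive correction amount, positively correlated
# with the historical distinction, is added to the corresponding cost error,
# but only when that distinction exceeds its preset threshold; when both
# thresholds are exceeded, only the grouping-side correction applies.

def corrected_errors(grouping_error, type_error,
                     hist_group_diff, hist_type_diff,
                     group_threshold=0.5, type_threshold=0.5, k=0.1):
    correct_group = hist_group_diff > group_threshold
    # Grouping-side priority: skip the type correction if both exceed.
    correct_type = hist_type_diff > type_threshold and not correct_group
    if correct_group:
        grouping_error += k * hist_group_diff   # grouping correction amount
    if correct_type:
        type_error += k * hist_type_diff        # type correction amount
    return grouping_error, type_error
```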
Next, the second cost weight is further described; the manner of determining the second cost weight may be implemented as at least one of the following implementations on the basis of the embodiment shown in fig. 8.
Implementation mode one: determining the second cost weight according to the categories of type labels in the grouping of the sample image;
illustratively, the grouping of the sample image is indicated by the grouping label or the prediction grouping of the sample image; the grouping of the sample image includes at least two categories of type labels, which indicate the article feature types of the sample articles in that grouping. Illustratively, the type labels in the grouping of the sample image are obtained from a known sample label library.
Illustratively, the second cost weight is inversely related to the number of type label categories; in one example, the second cost weight is the reciprocal of the number of type label categories.
In a specific example, the grouping indicating that the sample article is not defective is referred to as the OK group, in which N types exist; the grouping indicating that the sample article is defective is referred to as the NG group, in which M types exist. For the NG group, when a sample article is misclassified into another type within the NG group, the second cost weight is 1/(M-1); for the OK group, when a sample article is misclassified into another type within the OK group, the second cost weight is 1/(N-1).
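The specific example above amounts to one small formula, sketched here; the function name is invented.

```python
# Sketch of the specific example: a misclassification into another type of
# the same grouping gets second cost weight 1/(M-1) for the NG group (M
# defect types) or 1/(N-1) for the OK group (N types).

def second_cost_weight(num_types_in_group):
    """Weight for misclassifying into another type of the same grouping."""
    if num_types_in_group < 2:
        raise ValueError("each grouping corresponds to at least two types")
    return 1.0 / (num_types_in_group - 1)
```

The more types a grouping contains, the harder within-group type distinction becomes, so each individual within-group confusion is weighted less.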
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. The second cost weight is configured for the type distinction to obtain the type error, and the second cost weight decreases as the number of type label categories increases, taking into account the influence of the number of type categories on the complexity of the type distinction; this expands the dimensions along which the sample image is processed, fully mines and utilizes the information in the sample image, and improves the accuracy of article quality inspection.
Implementation mode two: in the case where the type label and the prediction type are different, determining the second cost weight according to the correlation information between the type label and the prediction type;
illustratively, the correlation information is used to indicate the degree of similarity between the type label corresponding to the sample image and the prediction type, and the second cost weight is positively correlated with the degree of similarity. Illustratively, the second cost weight is determined according to the correlation information between the type label and the prediction type: a high cost weight is configured for a type distinction in which the type label and the prediction type are highly similar, so that this type distinction is emphasized during training.
In an alternative implementation, the method further includes: adding a supplementary weight to the second cost weight in the case where the grouping label and the grouping to which the prediction type belongs are different.
Illustratively, in the case where the grouping label and the grouping to which the prediction type belongs are different, the type prediction network in the article quality inspection model has made a prediction error that classifies a defective sample image into a type corresponding to no defect, or classifies a defect-free sample image into a type corresponding to a defect; this is a serious error. A supplementary weight is therefore added to the second cost weight, increasing the cost weight used for training on this type distinction.
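The supplementary-weight rule can be sketched as follows. The type-to-group mapping, the base weight, and the supplement value are all invented for illustration; only the rule itself (add extra weight when the error crosses the defective/defect-free boundary) comes from the text.

```python
# Hypothetical sketch: when the grouping label and the grouping to which the
# (wrong) prediction type belongs differ, the misclassification crosses the
# defective / defect-free boundary, so a supplementary weight is added on
# top of the base second cost weight. Mapping and values are invented.

TYPE_TO_GROUP = {"scratch": "NG", "crush": "NG", "pass": "OK", "dirt": "OK"}

def type_cost_weight(type_label, predicted_type, base=0.5, supplement=1.0):
    if type_label == predicted_type:
        return 0.0  # no type distinction, nothing to weight
    weight = base
    if TYPE_TO_GROUP[type_label] != TYPE_TO_GROUP[predicted_type]:
        weight += supplement  # cross-grouping error: add supplementary weight
    return weight
```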
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model and is predicted along two dimensions: whether the sample image has a defect, and the article feature type of the sample image. The second cost weight is configured for the type distinction to obtain the type error, and the second cost weight increases with the degree of similarity between the type label and the prediction type, taking into account the influence of the similarity between type labels on the complexity of the type distinction; this expands the dimensions along which the sample image is processed, fully mines and utilizes the information in the sample image, and improves the accuracy of article quality inspection.
FIG. 10 illustrates a flowchart of a method of training an article quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, on the basis of the embodiment shown in FIG. 3, the method further includes step 502 and step 515:
step 502: acquiring a global image;
Illustratively, the global image and the sample image form a sample image group; the sample image is a region of interest marked in the global image, and one or more regions of interest are marked in the global image, so the sample image group comprises the global image and the regions of interest within it.
step 515: calling a region prediction network in the object quality inspection model, and performing prediction processing on the global image to obtain a predicted interest region in the global image;
in this embodiment, the object quality inspection model further includes a region prediction network, where the region prediction network is used to predict a region of interest in the global image. Illustratively, the distinction between the predicted region of interest and the sample image is used to train the item quality inspection model. Illustratively, the error in training the item quality inspection model further includes a region error that indicates a difference between the predicted region of interest and the sample image. And training the object quality inspection model through the region error, and improving the marking capability of the region prediction network on the region of interest.
This embodiment does not limit the execution order of step 515 relative to step 510; step 515 may be executed before, after, or simultaneously with step 510.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the sample image is predicted from two dimensions: whether the sample article has a defect, and the article feature type of the sample article. The predicted region of interest in the global image is predicted and marked through the region prediction network in the article quality inspection model. The dimensions along which the sample image is processed are thereby expanded, the information in the sample image is fully mined and utilized, and the accuracy of article quality detection is improved.
FIG. 11 illustrates a flowchart of a method for training an article quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, on the basis of the embodiment shown in FIG. 3, the method further includes step 504, step 506a, and step 506b:
step 504: acquiring the sample quantity of the type label of the sample image in a sample library;
Illustratively, the sample library includes a plurality of sample information sets, each composed of a sample image and the defect label of that sample image. Illustratively, the sample size indicates the number of first sample information sets in the sample library, i.e., those sample information sets whose type label is the same as that of the currently acquired sample image.
Step 506a: performing augmentation processing on a first sample information group corresponding to the type tag in the sample library under the condition that the sample size does not exceed the reference number, and increasing the number of the first sample information group to the reference number;
The reference number may be preset, or may be determined from the sample library. For example, the reference number may indicate a fixed number, or may indicate a number range; the fixed number, or the upper and lower limit values of the number range, are positive integers. In one example, the reference number is the ratio between the number of sample information sets in the sample library and the number of types of type tags. For example, the group indicating that the sample article is not defective is referred to as the OK group, in which N types exist; the group indicating that the sample article is defective is referred to as the NG group, in which M types exist. The number of types of type tags is then M+N.
Exemplary ways of augmenting the process include, but are not limited to, at least one of: repeating the first sample information group corresponding to the type label, rotating the sample image in the first sample information group, and cutting to form a new sample information group.
Step 506b: under the condition that the sample quantity exceeds the reference quantity, sampling the first sample information group corresponding to the type label in the sample library, and reducing the quantity of the first sample information group to the reference quantity;
The sampling process may be random sampling or evenly spaced sampling; this embodiment does not limit the specific manner of the sampling process.
Illustratively, the sample image and the defect label of the sample image are obtained from the sample library. For example, the sample image and the defect label of the sample image in step 510 of an embodiment provided by the present application are obtained from the sample library.
This embodiment is described taking the case where both step 506a and step 506b are included as an example, but the case where step 506a and step 506b are split into different embodiments and implemented separately, forming new embodiments, is not excluded.
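The per-type balancing of steps 504, 506a, and 506b can be sketched as follows. This is a minimal illustration under stated assumptions: the reference number is taken as the ratio of the total sample count to the number of type tags (the ratio example above), augmentation is simulated by simple repetition, and down-sampling by evenly spaced selection; all names are hypothetical.

```python
def balance(samples_by_type: dict) -> dict:
    """Augment under-represented type tags up to the reference number
    (step 506a) and subsample over-represented ones down to it (step 506b)."""
    total = sum(len(v) for v in samples_by_type.values())
    reference = max(1, total // len(samples_by_type))  # ratio-based reference
    balanced = {}
    for tag, samples in samples_by_type.items():
        if len(samples) < reference:
            # step 506a: augmentation, simulated here by repetition
            repeated = samples * (reference // len(samples) + 1)
            balanced[tag] = repeated[:reference]
        elif len(samples) > reference:
            # step 506b: evenly spaced sampling down to the reference number
            step = len(samples) / reference
            balanced[tag] = [samples[int(i * step)] for i in range(reference)]
        else:
            balanced[tag] = list(samples)
    return balanced
```

With a library of one "scratch" sample and five "dent" samples, both type tags end up with three samples each (reference = 6 // 2 = 3).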
FIG. 12 illustrates a schematic diagram of an article quality inspection model provided by an exemplary embodiment of the present application. Illustratively, the network structure in this embodiment that performs the augmentation processing and/or the sampling processing is taken as detection head 1, the above grouping prediction network is taken as detection head 2, and the above type prediction network is taken as detection head 3. The above region prediction network is specifically implemented using a residual network (ResNet50) for feature extraction and a Cascade R-CNN (Cascade Regions with CNN features) framework for defect detection, predicting and marking the region of interest 722; together these form the article quality inspection model. The region of interest 722 is marked through the region prediction network and then processed by the three detection heads: detection head 1 performs instance-level balancing to improve recall, detection head 2 performs inter-group cost-sensitive learning, and detection head 3 performs inter-type cost-sensitive learning.
In summary, according to the method provided by this embodiment, the sample image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the sample image is predicted from two dimensions: whether the sample article has a defect, and the article feature type of the sample article. By acquiring the sample size of the type label of the sample image in the sample library, the sample sizes of different type labels are balanced through sampling or augmenting the first sample information sets. The dimensions along which the sample image is processed are thereby expanded, the information in the sample image is fully mined and utilized, and the accuracy of article quality detection is improved.
FIG. 13 illustrates a flowchart of a method of using an item quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. The method comprises the following steps:
step 610: acquiring an input image;
the input image is an image to be predicted, which needs defect type prediction; the input image is illustratively used to describe image information of the item to be inspected, which is illustratively recorded by at least one of photographing, scanning, perspective imaging, etc.
Step 620: invoking a grouping prediction network in the article quality inspection model to predict an input image to obtain quality inspection grouping of the input image;
Illustratively, the article quality inspection model includes a grouping prediction network and a type prediction network. Illustratively, the article quality inspection model is obtained through training and comprises the trained grouping prediction network and type prediction network; in an alternative implementation, the article quality inspection model is trained via the training method above. The quality inspection grouping is used to indicate whether the article to be inspected has a defect.
The prediction processing in this step may be directly processing the input image, or may be indirectly processing the input image based on the feature information extracted from the input image. Further, for the description of the packet prediction network, please refer to the above embodiments of the training method of the quality inspection model, and the description thereof is omitted herein.
Step 630: and calling a type prediction network in the article quality inspection model to perform prediction processing on the input image, so as to obtain the quality inspection type of the input image.
The prediction processing in this step may, for example, directly process the input image, or indirectly process the input image based on feature information extracted from the input image. The quality inspection type is used to indicate the article feature type corresponding to the article to be inspected under the quality inspection grouping. For example, in the case where the quality inspection grouping indicates that the article to be inspected is defective, the quality inspection type indicates the defect type of the article to be inspected; in the case where the quality inspection grouping indicates that the article to be inspected is not defective, the quality inspection type indicates the cause type for which the article to be inspected is not defective.
Further, for the description of the type prediction network and the relationship between the type prediction network and the packet prediction network, please refer to the above embodiments of the training method of the quality inspection model, which are not described herein.
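The two-stage inference of steps 610 to 630 can be sketched schematically as follows. The networks are stubbed with placeholder callables purely for illustration; in practice they would be the trained grouping and type prediction networks, and all names here are assumptions.

```python
def quality_inspect(image, group_net, type_net):
    """Step 620: predict the quality inspection grouping; step 630: predict
    the quality inspection type within that grouping."""
    grouping = group_net(image)        # e.g. 'NG' (defective) or 'OK'
    kind = type_net(image, grouping)   # feature type under that grouping
    return grouping, kind

# Placeholder stubs standing in for trained networks (illustrative only):
group_net = lambda img: "NG" if "scratch" in img else "OK"
type_net = lambda img, grp: "scratch" if grp == "NG" else "clean-surface"
```

Calling `quality_inspect("scratch_on_panel.png", group_net, type_net)` with these stubs returns `("NG", "scratch")`.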
In summary, according to the method provided by this embodiment, the input image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the input image is predicted from two dimensions: whether the article has a defect, and the article feature type. The dimensions along which the input image is processed are thereby expanded, the information in the input image is fully mined and utilized, and the accuracy of quality inspection of the article to be inspected is improved.
FIG. 14 is a flowchart illustrating a method of using an article quality inspection model provided by an exemplary embodiment of the present application. The method may be performed by a computer device. That is, on the basis of the embodiment shown in FIG. 13, the method further includes step 632 and step 634:
step 632: acquiring a history defect probability corresponding to a history image;
the to-be-inspected articles corresponding to the input images and the historical articles corresponding to the historical images have the same article type;
The historical defect probability is obtained in the case where the quality inspection grouping indicates that the input image has a defect, and there is an association between the historical image corresponding to the defect probability information obtained in this step and the input image. Further, the historical defect probability is the probability that a defect was detected in the historical images, and may specifically be the probability that a defect of the same type as the quality inspection type was detected in the historical images.
The article to be inspected corresponding to the input image and the historical article corresponding to the historical image have the same article type; for example, at least one of the article model, appearance, size, material, and processing technology of the article to be inspected and the historical article is the same. In one example, the article to be inspected and the historical article are industrial products of the same model obtained using the same processing technology. Further, the position of the input image on the article to be inspected and the position of the historical image on the historical article have an overlapping area. Further, the proportion of the overlapping area in the input image exceeds a preset ratio, the value of which is a preset value greater than 0 and not exceeding 1.
In an alternative implementation, this step may be implemented as:
Acquiring a history image corresponding to an input image;
determining the proportion of historical defect images among the historical images as the historical defect probability;
illustratively, the history image includes a history defect image with a defect and a history sound image without a defect. The historical quality inspection group corresponding to the historical defect image indicates that the historical article has defects, and the historical quality inspection group corresponding to the historical sound image indicates that the historical article has no defects.
Illustratively, the input image is a region of interest in an image to be inspected, the image to be inspected being used to describe global image information of the item to be inspected. Illustratively, the location of the input image on the item to be inspected and the location of the history image on the history item have overlapping areas, and for an introduction of the existence of overlapping areas, reference is made to the introduction above.
Step 634: correcting the defect group of the input image into a defect group for indicating that the input image has no defect under the condition that the confidence of the defect label indication is smaller than the historical defect probability;
Illustratively, the confidence of the defect label indication is predicted by the type prediction network in the article quality inspection model. Correcting the defect grouping of the input image to a grouping indicating that the input image has no defect ensures the confidence of defect detection at low-frequency defect positions, and thereby the quality inspection accuracy.
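Steps 632 and 634 can be sketched as below: the historical defect probability is the fraction of associated historical images that were defective, and a predicted defect whose confidence falls below that probability is corrected to the no-defect grouping. All names, the group encoding, and the history representation are illustrative assumptions.

```python
def historical_defect_probability(history: list) -> float:
    """Fraction of historical images at this location labeled defective."""
    return sum(1 for h in history if h == "NG") / len(history)

def correct_grouping(predicted_group: str, confidence: float,
                     history: list) -> str:
    """Step 634: if the defect confidence is below the historical defect
    probability, correct the grouping to indicate no defect."""
    if predicted_group == "NG" and confidence < historical_defect_probability(history):
        return "OK"
    return predicted_group
```

With a history of one defect in four images (probability 0.25), a defect predicted with confidence 0.1 is corrected to "OK", while one predicted with confidence 0.9 is kept.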
In summary, in the method provided by this embodiment, the input image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the input image is predicted from two dimensions: whether the article has a defect, and the article feature type. In the case where the confidence of the defect label indication is smaller than the historical defect probability, the defect grouping of the input image is corrected to a grouping indicating that the input image has no defect, ensuring the confidence of defect detection at low-frequency defect positions. The dimensions along which the input image is processed are thereby expanded, the information in the input image is fully mined and utilized, and the accuracy of quality inspection of the article to be inspected is improved.
Next, the history defect image and the history defect probability are further described.
In one implementation, on the basis of the embodiment shown in fig. 14, the following two steps are further included:
step 11: searching the associated image of the history defect image in the history sound image;
For example, the associated image and the historical defect image present image information of the same object from different angles. It should be noted that the associated image and the historical defect image correspond to the same article, and present the image information of the historical article from different angles, such as different scanning angles. Illustratively, there is an overlapping area between the associated image and the historical defect image; for an introduction to the overlapping area, and optionally to the proportion of the overlapping area in the historical defect image, refer to the introduction in step 632.
Step 12: updating label information of the associated image, wherein a history quality inspection group corresponding to the associated image indicates that a history article has a defect;
the label information of the associated image comprises quality inspection groups and quality inspection types of the associated image, the quality inspection groups and the quality inspection types of the associated image are updated to be the same labels as those of the historical defect images, and after the quality inspection groups and the quality inspection types are updated, the historical quality inspection groups corresponding to the associated image indicate that the historical articles have defects.
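Steps 11 and 12 can be sketched as label propagation across overlapping views of the same article. This is an illustrative sketch only: the record fields, the use of axis-aligned boxes with an intersection-over-union overlap test, and the threshold value are all assumptions rather than the patent's method.

```python
def propagate_labels(defect_rec: dict, sound_recs: list, iou_threshold=0.3):
    """Step 11: find sound images overlapping the defect image on the same
    article; step 12: copy the defect image's group and type labels onto them."""
    def iou(a, b):  # overlap ratio of two axis-aligned boxes (x1, y1, x2, y2)
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
        return inter / union if union else 0.0

    for rec in sound_recs:
        if rec["item"] == defect_rec["item"] and \
           iou(rec["box"], defect_rec["box"]) >= iou_threshold:
            rec["group"] = defect_rec["group"]  # step 12: update label info
            rec["type"] = defect_rec["type"]
```

A sound image of the same article whose box overlaps the defect image's box sufficiently inherits the defect labels; images of other articles are left untouched.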
FIG. 15 illustrates a schematic view of historical defect images and associated images provided by an exemplary embodiment of the present application. The global image 750 of the historical article indicates the image information of the historical article. The first associated image 752a is the associated image corresponding to the first historical defect image 752b; both images are obtained by scanning, but their scanning directions differ: the scanning direction of the first associated image 752a is the first direction 750a, and the scanning direction of the first historical defect image 752b is the second direction 750b. Similarly, the second associated image 754a is the associated image corresponding to the second historical defect image 754b, and the third associated image 756a is the associated image corresponding to the third historical defect image 756b. Illustratively, the first associated image 752a and the first historical defect image 752b are located in the upper right corner region of the global image 750, the second associated image 754a and the second historical defect image 754b are located in the middle region of the global image 750, and the third associated image 756a and the third historical defect image 756b are located in the lower left corner region of the global image 750.
In summary, in the method provided by this embodiment, the input image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the input image is predicted from two dimensions: whether the article has a defect, and the article feature type. By updating the label information of the associated images, the marks of the historical quality inspection grouping and the historical quality inspection type are supplemented in the historical images. The dimensions along which the input image is processed are thereby expanded, the information in the input image is fully mined and utilized, and the accuracy of quality inspection of the article to be inspected is improved.
In one implementation, on the basis of the embodiment shown in fig. 14, the method further includes the following steps:
step 21: smoothing the first defect probability of the first pixel according to the adjacent defect probability of the adjacent pixel on the periphery of the first pixel so as to update the first defect probability;
illustratively, the historical defect probabilities are used to indicate historical defect probability information for at least one pixel point in the input image, and a first defect probability of the historical defect probabilities is used to indicate a historical defect probability for a first pixel point in the input image.
Illustratively, the adjacent pixel points are at least one pixel point on the periphery of the first pixel point whose distance from the first pixel point does not exceed a preset distance. The adjacent defect probability indicates the historical defect probability of an adjacent pixel point, and may be information within the historical defect probability of the input image, or information within the historical defect probabilities corresponding to input images other than this input image.
Illustratively, the present embodiment does not limit the specific manner of smoothing processing, and in one example, an average value of the first defect probability and the adjacent defect probability is determined as the updated first defect probability.
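The smoothing of step 21 can be sketched on a probability grid, using the averaging rule from the example above (each pixel's probability replaced by the mean of itself and its neighbours). The 4-connected neighbourhood and the grid representation are illustrative assumptions.

```python
def smooth(grid):
    """Replace each probability by the mean of itself and its in-bounds
    4-connected neighbours (one example of the smoothing in step 21)."""
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for i in range(h):
        for j in range(w):
            vals = [grid[i][j]]
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w:
                    vals.append(grid[ni][nj])
            out[i][j] = sum(vals) / len(vals)
    return out
```

An isolated spike of probability 1.0 surrounded by zeros is pulled down toward its neighbours, e.g. to 0.25 when it has three in-bounds zero neighbours.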
In summary, in the method provided by this embodiment, the input image is processed through the grouping prediction network and the type prediction network in the article quality inspection model, and the input image is predicted from two dimensions: whether the article has a defect, and the article feature type. Through smoothing, the first defect probability within the historical defect probability is updated with reference to the surrounding adjacent pixel points. The dimensions along which the input image is processed are thereby expanded, the information in the input image is fully mined and utilized, and the accuracy of quality inspection of the article to be inspected is improved.
In one example, the item quality inspection model includes a feature extraction network, a quality inspection network.
A sample image is acquired, and a residual network (ResNet50) is used as the feature extraction network to perform feature extraction on the sample image; the extracted image feature information of the sample image carries the semantic information contained in the sample image.
In an industrial quality inspection scenario, the quality inspection network detects a number of different defect categories. The output of the quality inspection network includes real defect categories, i.e., the NG group, which are typically defect categories that are intolerable in industrial production. The output of the quality inspection network also includes the OK group, whose categories are not real defects but often have appearance characteristics very similar to defects.
In an actual industrial scene, different NG classes have a certain similarity to one another, and the actual industrial scene has low requirements on the ability to identify the specific defect class of an industrial article (for example, training is required to focus on distinguishing whether the service performance of the industrial article is qualified, so as to reject industrial articles that fail quality inspection).
In this embodiment, the quality inspection network outputs the predicted grouping and the predicted type of the industrial article, and the training of the article quality inspection model in this embodiment is performed according to a grouping loss constraint and a type loss constraint.
Specifically, assume that there are M NG types and N OK types in total. For an NG sample, when it is misclassified as another type within the NG group, the cost weight configured by the type loss constraint is 1/(M-1); when it is misclassified into the OK group, the cost weight configured by the grouping loss constraint is 1. Similarly, for an OK sample, when it is misclassified as another type within the OK group, the cost weight configured by the type loss constraint is 1/(N-1); when it is misclassified into the NG group, the cost weight configured by the grouping loss constraint is 1. The cost weights configured by the grouping loss constraint and the type loss constraint for a correct grouping and classification are also set to 1. A cost table C is obtained according to this flow and used as weights to guide the article quality inspection model to focus training on distinguishing the OK group from the NG group.
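The cost table C from the paragraph above can be constructed mechanically. This is a sketch under an assumed indexing convention (NG types first, then OK types); the weight values follow the text exactly: 1/(M-1) and 1/(N-1) for within-group confusions, 1 for cross-group confusions, and 1 for correct predictions.

```python
def build_cost_table(M: int, N: int):
    """C[i][j]: cost weight for true type i predicted as type j.
    Types 0..M-1 are NG types; types M..M+N-1 are OK types."""
    total = M + N
    C = [[0.0] * total for _ in range(total)]
    for i in range(total):
        for j in range(total):
            i_ng, j_ng = i < M, j < M
            if i == j:
                C[i][j] = 1.0                  # correct grouping/classification
            elif i_ng != j_ng:
                C[i][j] = 1.0                  # cross-group confusion
            else:                              # within-group confusion
                C[i][j] = 1.0 / ((M if i_ng else N) - 1)
    return C
```

With M=3 NG types and N=2 OK types, confusing two NG types costs 1/2, while any NG-versus-OK confusion carries the full weight 1, steering training toward the OK/NG distinction.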
It will be appreciated by those skilled in the art that the foregoing embodiments may be implemented independently, or the foregoing embodiments may be freely combined to form new embodiments to implement the training method of the quality inspection model of the article and/or the use method of the quality inspection model of the article.
FIG. 16 shows a block diagram of a training apparatus for an item quality inspection model, according to an exemplary embodiment of the present application. The device comprises:
An obtaining module 810, configured to obtain a sample image and a defect label of the sample image, where the sample image is used to describe image information of a sample article, the defect label includes a group label and a type label, the group label is used to indicate whether the sample article has a defect, and the type label is used to indicate an article feature type corresponding to the sample article under the group label;
the processing module 820 is configured to invoke a packet prediction network in the article quality inspection model to perform prediction processing on the sample image, so as to obtain a prediction packet of the sample image, where the prediction packet is a prediction result of whether the sample article has a defect;
the processing module 820 is further configured to invoke a type prediction network in the item quality inspection model to perform prediction processing on the sample image, so as to obtain a prediction type of the sample image, where the prediction type is a prediction result of an item feature type of the sample item;
the training module 830 is configured to train the quality inspection model according to the packet error between the packet label and the predicted packet, and the type error between the type label and the predicted type, so as to obtain a trained quality inspection model of the article.
In an alternative design of the present application, the processing module 820 is further configured to:
invoking the type prediction network, and performing prediction processing on the sample image based on the prediction group to obtain the prediction type of the sample image belonging to the prediction group, wherein the prediction type is used for indicating the article characteristic type corresponding to the sample article in the prediction group;
wherein the prediction type is used for indicating a defect type of the sample article when the prediction group indicates that the sample article is defective, and is used for indicating a cause type of the sample article that is not defective when the prediction group indicates that the sample article is not defective.
In an alternative design of the present application, the packet prediction network includes a feature extraction sub-network and a label prediction sub-network;
the processing module 820 is further configured to:
invoking the feature extraction sub-network to perform feature extraction processing on the sample image to obtain image feature information of the sample image;
invoking the label prediction sub-network to perform prediction processing on the image characteristic information to obtain the prediction group of the sample image;
And calling the type prediction network, and performing prediction processing on the image characteristic information based on the prediction packet to obtain the prediction type of the prediction packet to which the sample image belongs.
In an alternative design of the present application, the training module 830 is further configured to:
configuring a first cost weight for the distinction between the grouping label and the predicted grouping to determine the grouping error, and configuring a second cost weight for the distinction between the type label and the predicted type to determine the type error, wherein the first cost weight and the second cost weight are different;
and training the article quality inspection model according to the grouping error and the type error to obtain the trained article quality inspection model.
In an alternative design of the present application, the training module 830 is further configured to:
calculating a packet difference between the packet label and the predicted packet, and calculating a type difference between the type label and the predicted type;
determining the product of the group distinction and the first cost weight as the group error;
and determining the product of the type difference and the second cost weight as the type error.
In an alternative design of the present application, the processing module 820 is further configured to:
determining the second cost weight according to the number of types of the type labels within the grouping to which the sample image belongs, wherein the second cost weight is inversely related to the number of first type labels;
the grouping of the sample images is indicated by the grouping label or the prediction grouping of the sample images.
In an alternative design of the present application, the processing module 820 is further configured to:
determining the second cost weight according to the correlation information between the type label and the prediction type under the condition that the type label and the prediction type are different;
the correlation information is used for indicating the similarity degree between the type label corresponding to the sample image and the prediction type, and the second cost weight and the similarity degree are in positive correlation.
In an alternative design of the present application, the processing module 820 is further configured to:
and adding supplementary weight for the second cost weight under the condition that the packet label and the attribution packet corresponding to the prediction type are different.
In an alternative design of the application, the first cost weight is greater than the second cost weight.
In an optional design of the application, the acquiring module 810 is further configured to acquire a global image, where the global image and the sample image form a sample image group, and the sample image is a region of interest marked in the global image;
the processing module 820 is further configured to invoke a region prediction network in the quality inspection model, and perform prediction processing on the global image to obtain a predicted region of interest in the global image;
wherein the distinction between the predicted region of interest and the sample image is used to train the item quality inspection model.
In an optional design of the present application, the obtaining module 810 is further configured to obtain a sample size of the type tag of the sample image in a sample library;
the processing module 820 is further configured to, if the sample size does not exceed a reference number, perform augmentation processing on a first sample information set corresponding to the type tag in the sample library, and increase the number of the first sample information set to the reference number;
and/or, in the case that the sample size exceeds the reference number, sampling the first sample information group corresponding to the type tag in the sample library, and reducing the number of the first sample information group to the reference number;
Wherein the sample image and the defect label of the sample image are obtained from the sample library.
FIG. 17 shows a block diagram of an apparatus for using an article quality inspection model according to an exemplary embodiment of the present application. The article quality inspection model includes a grouping prediction network and a type prediction network, and the apparatus comprises:
an obtaining module 840, configured to obtain an input image, where the input image is an image to be predicted for which defect type prediction is required;
a processing module 850, configured to invoke the grouping prediction network in the article quality inspection model to perform prediction processing on the input image, so as to obtain a quality inspection grouping of the input image;
the processing module 850 is further configured to invoke the type prediction network in the article quality inspection model to perform prediction processing on the input image, so as to obtain the quality inspection type of the input image.
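A minimal sketch of this two-stage inference, with the two networks passed in as plain callables; their signatures and the label values are assumptions for illustration, not the patent's API:

```python
def run_quality_inspection(input_image, grouping_net, type_net):
    """First the grouping network decides whether the article is defective,
    then the type network predicts the feature type within that grouping."""
    quality_grouping = grouping_net(input_image)
    quality_type = type_net(input_image, quality_grouping)
    return quality_grouping, quality_type
```

In a real embodiment both callables would be trained networks, and the type network could reuse the image feature information produced by the grouping network's feature extraction sub-network.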
In an optional design of the present application, the obtaining module 840 is further configured to obtain a historical defect probability corresponding to a historical image, where an article to be inspected corresponding to the input image and a historical article corresponding to the historical image have the same article type;
the processing module 850 is further configured to, in a case that the confidence indicated by the defect label is smaller than the historical defect probability, correct the defect grouping of the input image to a defect grouping indicating that the input image has no defect.
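The correction reads as a simple prior check against the historical defect probability; a sketch, with label strings chosen only for illustration:

```python
def correct_defect_grouping(predicted_grouping, confidence, historical_defect_probability):
    """Fall back to 'no defect' when a 'defective' prediction is less
    confident than the historical defect probability of articles of the
    same article type; otherwise keep the prediction as-is."""
    if predicted_grouping == "defective" and confidence < historical_defect_probability:
        return "no_defect"
    return predicted_grouping
```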
In an alternative design of the present application, the obtaining module 840 is further configured to:
acquiring the historical image corresponding to the input image, wherein the historical image comprises a historical defect image with a defect and a historical sound image without a defect;
determining a ratio of the historical defect images among the historical images as the historical defect probability;
the input image is a region of interest in an image to be inspected, the position of the input image on the article to be inspected and the position of the historical image on the historical article have overlapping regions, and the image to be inspected is used for describing global image information of the article to be inspected.
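The ratio computation can be sketched directly; the (image_id, has_defect) record layout is an assumption made for the example:

```python
def historical_defect_probability(history_records):
    """Ratio of defective historical images among all historical images
    covering the overlapping region on the historical article."""
    if not history_records:
        return 0.0
    defective = sum(1 for _, has_defect in history_records if has_defect)
    return defective / len(history_records)
```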
In an alternative design of the present application, the processing module 850 is further configured to:
searching the historical sound images for an associated image of a historical defect image, wherein the associated image and the historical defect image are image information of the same object presented from different angles, and an overlapping area exists between the associated image and the historical defect image;
and updating label information of the associated image, wherein a historical quality inspection group corresponding to the associated image indicates that the historical article has defects.
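Illustratively, this label propagation to other-angle views can be sketched as below; the overlap index and record layout are hypothetical:

```python
def propagate_defect_labels(images, associated_views):
    """For each defective historical image, mark its associated images
    (other-angle views overlapping the same region) as defective too,
    so the historical quality inspection grouping stays consistent."""
    for image_id, view_ids in associated_views.items():
        if images[image_id]["defect"]:
            for view_id in view_ids:
                images[view_id]["defect"] = True
    return images
```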
In an alternative design of the present application, a first defect probability of the historical defect probabilities is used to indicate a historical defect probability of a first pixel point in the input image; the processing module 850 is further configured to:
smoothing the first defect probability according to the adjacent defect probability of adjacent pixel points at the periphery of the first pixel point, so as to update the first defect probability;
the adjacent defect probability is used for indicating the historical defect probability of the adjacent pixel point.
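One simple realization of this smoothing is a box filter over the neighbouring pixels' historical defect probabilities; the window radius and the list-of-rows layout are assumptions for the sketch:

```python
def smooth_defect_probability(prob_map, x, y, radius=1):
    """Replace the first pixel's historical defect probability with the
    mean over a (2*radius+1)^2 neighbourhood, clipped at image borders.
    prob_map is a list of rows; (x, y) are column and row indices."""
    height, width = len(prob_map), len(prob_map[0])
    values = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < height and 0 <= nx < width:
                values.append(prob_map[ny][nx])
    return sum(values) / len(values)
```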
It should be noted that the division into functional modules in the apparatus provided by the foregoing embodiment is merely an example; in practical applications, the foregoing functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above.
With respect to the apparatus in the above embodiments, the specific manner in which the respective modules perform the operations has been described in detail in the embodiments regarding the method; the technical effects achieved by the execution of the operations by the respective modules are the same as those in the embodiments related to the method, and will not be described in detail herein.
The embodiment of the application also provides a computer device, including a processor and a memory, where the memory stores a computer program; the processor is configured to execute the computer program in the memory to implement the training method of the article quality inspection model and/or the method of using the article quality inspection model provided by the above method embodiments.
Optionally, the computer device is a server. Illustratively, fig. 18 is a block diagram of a server provided by an exemplary embodiment of the present application.
In general, the server 2300 includes: a processor 2301 and a memory 2302.
The processor 2301 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 2301 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). The processor 2301 may also include a main processor and a coprocessor: the main processor is a processor for processing data in an awake state, also referred to as a central processing unit (Central Processing Unit, CPU); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 2301 may be integrated with a graphics processing unit (Graphics Processing Unit, GPU) responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 2301 may also include an artificial intelligence (Artificial Intelligence, AI) processor for processing computing operations related to machine learning.
Memory 2302 may include one or more computer-readable storage media, which may be non-transitory. Memory 2302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 2302 is used to store at least one instruction for execution by processor 2301 to implement the method of training the quality inspection model of an item and/or the method of using the quality inspection model of an item provided by a method embodiment of the present application.
In some embodiments, server 2300 may further optionally include: an input interface 2303 and an output interface 2304. The processor 2301 and the memory 2302 may be connected to the input interface 2303 and the output interface 2304 through buses or signal lines. The respective peripheral devices may be connected to the input interface 2303 and the output interface 2304 through buses, signal lines, or a circuit board. Input interface 2303, output interface 2304 may be used to connect at least one Input/Output (I/O) related peripheral device to processor 2301 and memory 2302. In some embodiments, the processor 2301, memory 2302, and input interface 2303, output interface 2304 are integrated on the same chip or circuit board; in some other embodiments, the processor 2301, the memory 2302, and either or both of the input interface 2303 and the output interface 2304 may be implemented on separate chips or circuit boards, as embodiments of the application are not limited in this respect.
Those skilled in the art will appreciate that the structures shown above are not limiting of server 2300 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
In an exemplary embodiment, a chip is also provided, which includes programmable logic circuits and/or program instructions, for implementing the training method of the article quality inspection model and/or the use method of the article quality inspection model described in the above aspects when the chip is run on a computer device.
In an exemplary embodiment, a computer program product is also provided. The computer program product includes computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so as to implement the training method of the article quality inspection model and/or the method of using the article quality inspection model provided by the above method embodiments.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which a computer program is stored, where the computer program is loaded and executed by a processor to implement the training method of the article quality inspection model and/or the method of using the article quality inspection model provided by the above method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The foregoing description of the preferred embodiments of the present application is not intended to limit the application; the scope of protection of the present application is defined by the appended claims.

Claims (20)

1. A method of training a quality inspection model of an article, the method comprising:
acquiring a sample image and a defect label of the sample image, wherein the sample image is used for describing image information of a sample article, the defect label comprises a grouping label and a type label, the grouping label is used for indicating whether the sample article has a defect, and the type label is used for indicating an article characteristic type corresponding to the sample article under the grouping label;
invoking a grouping prediction network in the article quality inspection model to predict the sample image to obtain a prediction grouping of the sample image, wherein the prediction grouping is a prediction result of whether the sample article has defects;
invoking a type prediction network in the article quality inspection model to predict the sample image to obtain a prediction type of the sample image, wherein the prediction type is a prediction result of an article characteristic type of the sample article;
and training the article quality inspection model according to a grouping error between the grouping label and the prediction grouping and a type error between the type label and the prediction type, to obtain a trained article quality inspection model.
2. The method according to claim 1, wherein the invoking a type prediction network in the article quality inspection model to perform prediction processing on the sample image to obtain a prediction type of the sample image comprises:
invoking the type prediction network to perform prediction processing on the sample image based on the prediction grouping, to obtain the prediction type of the sample image belonging to the prediction grouping, wherein the prediction type is used for indicating the article characteristic type corresponding to the sample article in the prediction grouping;
wherein the prediction type is used for indicating a defect type of the sample article in a case that the prediction grouping indicates that the sample article is defective, and is used for indicating a cause type for which the sample article is not defective in a case that the prediction grouping indicates that the sample article is not defective.
3. The method according to claim 2, wherein the grouping prediction network comprises a feature extraction sub-network and a label prediction sub-network;
the invoking a grouping prediction network in the article quality inspection model to perform prediction processing on the sample image to obtain a prediction grouping of the sample image comprises:
invoking the feature extraction sub-network to perform feature extraction processing on the sample image to obtain image feature information of the sample image;
invoking the label prediction sub-network to perform prediction processing on the image feature information to obtain the prediction grouping of the sample image;
the invoking the type prediction network to perform prediction processing on the sample image based on the prediction grouping to obtain the prediction type of the sample image belonging to the prediction grouping comprises:
invoking the type prediction network to perform prediction processing on the image feature information based on the prediction grouping to obtain the prediction type of the prediction grouping to which the sample image belongs.
4. The method according to any one of claims 1 to 3, wherein the training the article quality inspection model according to the grouping error between the grouping label and the prediction grouping and the type error between the type label and the prediction type comprises:
configuring a first cost weight for the difference between the grouping label and the prediction grouping to determine the grouping error, and configuring a second cost weight for the difference between the type label and the prediction type to determine the type error, wherein the first cost weight and the second cost weight are different;
and training the article quality inspection model according to the grouping error and the type error to obtain the trained article quality inspection model.
5. The method according to claim 4, wherein the configuring a first cost weight for the difference between the grouping label and the prediction grouping to determine the grouping error, and configuring a second cost weight for the difference between the type label and the prediction type to determine the type error comprises:
calculating a grouping difference between the grouping label and the prediction grouping, and calculating a type difference between the type label and the prediction type;
determining a product of the grouping difference and the first cost weight as the grouping error;
and determining a product of the type difference and the second cost weight as the type error.
6. The method according to claim 4, wherein the method further comprises:
determining the second cost weight according to a number of first type labels in the grouping of the sample image, the first type label being the type label of the sample image, wherein the second cost weight is in an inverse correlation with the number of the first type labels;
the grouping of the sample image is indicated by the grouping label or the prediction grouping of the sample image.
7. The method according to claim 4, wherein the method further comprises:
determining the second cost weight according to the correlation information between the type label and the prediction type under the condition that the type label and the prediction type are different;
the correlation information is used for indicating the similarity degree between the type label corresponding to the sample image and the prediction type, and the second cost weight and the similarity degree are in positive correlation.
8. The method of claim 7, wherein the method further comprises:
and adding a supplementary weight to the second cost weight in a case that the grouping label differs from the grouping to which the prediction type belongs.
9. The method of claim 4, wherein the first cost weight is greater than the second cost weight.
10. A method according to any one of claims 1 to 3, wherein the method further comprises:
acquiring a global image, wherein the global image and the sample image form a sample image group, and the sample image is a region of interest marked in the global image;
invoking a region prediction network in the article quality inspection model, and performing prediction processing on the global image to obtain a predicted region of interest in the global image;
wherein the difference between the predicted region of interest and the sample image is used to train the article quality inspection model.
11. A method according to any one of claims 1 to 3, wherein the method further comprises:
acquiring a sample size of the type label of the sample image in a sample library;
in a case that the sample size does not exceed a reference number, performing augmentation processing on a first sample information group corresponding to the type label in the sample library to increase the number of the first sample information group to the reference number;
and/or, in a case that the sample size exceeds the reference number, performing sampling processing on the first sample information group corresponding to the type label in the sample library to reduce the number of the first sample information group to the reference number;
wherein the sample image and the defect label of the sample image are obtained from the sample library.
12. A method of using an article quality inspection model, the article quality inspection model comprising a grouping prediction network and a type prediction network, the method comprising:
acquiring an input image, wherein the input image is an image to be predicted for which defect type prediction is required;
invoking the grouping prediction network in the article quality inspection model to perform prediction processing on the input image to obtain a quality inspection grouping of the input image;
and invoking the type prediction network in the article quality inspection model to perform prediction processing on the input image to obtain the quality inspection type of the input image.
13. The method according to claim 12, wherein the method further comprises:
acquiring a historical defect probability corresponding to a historical image, wherein an article to be inspected corresponding to the input image and a historical article corresponding to the historical image have the same article type;
and in a case that the confidence indicated by the defect label is smaller than the historical defect probability, correcting the defect grouping of the input image to a defect grouping indicating that the input image has no defect.
14. The method of claim 13, wherein the obtaining the historical defect probabilities corresponding to the historical images comprises:
acquiring the historical image corresponding to the input image, wherein the historical image comprises a historical defect image with a defect and a historical sound image without a defect;
determining a ratio of the historical defect images among the historical images as the historical defect probability;
the input image is a region of interest in an image to be inspected, the position of the input image on the article to be inspected and the position of the historical image on the historical article have overlapping regions, and the image to be inspected is used for describing global image information of the article to be inspected.
15. The method of claim 14, wherein the method further comprises:
searching the historical sound images for an associated image of a historical defect image, wherein the associated image and the historical defect image are image information of the same object presented from different angles, and an overlapping area exists between the associated image and the historical defect image;
and updating label information of the associated image, wherein a historical quality inspection group corresponding to the associated image indicates that the historical article has defects.
16. The method of claim 14, wherein a first defect probability of the historical defect probabilities is used to indicate a historical defect probability of a first pixel point in the input image; the method further comprises the steps of:
smoothing the first defect probability of the first pixel according to the adjacent defect probability of the adjacent pixel on the periphery of the first pixel so as to update the first defect probability;
The adjacent defect probability is used for indicating the historical defect probability of the adjacent pixel point.
17. A training device for a quality inspection model of an article, the device comprising:
an acquisition module, configured to acquire a sample image and a defect label of the sample image, wherein the sample image is used for describing image information of a sample article, the defect label comprises a grouping label and a type label, the grouping label is used for indicating whether the sample article has a defect, and the type label is used for indicating an article characteristic type corresponding to the sample article under the grouping label;
the processing module is used for calling a grouping prediction network in the article quality inspection model to conduct prediction processing on the sample image to obtain a prediction grouping of the sample image, wherein the prediction grouping is a prediction result of whether the sample article has defects or not;
the processing module is further used for calling a type prediction network in the article quality inspection model to conduct prediction processing on the sample image to obtain a prediction type of the sample image, wherein the prediction type is a prediction result of article characteristic types of the sample article;
and the training module is used for training the article quality inspection model according to the grouping errors between the grouping labels and the prediction groups and the type errors between the type labels and the prediction types to obtain a trained article quality inspection model.
18. An apparatus for using an article quality inspection model, the article quality inspection model comprising a grouping prediction network and a type prediction network, the apparatus comprising:
an acquisition module, configured to acquire an input image, wherein the input image is an image to be predicted for which defect type prediction is required;
a processing module, configured to invoke the grouping prediction network in the article quality inspection model to perform prediction processing on the input image, so as to obtain a quality inspection grouping of the input image;
the processing module is further configured to invoke the type prediction network in the article quality inspection model to perform prediction processing on the input image, so as to obtain the quality inspection type of the input image.
19. A computer device, comprising: a processor and a memory, wherein at least one program is stored in the memory; the processor is configured to execute the at least one program in the memory to implement the training method of the article quality inspection model according to any one of claims 1 to 11 and/or the method of using the article quality inspection model according to any one of claims 12 to 16.
20. A computer-readable storage medium having stored therein executable instructions that are loaded and executed by a processor to implement the training method of the article quality inspection model according to any one of claims 1 to 11 and/or the method of using the article quality inspection model according to any one of claims 12 to 16.
CN202310489070.7A 2023-04-28 2023-04-28 Model training method, using method, device, equipment and storage medium Pending CN117218476A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310489070.7A CN117218476A (en) 2023-04-28 2023-04-28 Model training method, using method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117218476A true CN117218476A (en) 2023-12-12

Family

ID=89039539

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310489070.7A Pending CN117218476A (en) 2023-04-28 2023-04-28 Model training method, using method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117218476A (en)

Similar Documents

Publication Publication Date Title
CN109447169B (en) Image processing method, training method and device of model thereof and electronic system
CN110060237B (en) Fault detection method, device, equipment and system
CN111444921A (en) Scratch defect detection method and device, computing equipment and storage medium
CN108154508A (en) Method, apparatus, storage medium and the terminal device of product defects detection positioning
CN105574550A (en) Vehicle identification method and device
US11715190B2 (en) Inspection system, image discrimination system, discrimination system, discriminator generation system, and learning data generation device
CN110264444B (en) Damage detection method and device based on weak segmentation
CN111242899B (en) Image-based flaw detection method and computer-readable storage medium
CN112348787A (en) Training method of object defect detection model, object defect detection method and device
CN111368636A (en) Object classification method and device, computer equipment and storage medium
CN112700442A (en) Die-cutting machine workpiece defect detection method and system based on Faster R-CNN
CN116258707A (en) PCB surface defect detection method based on improved YOLOv5 algorithm
CN111340796A (en) Defect detection method and device, electronic equipment and storage medium
CN111414878B (en) Social attribute analysis and image processing method and device for land parcels
CN113111875A (en) Seamless steel rail weld defect identification device and method based on deep learning
CN114639102A (en) Cell segmentation method and device based on key point and size regression
CN114693963A (en) Recognition model training and recognition method and device based on electric power data feature extraction
CN117152484B (en) Small target cloth flaw detection method based on improved YOLOv5s
CN113780145A (en) Sperm morphology detection method, sperm morphology detection device, computer equipment and storage medium
CN112287905A (en) Vehicle damage identification method, device, equipment and storage medium
JP3749726B1 (en) Low contrast defect inspection method under periodic noise, low contrast defect inspection method under repeated pattern
JP7059889B2 (en) Learning device, image generator, learning method, and learning program
CN116823793A (en) Device defect detection method, device, electronic device and readable storage medium
CN110889418A (en) Gas contour identification method
CN116245882A (en) Circuit board electronic element detection method and device and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination