WO2019176990A1 - Inspection device, image identification device, identification device, inspection method, and inspection program - Google Patents


Info

Publication number
WO2019176990A1
Authority
WO
WIPO (PCT)
Prior art keywords
discriminator
output value
product
learning
data
Prior art date
Application number
PCT/JP2019/010180
Other languages
English (en)
Japanese (ja)
Inventor
Shinji Kurita (栗田 真嗣)
Original Assignee
OMRON Corporation (オムロン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2019176990A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present invention relates to an inspection device, an image identification device, an identification device, an inspection method, and an inspection program.
  • Patent Document 1 proposes an inspection apparatus that determines whether the inspection object shown in an image is normal or abnormal using a trained first neural network and, when the object is determined to be abnormal, classifies the type of abnormality using a trained second neural network.
  • The present inventor found the following problems in the conventional technique of determining the quality of a product from image data using a discriminator such as a trained neural network, as in Patent Document 1.
  • the discriminator is constructed so as to discriminate the quality of the product shown in the pre-collected image data, and therefore cannot cope with an unknown case such as a new type of defect. Therefore, in order to deal with an unknown case, additional image data in which the unknown case is captured is collected, and re-learning or additional learning of the discriminator is performed using the collected additional image data. As a result, the updated discriminator can cope with an unknown case learned by the additional image data.
  • However, the discrimination performance of the discriminator may deteriorate as a side effect of learning to discriminate such special cases.
  • In AI technology such as neural networks, over-learning (overfitting) may occur during re-learning or additional learning, degrading the discrimination performance of the discriminator.
  • When the discrimination performance of the discriminator deteriorates due to re-learning or additional learning, the reliability of the product quality determination by the discriminator is impaired. The present inventor found this problem.
  • This problem is not unique to the scene of judging product quality. A similar problem may occur in any scene where a discriminator is updated by re-learning or additional learning, such as a scene where some state of a subject is identified from image data, or a scene where a feature is identified from data other than image data. That is, re-learning or additional learning of the discriminator is performed using collected additional data so that the updated discriminator can cope with an unknown case covered by the additional data.
  • As a result, however, the discriminator's performance in identifying the features appearing in the target data may deteriorate.
  • In one aspect, the present invention has been made in view of this situation, and its purpose is to provide a technique for preventing the reliability of determinations made with a discriminator from being impaired even when the discrimination performance of the discriminator deteriorates due to re-learning or additional learning.
  • the present invention adopts the following configuration in order to solve the above-described problems.
  • An inspection apparatus according to one aspect of the present invention is an inspection apparatus that inspects the quality of a product, and includes: a data acquisition unit that acquires target image data in which the product to be inspected is captured; a determination unit that determines the quality of the product shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning the quality of the product, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning the quality of the product; and an output unit that outputs the result of determining the quality of the product.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the quality of the product is set as a first determination criterion.
  • For the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the quality of the product is set as a second determination criterion.
  • The numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • When the second output value falls within the numerical range of the second determination criterion, the determination unit determines the quality of the product based on the second output value; otherwise, it obtains the first output value from the first discriminator and determines the quality of the product based on the first output value.
  • In general, the quality of a product is judged by inputting target image data showing the product into a discriminator and comparing the output value obtained from the discriminator with a threshold.
  • Deterioration of identification performance due to over-learning manifests mainly as a high probability of erroneous pass/fail determinations near the threshold boundary.
  • Therefore, the inspection apparatus according to the above aspect holds both the first discriminator, from before re-learning or additional learning, and the second discriminator, from after re-learning or additional learning.
  • The numerical range of the second determination criterion of the second discriminator is set narrower than the numerical range of the first determination criterion of the first discriminator. In other words, outside the numerical range of the second determination criterion there is a band in which the second discriminator makes no quality determination.
  • When the second output value falls within the numerical range of the second determination criterion, the inspection apparatus judges the quality of the product based on the second output value of the second discriminator.
  • Otherwise, the inspection apparatus judges the quality of the product based on the first output value of the first discriminator.
  • That is, the inspection apparatus trusts the second discriminator to judge the quality of the product only when the second output value lies clearly within the second determination criterion; if not, it judges the quality of the product with the first discriminator, without relying on the second discriminator.
  • This makes it possible to guarantee the determination result obtained before re-learning or additional learning and to suppress fluctuations in the determination result caused by re-learning or additional learning. Therefore, according to this configuration, even when the discrimination performance of a discriminator deteriorates due to re-learning or additional learning, the reliability of quality determination using the discriminator can be prevented from being impaired.
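The fallback scheme described above can be sketched in Python. This is an illustrative sketch, not code from the patent: the discriminators are stand-in callables, and all names and values (`first_threshold`, the band `(lo, hi)`) are assumptions made for the example.

```python
def inspect(image, first_discriminator, second_discriminator,
            first_threshold, second_criterion):
    """Trust the re-learned (second) discriminator only when its output
    falls inside its narrower second determination criterion; otherwise
    fall back to the original (first) discriminator."""
    lo, hi = second_criterion  # second criterion: defective if >= hi, good if <= lo
    v2 = second_discriminator(image)
    if v2 >= hi:
        return "defective"  # clearly defective according to discriminator 2
    if v2 <= lo:
        return "good"       # clearly good according to discriminator 2
    # v2 fell in the abstention band (lo, hi): defer to the pre-update
    # discriminator, preserving the pre-update determination result.
    v1 = first_discriminator(image)
    return "defective" if v1 >= first_threshold else "good"


# Hypothetical usage with toy discriminators (constant outputs):
print(inspect(None, lambda x: 0.4, lambda x: 0.9, 0.5, (0.2, 0.8)))  # defective
print(inspect(None, lambda x: 0.6, lambda x: 0.5, 0.5, (0.2, 0.8)))  # defective (via fallback)
```

The narrower second criterion is what suppresses fluctuation: outputs near the threshold boundary, where over-learning is most likely to flip the decision, are never decided by the updated discriminator.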
  • the “product” is not particularly limited, and may be appropriately selected according to the embodiment.
  • the “product” may be, for example, a product conveyed on a production line such as an electronic component or an automobile component.
  • the electronic component is, for example, a substrate, a chip capacitor, a liquid crystal, a relay winding, or the like.
  • The automobile parts are, for example, connecting rods, shafts, engine blocks, power window switches, panels, and the like. "Determining whether or not the product is good" may simply be determining whether or not the product has a defect, or may further include identifying the type of defect in addition to determining whether or not a defect is present.
  • the defects are, for example, scratches, dirt, cracks, dents, dust, burrs, color unevenness, and the like.
  • the “discriminator” may be configured by a learning model that can acquire the ability to perform predetermined inference by machine learning, such as a neural network, a support vector machine, a self-organizing map, and a reinforcement learning model.
  • "Re-learning" means performing machine learning again using the original learning data (the first learning data) used for the machine learning of the discriminator.
  • "Additional learning" means updating the discriminator by machine learning using new learning data.
  • "Re-learning or additional learning" may include both updating an existing discriminator by such machine learning and constructing a new discriminator.
  • In the inspection apparatus according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate that the product is more likely to be defective as the value increases.
  • The numerical range of the first determination criterion may be a range equal to or greater than a first threshold, and the numerical range of the second determination criterion may be composed of a range equal to or greater than a second threshold and a range equal to or smaller than a third threshold that is smaller than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the product is defective, since the second output value is included in the numerical range of the second determination criterion; when the second output value is smaller than the third threshold, the determination unit may determine that the product is non-defective, since the second output value is likewise included in the numerical range of the second determination criterion.
  • When the second output value is smaller than the second threshold and larger than the third threshold, the second output value is not included in the numerical range of the second determination criterion; the determination unit may therefore obtain the first output value from the first discriminator by inputting the target image data to the first discriminator, determine that the product is defective when the first output value is larger than the first threshold, and determine that the product is non-defective when the first output value is smaller than the first threshold. According to this configuration, even when the discrimination performance of a discriminator deteriorates due to re-learning or additional learning, the reliability of quality determination using the discriminator can be appropriately prevented from being impaired.
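A minimal sketch of this three-threshold variant, with names (`judge_defect_score`, `f1`, `f2`) and threshold values chosen for illustration only; it assumes, as in the claim, that larger outputs indicate a defect and that the third threshold is smaller than the second.

```python
def judge_defect_score(image, f1, f2, first_threshold,
                       second_threshold, third_threshold):
    """Larger output values indicate a defect. The second determination
    criterion is the union of [second_threshold, +inf) and
    (-inf, third_threshold], with third_threshold < second_threshold."""
    assert third_threshold < second_threshold
    v2 = f2(image)
    if v2 > second_threshold:   # inside the second criterion: defective
        return "defective"
    if v2 < third_threshold:    # inside the second criterion: non-defective
        return "good"
    # Between the third and second thresholds the second criterion does not
    # apply, so the first discriminator and its single first threshold decide.
    v1 = f1(image)
    return "defective" if v1 > first_threshold else "good"
```

For example, with `second_threshold = 0.8` and `third_threshold = 0.2`, an updated-discriminator output of 0.5 is never acted upon directly; the original discriminator's output is compared against `first_threshold` instead.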
  • In the inspection apparatus according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate that the product is more likely to be non-defective as the value increases.
  • The numerical range of the second determination criterion may be composed of a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold, which is smaller than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the product is non-defective, since the second output value is included in the numerical range of the second determination criterion; when the second output value is smaller than the third threshold, the determination unit may determine that the product is defective, since the second output value is likewise included in the numerical range of the second determination criterion.
  • When the second output value is not included in the numerical range of the second determination criterion, the determination unit may obtain the first output value from the first discriminator, determine that the product is non-defective when the obtained first output value is larger than the first threshold, and determine that the product is defective when the obtained first output value is smaller than the first threshold. According to this configuration, even when the discrimination performance of a discriminator deteriorates due to re-learning or additional learning, the reliability of quality determination using the discriminator can be appropriately prevented from being impaired.
  • In the inspection apparatus according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate the probability that the product has a defect.
  • The numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold.
  • The numerical range of the second determination criterion may be a range equal to or greater than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the product has a defect, since the second output value is included in the numerical range of the second determination criterion.
  • When the second output value obtained from the second discriminator is smaller than the second threshold, the second output value is not included in the numerical range of the second determination criterion; the determination unit may therefore obtain the first output value from the first discriminator by inputting the target image data to the first discriminator, and may determine that the product has a defect when the obtained first output value is larger than the first threshold. According to this configuration, even when the discrimination performance of a discriminator deteriorates due to re-learning or additional learning, the reliability of quality determination using the discriminator can be appropriately prevented from being impaired.
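This probability variant has a one-sided second criterion, which can be sketched as follows. Names and threshold values are illustrative assumptions; for the second criterion to be narrower than the first, the second threshold would typically be set above the first.

```python
def judge_by_defect_probability(image, f1, f2, first_threshold, second_threshold):
    """Each output is the probability that the product has a defect. Only a
    confidently high defect probability from the updated discriminator is
    trusted; every other case defers to the original discriminator, whose
    criterion (>= first_threshold or < first_threshold) covers all of [0, 1]."""
    p2 = f2(image)
    if p2 > second_threshold:
        return "defective"  # second criterion met: trust discriminator 2
    # Second criterion not met: the original discriminator always decides.
    p1 = f1(image)
    return "defective" if p1 > first_threshold else "good"
```

Note the asymmetry of this variant: the updated discriminator can only add defect detections (for example, for newly learned defect types); it can never overturn the original discriminator's "good" verdict into "defective" at low confidence, nor its "defective" verdict into "good".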
  • In the inspection apparatus according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate the probability that the product is non-defective.
  • The numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold, and the numerical range of the second determination criterion may be a range equal to or greater than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the product is non-defective, since the second output value is included in the numerical range of the second determination criterion.
  • When the second output value obtained from the second discriminator is smaller than the second threshold, the second output value is not included in the numerical range of the second determination criterion.
  • In that case, the determination unit may obtain the first output value from the first discriminator by inputting the target image data to the first discriminator, and may determine that the product is non-defective when the obtained first output value is larger than the first threshold. According to this configuration, even when the discrimination performance of a discriminator deteriorates due to re-learning or additional learning, the reliability of quality determination using the discriminator can be appropriately prevented from being impaired.
  • The inspection apparatus may be modified so that it can be applied to any scene in which a discriminator is updated by re-learning or additional learning, such as a scene in which some feature is determined from image data other than image data in which a product is captured, or a scene in which some feature is determined from data including data other than image data.
  • An image identification apparatus according to one aspect includes: a data acquisition unit that acquires target image data in which a subject whose state is to be identified is captured; a determination unit that determines the state of the subject shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning to identify the state of the subject, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning to identify the state of the subject; and an output unit that outputs the result of determining the state of the subject.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the state of the subject is set as a first determination criterion.
  • For the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the state of the subject is set as a second determination criterion.
  • The numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • When the second output value falls within the numerical range of the second determination criterion, the determination unit determines the state of the subject based on the second output value; otherwise, it determines the state of the subject based on the first output value obtained from the first discriminator.
  • the “subject” and the “state” of the subject to be identified need not be particularly limited, and may be appropriately selected according to the embodiment.
  • The "subject" may be, for example, the face of a person, the body of a person, a workpiece to be worked on, and the like.
  • the state to be identified may be, for example, the type of facial expression, the state of facial parts, and the like.
  • the state to be identified may be, for example, a body pose.
  • When the subject is a workpiece, the state to be identified may be, for example, the position and posture of the workpiece.
  • An identification device according to one aspect includes: a data acquisition unit that acquires target data including a feature to be identified; a determination unit that determines the feature appearing in the acquired target data using a first discriminator, constructed by machine learning using first learning data composed of data for learning to identify the feature, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional data for learning to identify the feature; and an output unit that outputs the result of determining the feature.
  • For the first output value obtained from the first discriminator, a numerical range for determining the feature is set as a first determination criterion; for the second output value obtained from the second discriminator, a numerical range for determining the feature is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • The determination unit obtains the second output value from the second discriminator by inputting the target data to the second discriminator, and determines the feature based on the second output value when the acquired second output value is included in the numerical range of the second determination criterion.
  • When the acquired second output value is not included in the numerical range of the second determination criterion, the determination unit obtains the first output value from the first discriminator by inputting the target data to the first discriminator, and determines the feature based on the obtained first output value.
  • the “data” may include all kinds of data that can be analyzed by the discriminator, and may be, for example, sound data (voice data), numerical data, text data, etc. in addition to image data.
  • a “feature” may include any feature identifiable from data.
  • When the "data" is sound data, the "feature" may be, for example, whether or not a specific sound (for example, abnormal machine noise) is included.
  • When the "data" is numerical data or text data related to biological information such as blood pressure and activity amount, the "feature" may be, for example, the state of the subject.
  • When the "data" is numerical data or text data such as the drive amount of a machine, the "feature" may be, for example, the state of the machine.
  • In the identification device according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate that a first feature is more likely to appear in the target data as the value increases, and that a second feature is more likely to appear as the value decreases.
  • The numerical range of the first determination criterion may be a range equal to or greater than the first threshold.
  • The numerical range of the second determination criterion may be composed of a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold, which is smaller than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the first feature appears in the target data, since the second output value is included in the numerical range of the second determination criterion; when the second output value is smaller than the third threshold, the determination unit may determine that the second feature appears in the target data, since the second output value is likewise included in the numerical range of the second determination criterion.
  • When the second output value is smaller than the second threshold and larger than the third threshold, the second output value is not included in the numerical range of the second determination criterion; the determination unit may therefore obtain the first output value from the first discriminator by inputting the target data to the first discriminator, determine that the first feature appears in the target data when the obtained first output value is larger than the first threshold, and determine that the second feature appears in the target data when the obtained first output value is smaller than the first threshold.
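The same fallback logic, generalized to arbitrary data (sound, numerical, text, and so on), can be sketched as below. The function and feature names are illustrative assumptions, not identifiers from the patent.

```python
def identify_feature(data, f1, f2, first_threshold,
                     second_threshold, third_threshold):
    """Larger outputs indicate the first feature, smaller outputs the second.
    Outputs of the updated discriminator that fall in the gap
    (third_threshold, second_threshold) defer to the original discriminator."""
    v2 = f2(data)
    if v2 > second_threshold:
        return "feature_1"  # second criterion met: first feature appears
    if v2 < third_threshold:
        return "feature_2"  # second criterion met: second feature appears
    # Outside the second criterion: the original discriminator decides
    # against its single first threshold.
    v1 = f1(data)
    return "feature_1" if v1 > first_threshold else "feature_2"
```

The same function works whether `data` is an image tensor, an audio frame, or a row of sensor readings, since the discriminators alone interpret it.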
  • In the identification device according to the above aspect, each of the first output value of the first discriminator and the second output value of the second discriminator may indicate the probability that a predetermined feature appears in the target data.
  • The numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold, and the numerical range of the second determination criterion may be a range equal to or greater than the second threshold.
  • When the second output value obtained from the second discriminator is larger than the second threshold, the determination unit may determine that the predetermined feature appears in the target data, since the second output value is included in the numerical range of the second determination criterion.
  • When the second output value obtained from the second discriminator is smaller than the second threshold, the second output value is not included in the numerical range of the second determination criterion.
  • In that case, the determination unit may obtain the first output value from the first discriminator by inputting the target data to the first discriminator, and may determine that the predetermined feature appears in the target data when the obtained first output value is larger than the first threshold.
  • Another aspect of the present invention may be an information processing method that realizes each of the above configurations, a program, or a storage medium storing such a program that can be read by a computer, another device, a machine, or the like.
  • Here, a computer-readable storage medium is a medium that stores information such as a program by electrical, magnetic, optical, mechanical, or chemical action.
  • An inspection method according to one aspect is an information processing method for inspecting the quality of a product, in which a computer executes: a first step of acquiring target image data in which the product to be inspected is captured; a second step of determining the quality of the product shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning the quality of the product, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning the quality of the product; and a third step of outputting the result of determining the quality of the product.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the quality of the product is set as a first determination criterion; for the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the quality of the product is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • In the second step, the computer obtains the second output value from the second discriminator by inputting the target image data to the second discriminator; when the obtained second output value is included in the numerical range of the second determination criterion, the computer determines the quality of the product based on the second output value, and when it is not, the computer obtains the first output value from the first discriminator by inputting the target image data to the first discriminator and determines the quality of the product based on the obtained first output value.
  • An inspection program according to one aspect is a program for inspecting the quality of a product, which causes a computer to execute: a first step of acquiring target image data in which the product to be inspected is captured; a second step of determining the quality of the product shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning the quality of the product, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning the quality of the product; and a third step of outputting the result of determining the quality of the product.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the quality of the product is set as a first determination criterion; for the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the quality of the product is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • In the second step, the computer obtains the second output value from the second discriminator by inputting the target image data to the second discriminator; when the obtained second output value is included in the numerical range of the second determination criterion, the computer determines the quality of the product based on the second output value, and when it is not, the computer obtains the first output value from the first discriminator and determines the quality of the product based on the obtained first output value.
  • An image identification method according to one aspect is an information processing method in which a computer executes: a first step of acquiring target image data in which a subject whose state is to be identified is captured; a second step of determining the state of the subject shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning to identify the state of the subject, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning to identify the state of the subject; and a third step of outputting the result of determining the state of the subject.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the state of the subject is set as a first determination criterion; for the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the state of the subject is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • In the second step, the computer obtains the second output value from the second discriminator by inputting the target image data to the second discriminator; when the obtained second output value is included in the numerical range of the second determination criterion, the computer determines the state of the subject based on the second output value, and when it is not, the computer obtains the first output value from the first discriminator by inputting the target image data to the first discriminator and determines the state of the subject based on the obtained first output value.
  • An image identification program according to one aspect is a program that causes a computer to execute: a first step of acquiring target image data in which a subject whose state is to be identified is captured; a second step of determining the state of the subject shown in the acquired target image data using a first discriminator, constructed by machine learning using first learning data composed of image data for learning to identify the state of the subject, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional image data for learning to identify the state of the subject; and a third step of outputting the result of determining the state of the subject.
  • For the first output value obtained from the first discriminator by inputting the target image data to the first discriminator, a numerical range for determining the state of the subject is set as a first determination criterion; for the second output value obtained from the second discriminator by inputting the target image data to the second discriminator, a numerical range for determining the state of the subject is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • In the second step, the computer obtains the second output value from the second discriminator by inputting the target image data to the second discriminator; when the obtained second output value is included in the numerical range of the second determination criterion, the computer determines the state of the subject based on the second output value, and when it is not, the computer obtains the first output value from the first discriminator by inputting the target image data to the first discriminator and determines the state of the subject based on the obtained first output value.
  • An identification method according to one aspect is an information processing method in which a computer executes: a first step of acquiring target data including a feature to be identified; a second step of determining the feature appearing in the acquired target data using a first discriminator, constructed by machine learning using first learning data composed of data for learning to identify the feature, and a second discriminator, constructed by machine learning using second learning data composed of the first learning data and additional data for learning to identify the feature; and a third step of outputting the result of determining the feature.
  • For the first output value obtained from the first discriminator by inputting the target data to the first discriminator, a numerical range for determining the feature is set as a first determination criterion; for the second output value obtained from the second discriminator by inputting the target data to the second discriminator, a numerical range for determining the feature is set as a second determination criterion; and the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • In the second step, the computer obtains the second output value from the second discriminator by inputting the target data to the second discriminator; when the obtained second output value is included in the numerical range of the second determination criterion, the computer determines the feature based on the second output value, and when it is not, the computer obtains the first output value from the first discriminator by inputting the target data to the first discriminator and determines the feature based on the obtained first output value.
  • an identification program causes a computer to execute a first step of acquiring target data including a feature to be identified,
  • a second step of determining the feature appearing in the acquired target data by using a first discriminator constructed by machine learning using first learning data composed of data for performing learning for identifying the feature, and a second discriminator constructed by
  • machine learning using second learning data composed of the first learning data and additional data for performing learning for identifying the feature, and a third step of outputting a result of determining the feature.
  • in the first output value obtained from the first discriminator by inputting the target data to the first discriminator, a numerical range for judging the feature is set as a first judgment criterion.
  • in the second output value obtained from the second discriminator by inputting the target data to the second discriminator, a numerical range for judging the feature is set as a second judgment criterion.
  • the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion, and in the second step, the computer inputs the target data to the second discriminator.
  • the second output value is acquired from the second discriminator, and when the acquired second output value is included in the numerical range of the second determination criterion, the feature is determined based on the second output value,
  • and when the acquired second output value is not included in the numerical range of the second determination criterion, the target data is input to the first discriminator,
  • the first output value is acquired from the first discriminator, and the feature is determined based on the acquired first output value. It is a program for causing a computer to execute this processing.
  • According to the present invention, it is possible to provide a technique for preventing the reliability of determination using a discriminator from being impaired even when the discrimination performance of the discriminator deteriorates due to re-learning or additional learning.
  • FIG. 1 schematically illustrates an example of a scene to which the present invention is applied.
  • FIG. 2 schematically illustrates an example of a hardware configuration of the inspection apparatus according to the embodiment.
  • FIG. 3 schematically illustrates an example of a hardware configuration of the learning device according to the embodiment.
  • FIG. 4 schematically illustrates an example of the software configuration of the inspection apparatus according to the embodiment.
  • FIG. 5 illustrates an example of determination criteria for each classifier.
  • FIG. 6 schematically illustrates an example of the software configuration of the learning device according to the embodiment.
  • FIG. 7 illustrates an example of a processing procedure of the inspection apparatus according to the embodiment.
  • FIG. 8 illustrates an example of a processing procedure of the learning device according to the embodiment.
  • FIG. 9 illustrates another example of determination criteria for each classifier.
  • FIG. 10 schematically illustrates an example of a hardware configuration of an identification device according to another embodiment.
  • FIG. 11 schematically illustrates an example of the software configuration of the identification device according to another embodiment.
  • FIG. 12 schematically illustrates an example of a hardware configuration of an identification device according to another embodiment.
  • FIG. 13 schematically illustrates an example of the software configuration of the identification device according to another embodiment.
  • Hereinafter, this embodiment will be described with reference to the drawings.
  • this embodiment described below is only an illustration of the present invention in all respects. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in implementing the present invention, a specific configuration according to the embodiment may be adopted as appropriate.
  • Although the data appearing in this embodiment is described in natural language, more specifically, it is specified in a pseudo-language, commands, parameters, machine language, or the like that can be recognized by a computer.
  • In the present invention, both the first discriminator before re-learning or additional learning and the second discriminator after re-learning or additional learning are held.
  • the numerical range of the second determination criterion of the second discriminator is set narrower than the numerical range of the first determination criterion of the first discriminator. That is, the numerical range of the second determination criterion is provided with a range in which features are not identified by the second classifier.
  • When the second output value of the second discriminator is included in the numerical range of the second determination criterion, the feature appearing in the target data is determined based on the second output value of the second discriminator.
  • When the second output value of the second discriminator is not included in the numerical range of the second determination criterion, the feature appearing in the target data is determined based on the first output value of the first discriminator.
  • That is, when the second output value of the second discriminator clearly indicates the feature discrimination result, the second discriminator is used to determine the feature; otherwise,
  • the feature is determined by the first discriminator without using the second discriminator.
  • Accordingly, in the vicinity of the decision boundary where degradation of discrimination performance may occur, the determination result from before re-learning or additional learning can be guaranteed, and fluctuations in the determination result due to re-learning or additional learning can be suppressed. Therefore, according to an example of the present invention, even when the discrimination performance of the discriminator deteriorates due to re-learning or additional learning, the reliability of determination using the discriminator can be prevented from being impaired.
  • FIG. 1 schematically illustrates an example of a scene in which the present invention is applied to an appearance inspection of a product R.
  • the application range of the present invention is not limited to the example of visual inspection exemplified below.
  • the present invention can be applied to any scene where the classifier is updated by re-learning or additional learning.
  • the inspection apparatus 1 is a computer configured to inspect the quality of the product R. Specifically, the inspection apparatus 1 acquires target image data 121 in which a product R to be inspected is captured. In the present embodiment, the inspection apparatus 1 is connected to the camera 31, and acquires the target image data 121 by photographing the product R with the camera 31.
  • the inspection apparatus 1 determines the quality of the product R shown in the acquired target image data 121 by using the first discriminator 5 before re-learning or additional learning and the second discriminator 6 after re-learning or additional learning.
  • Note that “before” and “after” re-learning or additional learning merely indicate a relative relationship between the first discriminator 5 and the second discriminator 6.
  • That is, the first discriminator 5 may itself be a discriminator after re-learning or additional learning in relation to some discriminator other than the second discriminator 6.
  • Likewise, the second discriminator 6 may be a discriminator before re-learning or additional learning in relation to some discriminator other than the first discriminator 5.
  • In the first output value obtained from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5, a numerical range for judging the quality of the product R is set as the first judgment criterion.
  • Similarly, in the second output value obtained from the second discriminator 6 by inputting the target image data 121 to the second discriminator 6, a numerical range for determining the quality of the product R is set as the second determination criterion.
  • the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • the numerical range of the second determination criterion is composed of a numerical range for determining, with respect to the output of the second discriminator 6, that the product R is a non-defective product (that is, has no defect) and a numerical range for determining that the product R has a defect (that is, is a defective product).
  • the numerical range of the first determination criterion is configured by a numerical range for determining that the product R is a non-defective product and a numerical range for determining that the product R is defective with respect to the output of the first discriminator 5.
  • the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion so that, for example, the determination using the second discriminator 6 is stricter than the determination using the first discriminator 5.
  • the inspection apparatus 1 acquires the second output value from the second discriminator 6 by inputting the target image data 121 to the second discriminator 6.
  • When the acquired second output value is included in the numerical range of the second determination criterion, the inspection apparatus 1 adopts the pass/fail determination by the second discriminator 6; that is, it determines the quality of the product R based on the second output value.
  • On the other hand, when the acquired second output value is not included in the numerical range of the second determination criterion, the inspection apparatus 1 does not adopt the pass/fail determination by the second discriminator 6, and
  • acquires the first output value from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5.
  • the inspection device 1 determines the quality of the product R based on the first output value acquired from the first discriminator 5. After determining the quality of the product R, the inspection apparatus 1 outputs the result.
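The pass/fail flow above can be sketched as follows. This is a minimal illustration only, assuming that each discriminator reduces to a scoring function whose larger outputs indicate a defect and that the determination criteria are expressed as score thresholds; the function names and threshold values are hypothetical and not part of the embodiment.

```python
def judge_product(image, first_disc, second_disc,
                  first_threshold, second_threshold, third_threshold):
    """Two-stage pass/fail judgment with a narrow second criterion.

    The second output is adopted only when it is clearly defective
    (above second_threshold) or clearly good (below third_threshold);
    anything in between falls back to the first discriminator.
    """
    v2 = second_disc(image)
    if v2 > second_threshold:        # clearly within the "defective" range
        return "defective"
    if v2 < third_threshold:         # clearly within the "non-defective" range
        return "non-defective"
    # Second output not in the second criterion's numerical range:
    # do not adopt it; judge with the pre-relearning first discriminator.
    v1 = first_disc(image)
    return "defective" if v1 > first_threshold else "non-defective"

# Hypothetical scorers standing in for the learned discriminators.
first = lambda img: 0.55
second = lambda img: 0.60  # falls in the gap between 0.3 and 0.8
print(judge_product(None, first, second, 0.5, 0.8, 0.3))  # → defective
```

Here the second output 0.60 lies in the gap of the second criterion, so the judgment falls back to the first discriminator, whose output 0.55 exceeds the first threshold 0.5.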
  • the learning device 2 is a computer that constructs the first discriminator 5 and the second discriminator 6 used in the inspection device 1. Specifically, by performing machine learning using the first learning data 221 composed of the image data 222 for learning the quality of the product, the learning device 2 constructs the learned first discriminator 5 that has acquired the ability to determine the quality of the product. In addition, by performing machine learning using the second learning data 226 composed of the first learning data 221 and the additional image data 227 for learning the quality of the product, the learning device 2 constructs the learned second discriminator 6 that has acquired the ability to determine the quality of the product.
  • the learning device 2 can construct the first discriminator 5 and the second discriminator 6 used in the inspection device 1.
  • the inspection device 1 can acquire the learned first discriminator 5 and second discriminator 6 constructed by the learning device 2 via, for example, a network.
  • the type of network may be appropriately selected from, for example, the Internet, a wireless communication network, a mobile communication network, a telephone network, and a dedicated network.
  • In the present embodiment, the numerical range of the second determination criterion, which determines whether or not to adopt the determination result by the second discriminator 6, is set narrower than the numerical range of the first determination criterion. Thereby, the quality determination using the second discriminator 6 is performed more strictly than the quality determination using the first discriminator 5. That is, when the second output value of the second discriminator 6 clearly indicates the pass/fail discrimination result, the inspection device 1 according to this embodiment determines the quality of the product R by the second discriminator 6; otherwise, it determines the quality of the product R by the first discriminator 5 without adopting the judgment result by the second discriminator 6.
  • That is, in the vicinity of the threshold boundary where degradation of discrimination performance due to over-learning may occur, the first discriminator 5 is used instead of the second discriminator 6, so that the determination result from before re-learning or additional learning can be guaranteed and variation of the determination result due to re-learning or additional learning can be suppressed. Therefore, according to the present embodiment, even when the discrimination performance of the second discriminator 6 becomes worse than that of the first discriminator 5 due to re-learning or additional learning, the reliability of the pass/fail judgment can be prevented from being impaired.
  • the product R to be subjected to the appearance inspection is not particularly limited and may be selected as appropriate according to the embodiment.
  • the product R may be, for example, a product that is conveyed on a production line such as an electronic component or an automobile component.
  • the electronic component is, for example, a substrate, a chip capacitor, a liquid crystal, a relay winding, or the like.
  • the automobile parts are, for example, connecting rods, shafts, engine blocks, power window switches, panels, and the like.
  • the determination of pass/fail may be simply determining whether or not the product R has a defect, or may include, in addition to determining whether or not the product R has a defect, identifying the type of the defect.
  • the defects are, for example, scratches, dirt, cracks, dents, dust, burrs, color unevenness, and the like.
  • FIG. 2 schematically illustrates an example of a hardware configuration of the inspection apparatus 1 according to the present embodiment.
  • the inspection device 1 is a computer in which a control unit 11, a storage unit 12, a communication interface 13, an external interface 14, an input device 15, an output device 16, and a drive 17 are electrically connected.
  • In FIG. 2, the communication interface and the external interface are described as “communication I/F” and “external I/F”, respectively.
  • the control unit 11 includes a CPU (Central Processing Unit), which is a hardware processor, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and is configured to execute information processing based on programs and various data.
  • the storage unit 12 is an example of a memory, and includes, for example, a hard disk drive or a solid state drive. In the present embodiment, the storage unit 12 stores various types of information such as the inspection program 81, the first learning result data 224, the second learning result data 229, and the like.
  • the inspection program 81 is a program for causing the inspection apparatus 1 to execute information processing (FIG. 7), described later, for determining the quality of the product R shown in the target image data 121 using the first discriminator 5 and the second discriminator 6, and includes a series of instructions for the information processing.
  • the first learning result data 224 is data for setting the learned first discriminator 5.
  • the second learning result data 229 is data for setting the learned second discriminator 6. Details will be described later.
  • the communication interface 13 is, for example, a wired LAN (Local Area Network) module, a wireless LAN module, or the like, and is an interface for performing wired or wireless communication via a network.
  • By using the communication interface 13, the inspection apparatus 1 can perform data communication via a network with another information processing apparatus (for example, the learning apparatus 2).
  • the external interface 14 is, for example, a USB (Universal Serial Bus) port, a dedicated port, or the like, and is an interface for connecting to an external device.
  • the type and number of external interfaces 14 may be appropriately selected according to the type and number of external devices to be connected.
  • the inspection apparatus 1 is connected to the camera 31 via the external interface 14.
  • the camera 31 is used to acquire the target image data 121 by photographing the product R.
  • the type and location of the camera 31 may not be particularly limited, and may be appropriately determined according to the embodiment.
  • a known camera such as a digital camera or a video camera may be used.
  • the camera 31 may be disposed in the vicinity of the production line where the product R is conveyed.
  • the inspection apparatus 1 may be connected to the camera 31 via the communication interface 13 instead of the external interface 14.
  • the input device 15 is a device for performing input, such as a mouse or a keyboard.
  • the output device 16 is a device for performing output, such as a display or a speaker. The operator can operate the inspection apparatus 1 by using the input device 15 and the output device 16.
  • the drive 17 is, for example, a CD drive, a DVD drive, or the like, and is a drive device for reading a program stored in the storage medium 91.
  • the type of the drive 17 may be appropriately selected according to the type of the storage medium 91.
  • At least one of the inspection program 81, the first learning result data 224, and the second learning result data 229 may be stored in the storage medium 91.
  • the storage medium 91 is a medium that accumulates information such as programs by electrical, magnetic, optical, mechanical, or chemical action so that a computer or other device or machine can read the recorded information such as the programs.
  • the inspection apparatus 1 may acquire at least one of the inspection program 81, the first learning result data 224, and the second learning result data 229 from the storage medium 91.
  • In FIG. 2, a disk-type storage medium such as a CD or a DVD is illustrated as an example of the storage medium 91.
  • the type of the storage medium 91 is not limited to the disk type and may be other than the disk type.
  • Examples of the storage medium other than the disk type include a semiconductor memory such as a flash memory.
  • the control unit 11 may include a plurality of hardware processors.
  • the hardware processor may be constituted by a microprocessor, an FPGA (field-programmable gate array), a DSP (digital signal processor), or the like.
  • the storage unit 12 may be configured by a RAM and a ROM included in the control unit 11. At least one of the communication interface 13, the external interface 14, the input device 15, the output device 16, and the drive 17 may be omitted.
  • the inspection apparatus 1 may be composed of a plurality of computers. In this case, the hardware configurations of the computers may or may not match.
  • the inspection apparatus 1 may be, in addition to an information processing apparatus designed exclusively for the service to be provided, a general-purpose server apparatus, a general-purpose desktop PC (Personal Computer), a notebook PC, a tablet PC, a mobile phone including a smartphone, or the like.
  • FIG. 3 schematically illustrates an example of a hardware configuration of the learning device 2 according to the present embodiment.
  • the learning device 2 is a computer in which a control unit 21, a storage unit 22, a communication interface 23, an input device 24, an output device 25, and a drive 26 are electrically connected.
  • the communication interface is described as “communication I / F”.
  • the control unit 21 to the drive 26 of the learning device 2 may be configured similarly to the control unit 11 to the communication interface 13 and the input device 15 to the drive 17 of the inspection device 1, respectively. That is, the control unit 21 includes a CPU, RAM, ROM, and the like, which are hardware processors, and is configured to execute various types of information processing based on programs and data.
  • the storage unit 22 is configured by, for example, a hard disk drive, a solid state drive, or the like.
  • the storage unit 22 stores various types of information such as a learning program 82, image data 222 for learning product quality, additional image data 227 for learning product quality, first learning result data 224, and second learning result data 229.
  • the learning program 82 is a program for causing the learning device 2 to execute machine learning information processing (FIG. 8), described later, for constructing the first discriminator 5 and the second discriminator 6, and for generating the first learning result data 224 and the second learning result data 229 as a result.
  • the learning program 82 includes a series of instructions for the information processing.
  • the image data 222 constitutes first learning data 221 used for machine learning of the first discriminator 5.
  • the first learning data 221 and the additional image data 227 constitute second learning data 226 used for machine learning of the second discriminator 6. Details will be described later.
  • the communication interface 23 is, for example, a wired LAN module, a wireless LAN module, or the like, and is an interface for performing wired or wireless communication via a network.
  • the learning device 2 can perform data communication via the network with another information processing device (for example, the inspection device 1) by using the communication interface 23.
  • the input device 24 is a device for performing input, such as a mouse or a keyboard.
  • the output device 25 is a device for performing output, such as a display or a speaker. The operator can operate the learning device 2 by using the input device 24 and the output device 25.
  • the drive 26 is, for example, a CD drive, a DVD drive, or the like, and is a drive device for reading a program stored in the storage medium 92. At least one of the learning program 82, the image data 222, and the additional image data 227 may be stored in the storage medium 92. The learning device 2 may acquire at least one of the learning program 82, the image data 222, and the additional image data 227 from the storage medium 92.
  • the control unit 21 may include a plurality of hardware processors.
  • the hardware processor may be configured by a microprocessor, FPGA, DSP, or the like.
  • the storage unit 22 may be configured by a RAM and a ROM included in the control unit 21. At least one of the communication interface 23, the input device 24, the output device 25, and the drive 26 may be omitted.
  • the learning device 2 may be composed of a plurality of computers. In this case, the hardware configurations of the computers may or may not match. Further, the learning device 2 may be a general-purpose server device, a general-purpose PC, or the like, in addition to an information processing device designed exclusively for the provided service.
  • FIG. 4 schematically illustrates an example of the software configuration of the inspection apparatus 1 according to the present embodiment.
  • the control unit 11 of the inspection apparatus 1 expands the inspection program 81 stored in the storage unit 12 in the RAM. Then, the control unit 11 interprets and executes the inspection program 81 expanded in the RAM, and controls each component. Thereby, as shown in FIG. 4, the inspection apparatus 1 according to the present embodiment is configured as a computer including the data acquisition unit 111, the determination unit 112, and the output unit 113 as software modules.
  • the data acquisition unit 111 acquires target image data 121 in which the product R to be inspected is shown.
  • the data acquisition unit 111 acquires the target image data 121 by photographing the product R with the camera 31.
  • the determination unit 112 includes a first discriminator 5 and a second discriminator 6. The determination unit 112 uses the first discriminator 5 and the second discriminator 6 to determine pass / fail of the product R shown in the acquired target image data 121.
  • the output unit 113 outputs the result of determining the quality of the product R.
  • FIG. 5 schematically illustrates each criterion set for each classifier (5, 6) according to the present embodiment.
  • a numerical range for determining the quality of the product R with respect to the first output value obtained from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5 is set as the first judgment criterion.
  • a numerical range for determining the quality of the product R with respect to the second output value obtained from the second discriminator 6 by inputting the target image data 121 to the second discriminator 6 is used as the second judgment criterion.
  • the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • First, the determination unit 112 acquires the second output value corresponding to the result of determining the quality of the product R from the second classifier 6 by inputting the target image data 121 to the second classifier 6. Then, when the acquired second output value is included in the numerical range of the second determination criterion, the determination unit 112 adopts the pass/fail determination by the second discriminator 6; that is, it determines the quality of the product R based on the second output value. On the other hand, when the acquired second output value is not included in the numerical range of the second determination criterion, the determination unit 112 does not adopt the pass/fail determination by the second discriminator 6, and acquires the first output value from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5. Then, the determination unit 112 determines the quality of the product R based on the first output value acquired from the first discriminator 5.
  • the numerical range of the first criterion is composed of a range that is greater than or equal to the first threshold and a range that is less than the first threshold.
  • the numerical range of the second determination criterion is configured by a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold smaller than the second threshold.
  • the determination unit 112 acquires the second output value corresponding to the result of determining the quality of the product R from the second discriminator 6 by inputting the target image data 121 to the second discriminator 6.
  • Next, the determination unit 112 compares the second output value with the second threshold. When the second output value acquired from the second discriminator 6 is larger than the second threshold, the determination unit 112 regards the second output value as being included in the numerical range of the second determination criterion, and determines that the product R has a defect.
  • Similarly, the determination unit 112 compares the second output value with the third threshold. When the second output value acquired from the second discriminator 6 is smaller than the third threshold, the determination unit 112 regards the second output value as being included in the numerical range of the second determination criterion, and determines that the product R is a non-defective product.
  • On the other hand, when the second output value is neither larger than the second threshold nor smaller than the third threshold, the determination unit 112 inputs the target image data 121 to the first discriminator 5, thereby acquiring the first output value corresponding to the result of determining the quality of the product R from the first discriminator 5. Next, the determination unit 112 compares the first output value with the first threshold. When the acquired first output value is larger than the first threshold, the determination unit 112 determines that the product R has a defect. On the other hand, when the acquired first output value is smaller than the first threshold, the determination unit 112 determines that the product R is a non-defective product.
  • the determination unit 112 performs the quality determination using the second classifier 6 more strictly than the quality determination using the first classifier 5. That is, when the second output value of the second discriminator 6 clearly indicates the pass / fail discrimination result, the determination unit 112 trusts the second discriminator 6 to determine pass / fail of the product R. Otherwise, the quality of the product R is judged by the first discriminator 5 without adopting the judgment result by the second discriminator 6.
  • the first threshold is a value between the second threshold and the third threshold. That is, the second threshold value is set to a value larger than the first threshold value, and the third threshold value is set to a value smaller than the first threshold value.
  • By setting the respective thresholds in such a relationship,
  • the criterion for quality determination using the second discriminator 6 can be made stricter than the criterion for quality determination using the first discriminator 5.
  • the relationship between the threshold values may not be limited to such an example.
  • the first threshold value may coincide with the second threshold value or the third threshold value.
  • the first threshold value may be larger than the second threshold value or smaller than the third threshold value.
  • Each threshold value may be determined as appropriate.
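With hypothetical threshold values satisfying the relationship described above (third threshold < first threshold < second threshold), the range of second output values that is actually adopted can be illustrated as follows; the numeric values are illustrative only, not values from the embodiment.

```python
# Hypothetical thresholds with THIRD_T < FIRST_T < SECOND_T.
FIRST_T, SECOND_T, THIRD_T = 0.5, 0.8, 0.3

def adopted_by_second(v):
    """True when a second-discriminator output lies in the second criterion's
    numerical range, i.e. is clearly defective or clearly non-defective."""
    return v > SECOND_T or v < THIRD_T

# Outputs near FIRST_T fall in the gap (THIRD_T, SECOND_T) and are instead
# judged against FIRST_T by the first discriminator.
print([adopted_by_second(v) for v in (0.1, 0.4, 0.6, 0.9)])  # → [True, False, False, True]
```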
  • the first discriminator 5 is configured by a neural network.
  • the first discriminator 5 is configured by a multilayer neural network used for so-called deep learning, and includes an input layer 51, an intermediate layer (hidden layer) 52, and an output layer 53.
  • the neural network constituting the first discriminator 5 includes one intermediate layer 52; the output of the input layer 51 is input to the intermediate layer 52, and the output of the intermediate layer 52 is input to the output layer 53.
  • the number of intermediate layers 52 may not be limited to one layer.
  • the first discriminator 5 may include two or more intermediate layers 52.
  • Each layer 51 to 53 includes one or a plurality of neurons.
  • the number of neurons in the input layer 51 may be set according to the target image data 121.
  • the number of neurons in the intermediate layer 52 may be set as appropriate according to the embodiment.
  • the number of neurons in the output layer 53 may be set according to the output format of the pass / fail determination result.
  • the number of neurons in the input layer 51 may be set according to the number of pixels in the target image data 121. In this case, the pixel value of each corresponding pixel of the target image data 121 may be input to each neuron of the input layer 51.
  • Neurons in adjacent layers are appropriately connected to each other, and a weight (connection load) is set for each connection.
  • each neuron is connected to all neurons in adjacent layers.
  • the neuron connection may not be limited to such an example, and may be appropriately set according to the embodiment.
  • a threshold is set for each neuron, and basically, the output of each neuron is determined by whether or not the sum of products of each input and each weight exceeds the threshold.
  • the determination unit 112 inputs the target image data 121 to the input layer 51 of the first discriminator 5 and performs firing determination of each neuron included in each layer in order from the input side as a calculation process of the neural network. Thereby, the determination unit 112 can acquire the first output value corresponding to the result of determining the quality of the product R shown in the input target image data 121 from the output layer 53.
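The layer-by-layer firing computation described above can be sketched as a toy forward pass. This assumes, purely for illustration, a step activation in which a neuron fires when the weighted sum of its inputs exceeds its threshold, as the text describes; the network size, weights, and thresholds below are made up, not learned values.

```python
def neuron_fires(inputs, weights, threshold):
    # A neuron outputs 1 when the sum of products of its inputs and
    # weights exceeds its threshold (the firing determination).
    return 1.0 if sum(i * w for i, w in zip(inputs, weights)) > threshold else 0.0

def forward(values, layers):
    # layers: list of (weight_rows, thresholds) pairs ordered from the
    # input side; each weight row holds one neuron's incoming weights.
    for weight_rows, thresholds in layers:
        values = [neuron_fires(values, w, t)
                  for w, t in zip(weight_rows, thresholds)]
    return values

# Tiny 2-input -> 2-neuron hidden layer -> 1-neuron output layer network.
hidden = ([[1.0, 1.0], [0.5, -0.5]], [0.5, 0.0])
output = ([[1.0, 1.0]], [1.5])
print(forward([1.0, 0.0], [hidden, output]))  # → [1.0]
```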
  • the second discriminator 6 is also constituted by a neural network like the first discriminator 5.
  • the second discriminator 6 may be configured similarly to the first discriminator 5. That is, the input layer 61, the intermediate layer (hidden layer) 62, and the output layer 63 may be configured similarly to the layers 51 to 53 of the first discriminator 5.
  • the structure of the neural network of the second discriminator 6 may not match the first discriminator 5.
  • the number of layers of the neural network constituting the second discriminator 6, the number of neurons in each layer, and the connection relationship between the neurons may be different from those of the neural network constituting the first discriminator 5.
  • the determination unit 112 inputs the target image data 121 to the input layer 61 of the second discriminator 6, and performs firing determination of each neuron included in each layer in order from the input side as a calculation process of the neural network. Thereby, the determination unit 112 can acquire the second output value corresponding to the result of determining the quality of the product R shown in the input target image data 121 from the output layer 63.
  • information indicating the configuration of the first discriminator 5 (for example, the number of layers in the network, the number of neurons in each layer, the connection relationships between neurons, and the transfer function of each neuron), the weights of the connections between the neurons, and the threshold value of each neuron is included in the first learning result data 224.
  • likewise, information indicating the configuration of the second discriminator 6 (neural network) (for example, the number of layers in the network, the number of neurons in each layer, the connection relationships between neurons, and the transfer function of each neuron), the weights of the connections between the neurons, and the threshold value of each neuron is included in the second learning result data 229.
  • the determination unit 112 sets the first discriminator 5 with reference to the first learning result data 224 and sets the second discriminator 6 with reference to the second learning result data 229.
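  • A minimal sketch of how such learning result data might be held and referred to when setting up a discriminator is shown below; the container layout, field names, and the consistency check are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class LearningResultData:
    """Hypothetical container mirroring the learning result data above:
    the network configuration plus the learned connection weights and
    the threshold of each neuron."""
    layer_sizes: list   # numbers of neurons per layer (configuration)
    weights: list       # per-connection weight matrices
    thresholds: list    # per-neuron thresholds

def setup_discriminator(result_data):
    """Stand-in for referring to result data to set a discriminator;
    here it only validates basic structural consistency."""
    assert len(result_data.weights) == len(result_data.layer_sizes) - 1
    return result_data

first_result = LearningResultData([4, 3, 1],
                                  [[[0.1] * 4] * 3, [[0.2] * 3]],
                                  [[0.0] * 3, [0.0]])
# The second result data may describe a different structure (more layers).
second_result = LearningResultData([4, 5, 5, 1],
                                   [[[0.0] * 4] * 5, [[0.0] * 5] * 5, [[0.0] * 5]],
                                   [[0.0] * 5, [0.0] * 5, [0.0]])
```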
  • FIG. 6 schematically illustrates an example of the software configuration of the learning device 2 according to the present embodiment.
  • the control unit 21 of the learning device 2 loads the learning program 82 stored in the storage unit 22 into the RAM. Then, the control unit 21 interprets and executes the learning program 82 loaded in the RAM, and controls each component. Accordingly, as illustrated in FIG. 6, the learning device 2 according to the present embodiment is configured as a computer including the learning data acquisition unit 211 and the learning processing unit 212 as software modules. That is, in the present embodiment, each software module is realized by the control unit 21 (CPU).
  • the learning data acquisition unit 211 acquires first learning data 221 composed of image data 222 for learning the quality of a product.
  • the learning processing unit 212 constructs the learned first discriminator 5 that has acquired the ability to determine the quality of the product by performing machine learning using the acquired first learning data 221.
  • the learning data acquisition unit 211 acquires the second learning data 226 including the first learning data 221 and additional image data 227 for learning the quality of the product.
  • the learning processing unit 212 constructs the learned second discriminator 6 that has acquired the ability to determine the quality of the product by performing machine learning using the acquired second learning data 226.
  • that is, the second discriminator 6 corresponds to a discriminator after re-learning or additional learning relative to the first discriminator 5.
  • the learning model of each discriminator is configured by a neural network. Therefore, correct data 223 indicating the correct answer to the quality determination of the product shown in the image data 222 is assigned to the image data 222.
  • the additional image data 227 is provided with correct answer data 228 indicating a correct answer to the quality determination of the product shown in the additional image data 227.
  • the first learning data 221 is composed of a plurality of data sets each including a combination of the image data 222 and the correct answer data 223.
  • the second learning data 226 includes a plurality of data sets each including a combination of additional image data 227 and correct answer data 228, and first learning data 221.
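  • The composition of the second learning data described above can be sketched as follows; the dict layout and helper name are illustrative assumptions, not part of the embodiment:

```python
def build_second_learning_data(first_learning_data, additional_pairs):
    """The second learning data is the first learning data plus data sets
    each combining additional image data with its correct answer data."""
    extra = [{"image": img, "correct": ans} for img, ans in additional_pairs]
    return list(first_learning_data) + extra

first_data = [{"image": [0.1, 0.2], "correct": 0},
              {"image": [0.9, 0.8], "correct": 1}]
# One additional data set pairing additional image data with its answer.
second_data = build_second_learning_data(first_data, [([0.5, 0.6], 1)])
```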
  • For each data set included in the first learning data 221, the learning processing unit 212 inputs the image data 222 to the input layer 51, and performs machine learning of the first discriminator 5 so that an output value corresponding to the correct answer data 223 associated with the input image data 222 is output from the output layer 53. Thereby, the learning processing unit 212 constructs the learned first discriminator 5 that has acquired the ability to judge the quality of the product. Then, the learning processing unit 212 stores information indicating the configuration of the learned first discriminator 5, the connection weights between the neurons, and the threshold value of each neuron in the storage unit 22 as the first learning result data 224.
  • Similarly, for each data set included in the second learning data 226, the learning processing unit 212 inputs the image data (222, 227) to the input layer 61, and performs machine learning of the second discriminator 6 so that an output value corresponding to the correct answer data (223, 228) associated with the input image data (222, 227) is output from the output layer 63.
  • the learning processing unit 212 constructs the learned second discriminator 6 that has acquired the ability to determine the quality of the product.
  • the learning processing unit 212 stores information indicating the configuration of the learned second discriminator 6, the weight of the connection between the neurons, and the threshold value of each neuron as the second learning result data 229 in the storage unit 22.
  • each software module of the inspection device 1 and the learning device 2 is realized by a general-purpose CPU.
  • some or all of the above software modules may be implemented by one or more dedicated processors.
  • software modules may be omitted, replaced, and added as appropriate according to the embodiment.
  • FIG. 7 is a flowchart illustrating an example of a processing procedure of the inspection apparatus 1 according to this embodiment.
  • the processing procedure described below is an example of the inspection method of the present invention. However, the processing procedure described below is merely an example, and each processing may be changed as much as possible. Further, in the processing procedure described below, steps can be omitted, replaced, and added as appropriate according to the embodiment.
  • Step S101 the control unit 11 operates as the data acquisition unit 111, and acquires target image data 121 in which the product R to be inspected is captured.
  • the inspection apparatus 1 is connected to the camera 31 via the external interface 14. Therefore, the control unit 11 acquires target image data 121 from the camera 31.
  • the target image data 121 may be moving image data or still image data.
  • the control unit 11 advances the processing to the next step S102.
  • the route for acquiring the target image data 121 may not be limited to such an example, and may be appropriately selected according to the embodiment.
  • another information processing apparatus different from the inspection apparatus 1 may be connected to the camera 31.
  • the inspection apparatus 1 may acquire the target image data 121 by receiving transmission of the target image data 121 from another information processing apparatus.
  • Steps S102 to S108 the control unit 11 operates as the determination unit 112, and uses the first discriminator 5 and the second discriminator 6 to determine pass / fail of the product R shown in the target image data 121.
  • In the present embodiment, a numerical range for determining the quality of the product R with respect to the first output value obtained from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5 is set as the first determination criterion.
  • Similarly, a numerical range for determining the quality of the product R with respect to the second output value obtained from the second discriminator 6 by inputting the target image data 121 to the second discriminator 6 is set as the second determination criterion.
  • the numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
  • each of the first output value of the first discriminator 5 and the second output value of the second discriminator 6 indicates that the product R is more likely to be defective as the value is larger, and more likely to be non-defective (that is, to have no defect) as the value is smaller.
  • the numerical range of the first determination criterion is composed of a range that is greater than or equal to the first threshold and a range that is less than the first threshold.
  • the numerical range of the second determination criterion is configured by a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold smaller than the second threshold.
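  • The two criteria above can be sketched as follows; the concrete threshold values are hypothetical, and the only relationship taken from the text is that the third threshold is smaller than the first threshold, which is smaller than the second threshold:

```python
# Hypothetical threshold values (illustrative only).
FIRST_THRESHOLD = 0.5
SECOND_THRESHOLD = 0.8
THIRD_THRESHOLD = 0.2

def within_second_criterion(second_output_value):
    """True when the value lies in the second criterion's numerical range:
    at or above the second threshold, or at or below the third threshold.
    (How exact equality is handled may be chosen per embodiment.)"""
    return (second_output_value >= SECOND_THRESHOLD
            or second_output_value <= THIRD_THRESHOLD)
```

  Values falling strictly between the third and second thresholds are outside the second criterion, which is the case handed over to the first discriminator.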
  • Step S102 In step S102, the control unit 11 inputs the acquired target image data 121 to the second discriminator 6 and executes the arithmetic processing of the second discriminator 6. Thereby, the control unit 11 acquires, from the second discriminator 6, the second output value corresponding to the result of determining the quality of the product R shown in the target image data 121.
  • control unit 11 sets the learned second discriminator 6 with reference to the second learning result data 229. Subsequently, the control unit 11 inputs the target image data 121 to the input layer 61 of the second discriminator 6 and performs firing determination of each neuron included in each of the layers 61 to 63 in order from the input side. Thereby, the control unit 11 acquires the second output value from the output layer 63 of the second discriminator 6. When the second output value is acquired, the control unit 11 proceeds to the next step S103.
  • step S103 the control unit 11 determines whether or not the second output value is larger than the second threshold value by comparing the second output value and the second threshold value. If it is determined that the second output value is greater than the second threshold, the control unit 11 proceeds to the next step S104, assuming that the second output value is included in the numerical range of the second determination criterion, It is determined that the product R has a defect.
  • the determination of pass/fail may be simply determining whether or not the product R is defective, or may include, in addition to determining whether or not the product R is defective, identifying the type of the defect.
  • the defects are, for example, scratches, dirt, cracks, dents, dust, burrs, color unevenness, and the like.
  • a method for identifying the type of defect may be appropriately determined according to the embodiment.
  • the second discriminator 6 may be configured to output a plurality of second output values each corresponding to a defect type. In this case, by specifying a second output value larger than the second threshold value from the plurality of second output values, the control unit 11 can identify the type of defect present in the product R.
  • Each second threshold value (and each third threshold value) may be set as appropriate according to the corresponding second output value. Each second threshold value (third threshold value) may coincide with, or differ from, the other second threshold values (third threshold values). The same applies to the quality determination in steps S106 and S108 described later. After determining that the product R has a defect, the control unit 11 advances the processing to the next step S109.
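  • For the configuration in which the second discriminator outputs one value per defect type, the identification described above can be sketched as follows; the defect-type names are drawn from the examples in the text, while the function name and thresholds are illustrative assumptions:

```python
# Hypothetical subset of the defect types listed in the text
# (scratches, dirt, cracks, dents, dust, burrs, color unevenness).
DEFECT_TYPES = ["scratch", "dirt", "crack", "dent"]

def identify_defects(second_output_values, second_thresholds):
    """Given one second output value per defect type, return the types
    whose output value exceeds the corresponding second threshold."""
    return [name
            for name, value, threshold in zip(DEFECT_TYPES,
                                              second_output_values,
                                              second_thresholds)
            if value > threshold]
```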
  • Otherwise, the control unit 11 advances the processing to the following step S105.
  • the handling of the case where the second output value is equal to the second threshold value may be appropriately determined according to the embodiment. That is, the case where the second output value is equal to the second threshold value may be included in either the case where the second output value is larger than the second threshold value or the case where the second output value is smaller than the second threshold value.
  • step S105 the control unit 11 determines whether the second output value is smaller than the third threshold value by comparing the second output value with the third threshold value. If it is determined that the second output value is smaller than the third threshold, the control unit 11 proceeds to the next step S106, assuming that the second output value is included in the numerical range of the second determination criterion, The product R is determined to be a non-defective product. After determining that the product R is a non-defective product, the control unit 11 advances the processing to the next step S109.
  • Otherwise, the control unit 11 advances the processing to the following step S107.
  • the handling of the case where the second output value is equal to the third threshold value may be appropriately determined according to the embodiment. That is, the case where the second output value is equal to the third threshold value may be included in either the case where the second output value is smaller than the third threshold value or the case where the second output value is larger than the third threshold value.
  • Steps S107 and S108 When it is determined that the second output value acquired from the second discriminator 6 is smaller than the second threshold value and larger than the third threshold value, that is, when the second output value is not included in the numerical range of the second determination criterion, the control unit 11 executes the processing of steps S107 and S108.
  • step S107 the control unit 11 inputs the acquired target image data 121 to the first discriminator 5, and executes the arithmetic processing of the first discriminator 5. Thereby, the control unit 11 acquires from the first discriminator 5 the first output value corresponding to the result of determining the quality of the product R shown in the target image data 121.
  • control unit 11 sets the learned first discriminator 5 with reference to the first learning result data 224. Subsequently, the control unit 11 inputs the target image data 121 to the input layer 51 of the first discriminator 5 and performs firing determination of each neuron included in each of the layers 51 to 53 in order from the input side. Thereby, the control unit 11 acquires the first output value from the output layer 53 of the first discriminator 5. When acquiring the first output value, the control unit 11 advances the processing to the next step S108.
  • Step S108 In step S108, the control unit 11 compares the acquired first output value with the first threshold value to determine whether the first output value is larger than the first threshold value. When determining that the acquired first output value is larger than the first threshold value, the control unit 11 determines that the product R has a defect. On the other hand, when determining that the acquired first output value is smaller than the first threshold value, the control unit 11 determines that the product R is non-defective. After thus determining the quality of the product R, the control unit 11 advances the processing to the following step S109.
  • handling of a case where the first output value is equal to the first threshold value may be appropriately determined according to the embodiment. That is, the case where the first output value is equal to the first threshold value may be included in either the case where the first output value is larger than the first threshold value or the case where the first output value is smaller than the first threshold value.
  • As in steps S104 and S106, this quality determination may be simply determining whether or not the product R is defective, or may include, in addition to determining whether or not the product R is defective, identifying the type of the defect.
  • a method for identifying the type of defect may be appropriately determined according to the embodiment.
  • the first discriminator 5 may be configured to output a plurality of first output values corresponding to the types of defects.
  • In this case, by specifying a first output value larger than the first threshold value from the plurality of first output values, the control unit 11 can identify the type of defect present in the product R.
  • Each first threshold value may be set as appropriate according to the corresponding first output value. Each first threshold value may coincide with, or differ from, the other first threshold values.
  • First, the control unit 11 inputs the target image data 121 to the second discriminator 6 to acquire, from the second discriminator 6, the second output value corresponding to the result of determining the quality of the product R (step S102).
  • When the second output value is included in the numerical range of the second determination criterion, the control unit 11 adopts the pass/fail determination by the second discriminator 6, that is, determines the quality of the product R based on the second output value (steps S103 to S106).
  • On the other hand, when the second output value is not included in that numerical range, the control unit 11 does not adopt the pass/fail determination by the second discriminator 6, and instead acquires the first output value from the first discriminator 5 by inputting the target image data 121 to the first discriminator 5 (step S107). The control unit 11 then determines the quality of the product R based on the first output value acquired from the first discriminator 5 (step S108).
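  • The decision flow of steps S102 to S108 can be sketched as a single function; each discriminator is assumed here to be a callable mapping image data to a scalar output value (larger meaning more likely defective), and the handling of exact threshold equality is one possible choice among those the text leaves open:

```python
def judge_quality(target_image, first_classifier, second_classifier,
                  first_threshold, second_threshold, third_threshold):
    """Sketch of steps S102-S108 of the pass/fail determination."""
    second_output = second_classifier(target_image)        # step S102
    if second_output > second_threshold:                   # steps S103-S104
        return "defective"
    if second_output < third_threshold:                    # steps S105-S106
        return "non-defective"
    # Second output outside the second criterion: fall back to the
    # first discriminator (steps S107-S108).
    first_output = first_classifier(target_image)
    return "defective" if first_output > first_threshold else "non-defective"
```

  Only when the second output value lands between the third and second thresholds is the first discriminator consulted, which is what preserves the pre-relearning behavior near the threshold boundary.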
  • Step S109 the control unit 11 operates as the output unit 113, and outputs a result of determining whether the product R is good or bad.
  • the output format of the result of determining the quality of the product R may not be particularly limited, and may be appropriately selected according to the embodiment.
  • the control unit 11 may output the determination result of the product R to the output device 16 as it is. For example, if it is determined in step S104 or S108 that the product R has a defect, the control unit 11 may perform a warning for notifying that a defect has been found as the output process in step S109.
  • the warning destination may be the output device 16 or another information processing device such as an administrator's mobile terminal.
  • Further, when the product R is determined to be defective, the control unit 11 may perform, as the output process of step S109, a process of transmitting a command for conveying the defective product R along a different route on the production line.
  • control unit 11 ends the process according to this operation example.
  • control unit 11 may execute a series of steps S101 to S109 each time the product R conveyed on the production line enters the imaging range of the camera 31. Thereby, the inspection apparatus 1 can perform the appearance inspection of the product R conveyed on the production line.
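  • The repeated execution of steps S101 to S109 for each conveyed product can be sketched as follows; the helper names (`captured_frames`, `judge`, `report`) are hypothetical stand-ins for image acquisition (S101), the quality determination (S102 to S108), and the output process (S109):

```python
def run_inspection(captured_frames, judge, report):
    """Repeat the inspection procedure for each product image on the line."""
    results = []
    for frame in captured_frames:   # S101: acquire target image data
        result = judge(frame)       # S102-S108: pass/fail determination
        report(result)              # S109: output (e.g. warning, rerouting)
        results.append(result)
    return results

warnings = []
results = run_inspection(
    [0.9, 0.1],
    lambda v: "defective" if v > 0.5 else "non-defective",
    lambda r: warnings.append(r == "defective"))
```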
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of the learning device 2 according to this embodiment.
  • the processing procedure described below is merely an example, and each processing may be changed as much as possible. Further, in the processing procedure described below, steps can be omitted, replaced, and added as appropriate according to the embodiment.
  • the learning device 2 can construct the first discriminator 5 and the second discriminator 6 by the same processing procedure except that the learning data to be used differ. Below, the processing procedure for constructing the first discriminator 5 is described first.
  • step S201 the control unit 21 operates as the learning data acquisition unit 211, and acquires first learning data 221 including image data 222 for learning the quality of the product.
  • Since the learning model of the first discriminator 5 is configured by a neural network, the control unit 21 acquires first learning data 221 including a plurality of data sets each composed of a combination of image data 222 and correct answer data 223.
  • the method for acquiring the first learning data 221 is not particularly limited, and may be appropriately determined according to the embodiment.
  • For example, a camera is prepared, and products of the same type as the product R to be inspected, both products with defects (defective products) and products without defects (non-defective products), are photographed under various conditions using the prepared camera.
  • A learning data set can then be created by combining the obtained image data 222 with correct answer data 223 indicating the pass/fail (correct answer) of the product shown in the image data 222.
  • The specific contents of the correct answer data 223 may be determined as appropriate, for example so that the larger the value, the more likely the product is to be defective, and the smaller the value, the more likely the product is to be non-defective (that is, to have no defect).
  • the creation of the first learning data 221 may be performed by the learning device 2.
  • the control unit 21 may create the first learning data 221 according to the operation of the input device 24 by the operator. Further, the control unit 21 may automatically create the first learning data 221 by the processing of the learning program 82. By executing this creation process, in step S201, the control unit 21 can acquire the first learning data 221.
  • the creation of the first learning data 221 may be performed by an information processing device other than the learning device 2.
  • the first learning data 221 may be created manually by an operator or automatically by program processing.
  • the control unit 21 may acquire the first learning data 221 created by another information processing apparatus via the network, the storage medium 92, or the like.
  • The number of data sets constituting the first learning data 221 is not particularly limited, and may be determined as appropriate, for example to an extent sufficient for machine learning of the first discriminator 5. When the first learning data 221 is acquired, the control unit 21 advances the processing to the following step S202.
  • Step S202 In step S202, the control unit 21 operates as the learning processing unit 212, and constructs the learned first discriminator 5 that has acquired the ability to determine the quality of the product by performing machine learning using the first learning data 221 acquired in step S201.
  • Specifically, using each data set constituting the first learning data 221, the control unit 21 performs machine learning of the neural network so that, when the image data 222 is input to the input layer 51, an output value corresponding to the correct answer data 223 is output from the output layer 53.
  • the control unit 21 prepares a neural network (first discriminator 5 before learning) to be subjected to learning processing.
  • Each parameter such as the configuration of the neural network to be prepared, the initial value of the connection weight between the neurons, and the initial value of the threshold value of each neuron may be given by a template or by an operator input.
  • Alternatively, when performing re-learning, the control unit 21 may prepare the neural network before learning based on the learning result data of another discriminator.
  • Next, the control unit 21 executes the learning process of the neural network using, as input data, the image data 222 included in each data set of the first learning data 221 acquired in step S201 and, as teacher data, the correct answer data 223. A stochastic gradient descent method or the like may be used for this learning process.
  • control unit 21 inputs the image data 222 to the input layer 51 and performs firing determination of each neuron included in each of the layers 51 to 53 in order from the input side. Thereby, the control unit 21 obtains an output value from the output layer 53. Next, the control unit 21 calculates an error between the output value obtained from the output layer 53 and the value corresponding to the correct answer indicated by the correct answer data 223. Subsequently, the control unit 21 uses the error back-propagation method to calculate the weight of the connection between the neurons and the error of the threshold value of each neuron using the error of the calculated output value. Then, the control unit 21 updates the values of the connection weights between the neurons and the threshold values of the neurons based on the calculated errors.
  • The control unit 21 repeats this series of processing for each data set until the output value obtained from the output layer 53 by inputting the image data 222 to the input layer 51 matches the value corresponding to the correct answer indicated by the correct answer data 223 associated with that image data 222. In this way, the control unit 21 can construct a learned neural network (that is, the first discriminator 5) that outputs an output value corresponding to the correct answer indicated by the correct answer data 223.
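  • As a greatly simplified stand-in for the learning process described above (the embodiment uses a multi-layer network trained with error back-propagation; here a single sigmoid neuron trained by gradient descent illustrates the same principle of driving the output toward the correct answer data), a sketch might look like this. The toy data and hyperparameters are assumptions:

```python
import numpy as np

def train(images, answers, epochs=500, lr=0.5, seed=0):
    """Train one sigmoid neuron so its output for each input image
    approaches the associated correct answer."""
    rng = np.random.default_rng(seed)
    x = np.asarray(images, dtype=float)
    y = np.asarray(answers, dtype=float)
    w = rng.normal(scale=0.1, size=x.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = 1.0 / (1.0 + np.exp(-(x @ w + b)))
        err = out - y                      # error against correct answers
        w -= lr * (x.T @ err) / len(y)     # update connection weights
        b -= lr * err.mean()               # update threshold (bias)
    return w, b

# Toy data: correct answer 1 ("defective") for bright images, 0 for dark.
images = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
answers = [1, 1, 0, 0]
w, b = train(images, answers)
outputs = 1.0 / (1.0 + np.exp(-(np.asarray(images) @ w + b)))
```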
  • the control unit 21 advances the process to the next step S203.
  • Step S203 In step S203, the control unit 21 operates as the learning processing unit 212, and stores information indicating the configuration of the first discriminator 5 constructed by machine learning, the connection weights between the neurons, and the threshold value of each neuron in the storage unit 22 as the first learning result data 224. The control unit 21 thereby completes the learning process of the first discriminator 5.
  • the learning device 2 can construct the second discriminator 6 in the same manner as the first discriminator 5 except that the learning data to be used is different from the first discriminator 5.
  • Step S201 That is, in step S201, the control unit 21 operates as the learning data acquisition unit 211, and acquires the second learning data 226 including the first learning data 221 and additional image data 227 for learning the quality of the product.
  • Since the learning model of the second discriminator 6 is configured by a neural network, the control unit 21 acquires the second learning data 226 configured by the first learning data 221 and a plurality of data sets each composed of a combination of additional image data 227 and correct answer data 228.
  • the additional image data 227 and correct answer data 228 of the second learning data 226 may be acquired by the same method as the first learning data 221.
  • The control unit 21 can acquire the second learning data 226 by executing the data set creation process. Alternatively, when the second learning data 226 is created by another information processing device, the control unit 21 can acquire the second learning data 226 created by the other information processing device via the network, the storage medium 92, or the like. When the second learning data 226 is acquired, the control unit 21 advances the processing to the following step S202.
  • Step S202 In step S202, the control unit 21 operates as the learning processing unit 212, and constructs the learned second discriminator 6 that has acquired the ability to determine the quality of the product by performing machine learning using the second learning data 226 acquired in step S201.
  • Specifically, using each data set constituting the second learning data 226, the control unit 21 performs machine learning of the neural network so that, when the image data (222, 227) is input to the input layer 61, an output value corresponding to the correct answer data (223, 228) is output from the output layer 63.
  • Each parameter, such as the configuration of the neural network to be prepared (the second discriminator 6 before learning), the initial values of the connection weights between neurons, and the initial value of the threshold value of each neuron, may be given by a template or by an operator input. Further, when performing re-learning or additional learning of the first discriminator 5, the control unit 21 may prepare the neural network before learning based on the first learning result data 224.
  • Next, the control unit 21 executes the above learning process of the neural network using, as input data, the image data (222, 227) included in each data set of the second learning data 226 acquired in step S201 and, as teacher data, the correct answer data (223, 228).
  • When re-learning or additional learning is performed starting from the first discriminator 5, machine learning using the first learning data 221 (image data 222 and correct answer data 223) may be omitted. In this way, the control unit 21 can construct a learned neural network (that is, the second discriminator 6) that, when the image data (222, 227) is input, outputs an output value corresponding to the correct answer indicated by the correct answer data (223, 228) associated with the input image data (222, 227). When the learning process of the second discriminator 6 is completed, the control unit 21 advances the processing to the next step S203.
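  • The preparation of the network before learning, either from a template or warm-started from the first learning result data when re-learning or additional learning is performed, can be sketched as follows; the parameter layout and function name are illustrative assumptions:

```python
import copy

def prepare_network_before_learning(template, first_result_data=None):
    """Return initial parameters for the second discriminator: warm-start
    from the first learning result data when given, otherwise use the
    template (or operator-supplied) initial values."""
    source = first_result_data if first_result_data is not None else template
    return copy.deepcopy(source)   # copy so learning does not alter the source

template = {"weights": [0.0, 0.0], "thresholds": [0.0]}
learned = {"weights": [0.3, -0.7], "thresholds": [0.1]}

fresh = prepare_network_before_learning(template)
warm = prepare_network_before_learning(template, first_result_data=learned)
```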
  • Step S203 In step S203, the control unit 21 operates as the learning processing unit 212, and stores information indicating the configuration of the second discriminator 6 constructed by machine learning, the connection weights between the neurons, and the threshold value of each neuron in the storage unit 22 as the second learning result data 229. The control unit 21 thereby completes the learning process of the second discriminator 6.
  • control unit 21 may transfer the created first learning result data 224 and second learning result data 229 to the inspection apparatus 1 after the processing of step S203 is completed.
  • control unit 21 may store the created first learning result data 224 and second learning result data 229 in a data server such as NAS (Network Attached Storage).
  • the inspection apparatus 1 may acquire the first learning result data 224 and the second learning result data 229 from this data server.
  • the first learning result data 224 and the second learning result data 229 may be incorporated in the inspection apparatus 1 in advance.
  • the learning device 2 constructs the first discriminator 5 by executing machine learning using the first learning data 221 through a series of processes in steps S201 to S203.
  • the learning device 2 constructs the second discriminator 6 by executing machine learning using the second learning data 226 including the first learning data 221 and the additional image data 227.
  • By preparing, as the additional image data 227, unknown cases that are not included in the first learning data 221, the learning device 2 can construct a second discriminator 6 capable of handling cases that the first discriminator 5 cannot handle when determining the quality of the product R. Thereby, in the inspection apparatus 1, the quality of the product R can be determined by using the second discriminator 6 in cases where the first discriminator 5 cannot determine the quality.
  • In the present embodiment, the numerical range of the second determination criterion, which determines whether or not to adopt the determination result by the second discriminator 6, is set narrower than the numerical range of the first determination criterion.
  • Thereby, when the second output value is included in the numerical range of the second determination criterion, the inspection apparatus 1 according to the present embodiment can determine the quality of the product R based on the second output value in steps S104 and S106. Otherwise, the inspection apparatus 1 does not adopt the determination result by the second discriminator, and can determine the quality of the product R based on the first output value obtained from the first discriminator 5 in steps S107 and S108.
• in the range near the threshold boundary, where identification performance can deteriorate due to over-learning, the determination result from before re-learning or additional learning is preserved, and fluctuations in the determination result caused by re-learning or additional learning can be suppressed.
• by setting the second threshold to a value larger than the first threshold and the third threshold to a value smaller than the first threshold, the determination result from before re-learning or additional learning can be appropriately ensured around the threshold at which over-learning may degrade identification performance. Therefore, according to the present embodiment, even when the discrimination performance of the second discriminator 6 becomes worse than that of the first discriminator 5 due to re-learning or additional learning, the reliability of the pass/fail judgment can be prevented from being impaired.
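The two-stage judgment described above (steps S102 to S108) can be sketched as follows. This is a minimal illustration rather than the patented implementation; the threshold values and the convention "larger output = more likely defective" are hypothetical, with the second threshold above and the third threshold below the first threshold as in the embodiment:

```python
def judge_quality(first_out: float, second_out: float,
                  thr1: float = 0.5, thr2: float = 0.8, thr3: float = 0.2) -> str:
    """Two-stage pass/fail judgment (illustrative values only).

    The second criterion's range [>= thr2] U [<= thr3] stays away from the
    first threshold thr1, so near-boundary cases fall back to the first
    (pre-relearning) discriminator.
    """
    assert thr3 < thr1 < thr2, "second criterion must avoid the first threshold"
    if second_out > thr2:        # steps S103/S104: clearly defective
        return "defective"
    if second_out < thr3:        # steps S105/S106: clearly non-defective
        return "non-defective"
    # steps S107/S108: near the boundary, defer to the first discriminator
    return "defective" if first_out > thr1 else "non-defective"
```

In the apparatus the first output value is only computed when actually needed (step S107); here both output values are passed in for simplicity.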
  • each discriminator (5, 6) is configured by a fully connected neural network having a multilayer structure.
  • the configuration of each discriminator (5, 6) may not be limited to such an example, and may be appropriately selected according to the embodiment.
  • each discriminator (5, 6) may be configured by a convolutional neural network, a recursive neural network, or the like.
  • a neural network is adopted as a learning model for each classifier (5, 6).
  • the learning model of each discriminator (5, 6) may not be limited to such an example, and may be appropriately selected according to the embodiment.
• as the learning model of each discriminator (5, 6), for example, a support vector machine, a self-organizing map, or a learning model that performs machine learning by reinforcement learning may be employed.
  • each correct answer data (223, 228) may be omitted in each learning data (221, 226).
  • each learning result data (224, 229) includes information indicating the configuration of the neural network.
• however, the configuration of each learning result data (224, 229) need not be limited to such an example; it may be determined as appropriate according to the embodiment, as long as the learning result data (224, 229) can be used to set each learned discriminator (5, 6). For example, when the configuration of the neural network to be used is shared by each device, each learning result data (224, 229) need not include information indicating the configuration of the neural network.
• in step S107, the control unit 11 performs the arithmetic processing of the first discriminator 5.
• thereby, the first output value corresponding to the quality determination result by the first discriminator 5 is acquired.
• however, the processing procedure of the inspection apparatus 1 need not be limited to such an example.
• for example, the control unit 11 may execute the arithmetic processing of the first discriminator 5 and acquire the first output value corresponding to the pass/fail determination result by the first discriminator 5 at a different point in the procedure.
• the first output value obtained from the first discriminator 5 and the second output value obtained from the second discriminator 6 each indicate that the product R has a defect as the value increases, and that the product R is a non-defective product as the value decreases.
  • the correspondence relationship between the first output value and the second output value and the quality of the product R may not be limited to such an example.
• for example, the first output value obtained from the first discriminator 5 and the second output value obtained from the second discriminator 6 may each be set to indicate that the product R is a non-defective product as the value increases, and that the product R has a defect as the value decreases.
• in this case, the numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold, and the numerical range of the second determination criterion may be composed of a range equal to or greater than the second threshold and a range equal to or less than a third threshold smaller than the second threshold.
• when it is determined that the second output value acquired from the second discriminator 6 is larger than the second threshold, the second output value is included in the numerical range of the second determination criterion, and the product R may be determined to be non-defective.
• when the control unit 11 determines in step S105 that the second output value acquired from the second discriminator 6 is smaller than the third threshold, the second output value is likewise included in the numerical range of the second determination criterion, and it may be determined that the product R has a defect.
• on the other hand, when the control unit 11 determines that the second output value is not included in the numerical range of the second determination criterion, the first output value may be acquired from the first discriminator 5 by executing the process of step S107.
• in step S108, the control unit 11 may determine whether the first output value is larger than the first threshold by comparing the first output value with the first threshold.
• when it is determined that the first output value is larger than the first threshold, the control unit 11 may determine that the product R is a non-defective product.
• on the other hand, when it is determined that the first output value is smaller than the first threshold, the control unit 11 may determine that the product R has a defect.
• in the above embodiment, the first output value obtained from the first discriminator 5 and the second output value obtained from the second discriminator 6 each indicate the presence or absence of a defect according to the magnitude of the value.
  • the format of the first output value and the second output value may not be limited to such an example.
• for example, the first output value obtained from the first discriminator 5 and the second output value obtained from the second discriminator 6 may each indicate the probability that the product R is defective (defect rate) or the probability that the product R is a non-defective product (non-defective product rate).
  • FIG. 9 illustrates another example of determination criteria for each classifier (5, 6).
  • the numerical range of the first determination criterion is configured by a range greater than or equal to the first threshold and a range less than the first threshold.
  • the numerical range of the second determination criterion is configured by a range equal to or greater than the second threshold.
  • the second threshold value is set to a value larger than the first threshold value.
  • the respective threshold values are set in such a relationship.
  • the relationship between the threshold values may not be limited to such an example.
• for example, the first threshold and the second threshold may coincide with each other, or the second threshold may be smaller than the first threshold.
  • Each threshold value may be determined as appropriate.
• when the first output value obtained from the first discriminator 5 and the second output value obtained from the second discriminator 6 indicate the probability that the product R is defective, the control unit 11 determines the quality of the product R as follows, instead of the processing of steps S103 to S106. That is, the control unit 11 determines whether the second output value is larger than the second threshold by comparing the second output value acquired from the second discriminator 6 with the second threshold. When it is determined that the second output value is larger than the second threshold, the control unit 11 determines that the product R is defective, since the second output value is included in the numerical range of the second determination criterion, and advances the process to the next step S109.
• on the other hand, when it is determined that the second output value is smaller than the second threshold, in step S107 the control unit 11 acquires the first output value from the first discriminator 5 by inputting the target image data 121 into the first discriminator 5, as described above.
• in step S108, the control unit 11 compares the acquired first output value with the first threshold and determines whether the acquired first output value is larger than the first threshold. When it is determined that the first output value is larger than the first threshold, the control unit 11 determines that the product R has a defect.
• conversely, when it is determined that the first output value is smaller than the first threshold, the control unit 11 may determine that the product R is a non-defective product, or may determine that the quality of the product R is unknown. When the determination of the quality of the product R is thus completed, the control unit 11 advances the process to the next step S109.
  • the handling of the case where the second output value is equal to the second threshold and the case where the first output value is equal to the first threshold may be appropriately determined according to the embodiment. That is, the case where the second output value is equal to the second threshold value may be included in either the case where the second output value is larger than the second threshold value or the case where the second output value is smaller than the second threshold value.
  • the case where the first output value is equal to the first threshold value may be included in either the case where the first output value is larger than the first threshold value or the case where the first output value is smaller than the first threshold value.
• on the other hand, when the first output value and the second output value indicate the probability that the product R is a non-defective product and the second output value is larger than the second threshold, the control unit 11 determines that the second output value is included in the numerical range of the second determination criterion and that the product R is a non-defective product.
• otherwise, the control unit 11 determines that the second output value is not included in the numerical range of the second determination criterion, and acquires the first output value from the first discriminator 5 by inputting the target image data 121 into the first discriminator 5. When it is determined that the acquired first output value is larger than the first threshold, the control unit 11 determines that the product R is non-defective. On the other hand, when it is determined that the acquired first output value is smaller than the first threshold, the control unit 11 may determine that the product R has a defect, or may determine that the quality of the product R is unknown.
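The one-sided criterion of FIG. 9 can be sketched in the same style. The threshold values below are hypothetical, the outputs are taken to be defect probabilities, and the relation thr2 > thr1 follows the figure (though the text notes the thresholds may also coincide or be reversed):

```python
def judge_by_defect_probability(first_out: float, second_out: float,
                                thr1: float = 0.5, thr2: float = 0.7) -> str:
    """One-sided variant: the second criterion is a single range [>= thr2].

    If the second discriminator's defect probability clears thr2, its
    result is adopted; otherwise the first discriminator decides.
    """
    if second_out > thr2:        # second criterion met: adopt its result
        return "defective"
    # fall back to the first discriminator (steps S107/S108)
    return "defective" if first_out > thr1 else "non-defective"
```

The "unknown" outcome mentioned in the text could be returned instead of "non-defective" in the fallback branch, depending on the embodiment.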
• each discriminator (5, 6) may be configured to output a plurality of output values.
• for example, one output value may indicate the probability that the product R is a non-defective product, while the other output values indicate the probability that the product R is defective (or that the product R has a particular type of defect).
• in this case as well, the control unit 11 can determine the quality of the product R based on the output values of the discriminators (5, 6).
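When a discriminator outputs several values, one straightforward (hypothetical) way to use them is to treat the vector as class scores with one class per defect type. The labels and confidence floor below are illustrative only, not part of the patent:

```python
def judge_from_class_scores(scores,
                            labels=("non-defective", "scratch", "dent"),
                            min_confidence=0.6):
    """Pick the highest-scoring class; report 'unknown' on low confidence.

    `scores` is assumed to be a probability vector (e.g. softmax output)
    aligned with `labels`. All names here are hypothetical.
    """
    best = max(range(len(scores)), key=lambda i: scores[i])
    if scores[best] < min_confidence:
        return "unknown"
    return labels[best]
```

A per-class threshold table could replace the single `min_confidence` floor if different defect types warrant different strictness.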
  • FIGS. 10 and 11 schematically illustrate an example of a hardware configuration and a software configuration of the image identification device 1A according to the present modification.
• the image identification device 1A according to the present modification may be configured in the same manner as the inspection device 1, except that the data to be processed is image data in which some subject is captured rather than image data in which the product is captured. That is, as shown in FIGS. 10 and 11, the hardware configuration and software configuration of the image identification device 1A may be the same as those of the inspection device 1, respectively.
  • the state of the subject and the subject to be identified need not be particularly limited, and may be appropriately selected according to the embodiment.
• the subject may be, for example, a person's face, a person's body, a workpiece to be worked on, or the like.
• when the subject is a face, the state to be identified may be, for example, the type of facial expression, the state of facial parts, the identity of the person whose face is shown, and the like. Identification of the person whose face is shown may be performed for face authentication.
  • the state to be identified may be, for example, a body pose.
  • the state to be identified may be, for example, the position and posture of the work.
  • the storage unit 12 of the image identification device 1A stores various information such as an image identification program 81A, first learning result data 224A, and second learning result data 229A.
  • the image identification program 81A is a program for causing the image identification apparatus 1A to perform information processing for determining the state of the subject by the same processing procedure as that of the inspection apparatus 1, and includes a series of instructions for the information processing.
  • the first learning result data 224A is data for setting the learned first discriminator 5A.
  • the second learning result data 229A is data for setting the learned second discriminator 6A.
  • the first discriminator 5A is constructed by machine learning using first learning data composed of image data for performing learning for identifying the state of the subject.
• the second discriminator 6A is constructed by machine learning using second learning data composed of the first learning data and additional image data for performing learning for identifying the state of the subject.
  • Each discriminator (5A, 6A) is constructed so as to acquire the ability to identify the state of the subject in the target image data. Machine learning of each discriminator (5A, 6A) may be executed by the learning device 2 in the same manner as in the above embodiment.
• the first discriminator 5A is constructed, for example, by machine learning using first learning data composed of image data showing a face and correct answer data indicating the correct answer for identifying the state of the face shown in the image data.
• the second discriminator 6A is constructed, for example, by machine learning using second learning data composed of additional image data showing a face, correct answer data indicating the correct answer for identifying the state of the face shown in the additional image data, and the first learning data.
  • the image identification device 1A is connected to the camera 31 via the external interface 14 as in the above embodiment.
  • the camera 31 is appropriately arranged at a place where the subject whose state is to be determined can be taken.
• for example, the camera 31 may be disposed at a location where the person who is the subject may be present.
  • the camera 31 may be arranged toward a place where the work can exist.
  • the image identification device 1A operates as a computer including the data acquisition unit 111, the determination unit 112, and the output unit 113 as software modules by executing the image identification program 81A by the control unit 11.
  • the data acquisition unit 111 acquires target image data 121A in which a subject to be identified is captured.
  • the determination unit 112 includes a first discriminator 5A and a second discriminator 6A, and uses the first discriminator 5A and the second discriminator 6A to determine the state of the subject in the acquired target image data 121A.
  • the output unit 113 outputs the result of determining the state of the subject.
• a numerical range for determining the state of the subject is set as the first determination criterion for the first output value obtained from the first discriminator 5A by inputting the target image data 121A into the first discriminator 5A.
• likewise, a numerical range for determining the state of the subject is set as the second determination criterion for the second output value obtained from the second discriminator 6A by inputting the target image data 121A into the second discriminator 6A. The numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
• the determination unit 112 acquires the second output value from the second discriminator 6A by inputting the target image data 121A into the second discriminator 6A.
• when the second output value is included in the numerical range of the second determination criterion, the determination unit 112 determines the state of the subject based on the second output value.
• on the other hand, when the second output value is not included in the numerical range of the second determination criterion, the determination unit 112 acquires the first output value from the first discriminator 5A by inputting the target image data 121A into the first discriminator 5A. Then, the determination unit 112 determines the state of the subject based on the acquired first output value.
  • the image identification device 1A can determine the state of the subject by a processing procedure substantially similar to that of the inspection device 1.
• in step S101, the control unit 11 operates as the data acquisition unit 111 and acquires target image data 121A in which a subject whose state is to be determined is captured.
  • the control unit 11 operates as the determination unit 112, and determines the state of the subject using the first discriminator 5A and the second discriminator 6A.
• for example, the first output value of the first discriminator 5A and the second output value of the second discriminator 6A may each indicate that the subject is in the first state as the value increases, and that the subject is in the second state as the value decreases.
  • the numerical range of the first determination criterion may be configured by a range greater than or equal to the first threshold and a range less than the first threshold.
  • the numerical range of the second determination criterion may be configured by a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold smaller than the second threshold.
• the first state and the second state may be determined as appropriate according to the embodiment.
• for example, when identifying facial expressions, the first state may correspond to a specific facial expression (for example, a smile) and the second state to any other facial expression. The same applies to the other examples.
• in step S102, the control unit 11 sets the learned second discriminator 6A with reference to the second learning result data 229A. Subsequently, the control unit 11 inputs the target image data 121A into the second discriminator 6A and executes the arithmetic processing of the second discriminator 6A. Thereby, the control unit 11 acquires from the second discriminator 6A the second output value corresponding to the result of determining the state of the subject in the target image data 121A.
• in step S103, the control unit 11 determines whether the second output value is larger than the second threshold by comparing the second output value with the second threshold. If it is determined that the second output value is larger than the second threshold, the control unit 11 proceeds to the next step S104 and, since the second output value is included in the numerical range of the second determination criterion, determines that the subject in the target image data 121A is in the first state. On the other hand, when it is determined that the second output value is smaller than the second threshold, the control unit 11 advances the process to the next step S105. Handling of the case where the second output value is equal to the second threshold may be selected as appropriate according to the embodiment.
• in step S105, the control unit 11 determines whether the second output value is smaller than the third threshold by comparing the second output value with the third threshold. If it is determined that the second output value is smaller than the third threshold, the control unit 11 proceeds to the next step S106 and, since the second output value is included in the numerical range of the second determination criterion, determines that the subject in the target image data 121A is in the second state. On the other hand, when it is determined that the second output value is larger than the third threshold, the control unit 11 advances the process to the next step S107. Handling of the case where the second output value is equal to the third threshold may be determined as appropriate according to the embodiment.
• in step S107, the control unit 11 sets the learned first discriminator 5A with reference to the first learning result data 224A. Subsequently, the control unit 11 inputs the target image data 121A into the first discriminator 5A and executes the arithmetic processing of the first discriminator 5A. Thereby, the control unit 11 acquires from the first discriminator 5A the first output value corresponding to the result of determining the state of the subject in the target image data 121A.
• in step S108, the control unit 11 determines whether the acquired first output value is larger than the first threshold by comparing the first output value with the first threshold. When it is determined that the acquired first output value is larger than the first threshold, the control unit 11 determines that the subject in the target image data 121A is in the first state. On the other hand, when it is determined that the acquired first output value is smaller than the first threshold, the control unit 11 determines that the subject in the target image data 121A is in the second state.
  • the handling of the case where the first output value is equal to the first threshold value may be appropriately determined according to the embodiment.
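The same cascade carries over unchanged to the image identification device; only the labels differ. A minimal sketch using the smile/non-smile example from the text, with hypothetical threshold values:

```python
def judge_subject_state(first_out: float, second_out: float,
                        thr1: float = 0.5, thr2: float = 0.8, thr3: float = 0.2,
                        first_state: str = "smiling",
                        second_state: str = "not smiling") -> str:
    """Steps S102-S108 of the image identification device 1A (sketch only)."""
    assert thr3 < thr1 < thr2
    if second_out > thr2:        # second criterion met: first state
        return first_state
    if second_out < thr3:        # second criterion met: second state
        return second_state
    # near the boundary: defer to the first discriminator 5A
    return first_state if first_out > thr1 else second_state
```

The state labels are parameters precisely because, as the text notes, the first and second states may be determined as appropriate for the embodiment (expressions, body poses, workpiece postures, and so on).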
• in step S109, the control unit 11 operates as the output unit 113 and outputs the result of determining the state of the subject.
  • the output format of the result of determining the state of the subject is not particularly limited, and may be appropriately selected according to the embodiment.
  • the control unit 11 may output the result of determining the state of the subject to the output device 16 as it is.
  • a predetermined output process may be associated with the result of determining the state of the subject.
• for example, when it is determined that the subject's face is not smiling (is in the second state), the control unit 11 may, as the output process, transmit an e-mail notifying the user's portable terminal.
• as another example, the first output value of the first discriminator 5A and the second output value of the second discriminator 6A may each indicate the probability that the subject captured in the target image data 121A is in a predetermined state.
• in this case, the numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold, and the numerical range of the second determination criterion may be composed of a range equal to or greater than the second threshold.
• in this case, instead of the processing of steps S103 to S106, the control unit 11 of the image identification device 1A may determine whether the second output value acquired from the second discriminator 6A is larger than the second threshold by comparing the two. When it is determined that the second output value is larger than the second threshold, the control unit 11 may determine that the subject captured in the target image data 121A is in the predetermined state, since the second output value is included in the numerical range of the second determination criterion.
• on the other hand, if it is determined that the second output value is smaller than the second threshold, the control unit 11 may advance the process to step S107, since the second output value is not included in the numerical range of the second determination criterion. Handling of the case where the second output value is equal to the second threshold may be selected as appropriate according to the embodiment.
  • the control unit 11 may acquire the first output value from the first discriminator 5A by inputting the target image data 121A to the first discriminator 5A.
• in step S108, the control unit 11 may compare the acquired first output value with the first threshold and determine whether it is larger than the first threshold. When it is determined that the first output value is larger than the first threshold, the control unit 11 may determine that the subject captured in the target image data 121A is in the predetermined state. On the other hand, when it is determined that the first output value is smaller than the first threshold, the control unit 11 may determine that the subject is not in the predetermined state, or may determine that the state of the subject is unknown. Handling of the case where the first output value is equal to the first threshold may be selected as appropriate according to the embodiment.
• the identification device 1B according to the present modification may be configured in the same manner as the inspection device 1, except that the data to be processed is another type of data containing some feature rather than image data in which the product is captured. That is, as shown in FIGS. 12 and 13, the hardware configuration and software configuration of the identification device 1B may be the same as those of the inspection device 1, respectively.
  • the storage unit 12 of the identification device 1B stores various information such as an identification program 81B, first learning result data 224B, and second learning result data 229B.
  • the identification program 81B is a program for causing the identification device 1B to perform information processing for determining data characteristics according to the same processing procedure as the inspection device 1, and includes a series of instructions for the information processing.
  • the first learning result data 224B is data for setting the learned first discriminator 5B.
  • the second learning result data 229B is data for setting the learned second discriminator 6B.
  • the data to be processed may include all kinds of data that can be analyzed by the classifier.
  • the feature identified from the target data may include any feature that can be identified from the data.
• when the target data is sound data, the identified feature may be, for example, whether or not a specific sound (for example, machine noise) is included.
• when the target data is numerical data or text data related to biological data such as activity data, the identified feature may be, for example, the state of the subject (for example, whether or not he or she is healthy).
• when the target data is numerical data or text data such as the drive amount of a machine, the identified feature may be, for example, the state of the machine (for example, whether or not the machine is in a predetermined state).
  • the first discriminator 5B is constructed by machine learning using first learning data constituted by data for performing learning for identifying features.
  • the second discriminator 6B is constructed by machine learning using second learning data composed of first learning data and additional data for performing learning for identifying features.
  • Each classifier (5B, 6B) is constructed to acquire the ability to identify features contained in the data of interest.
  • Machine learning of each discriminator (5B, 6B) may be executed by the learning device 2 in the same manner as in the above embodiment.
• the first discriminator 5B is constructed, for example, by machine learning using first learning data composed of sound data and correct answer data indicating the correct answer for identifying the feature included in the sound data.
• the second discriminator 6B is constructed, for example, by machine learning using second learning data composed of the first learning data, additional sound data, and correct answer data indicating the correct answer for identifying the feature included in the additional sound data.
  • the identification device 1B is connected to the measurement device 31B via the external interface 14.
  • the measuring device 31B is appropriately configured so as to be able to acquire target data.
  • the type of the measuring device 31B may be appropriately determined according to the data to be processed.
  • the measurement device 31B is, for example, a microphone.
  • the measurement device 31B is a device configured to be able to measure biological information, such as an activity meter or a blood pressure monitor.
  • the measuring device 31B is a device configured to be able to measure a target physical quantity such as an encoder, for example.
  • the arrangement of the measurement device 31B may be appropriately determined according to the embodiment.
  • the identification device 1B operates as a computer including the data acquisition unit 111, the determination unit 112, and the output unit 113 as software modules by executing the identification program 81B by the control unit 11.
  • the data acquisition unit 111 acquires target data 121B including a feature to be identified.
  • the determination unit 112 includes a first discriminator 5B and a second discriminator 6B, and uses the first discriminator 5B and the second discriminator 6B to determine a feature appearing in the acquired target data 121B.
• the output unit 113 outputs the result of determining the feature.
• a numerical range for determining the feature is set as the first determination criterion for the first output value obtained from the first discriminator 5B by inputting the target data 121B into the first discriminator 5B.
• likewise, a numerical range for determining the feature is set as the second determination criterion for the second output value obtained from the second discriminator 6B by inputting the target data 121B into the second discriminator 6B. The numerical range of the second determination criterion is set narrower than the numerical range of the first determination criterion.
• the determination unit 112 acquires the second output value from the second discriminator 6B by inputting the target data 121B into the second discriminator 6B.
• when the second output value is included in the numerical range of the second determination criterion, the determination unit 112 determines the feature appearing in the target data 121B based on the second output value.
• on the other hand, when the second output value is not included in the numerical range of the second determination criterion, the determination unit 112 acquires the first output value from the first discriminator 5B by inputting the target data 121B into the first discriminator 5B.
• then, the determination unit 112 determines the feature appearing in the target data 121B based on the acquired first output value.
  • the identification device 1B can determine a feature appearing in the target data 121B by a processing procedure substantially similar to that of the inspection device 1.
• in step S101, the control unit 11 operates as the data acquisition unit 111 and acquires target data 121B including a feature to be determined.
  • the control unit 11 acquires the target data 121B from the measurement device 31B via the external interface 14.
  • the route for acquiring the target data 121B may not be limited to such an example, and may be appropriately selected according to the embodiment.
  • the control unit 11 operates as the determination unit 112, and determines the feature appearing in the target data 121B using the first discriminator 5B and the second discriminator 6B.
• for example, the first output value of the first discriminator 5B and the second output value of the second discriminator 6B may each indicate that the first feature appears in the target data 121B as the value increases, and that the second feature appears in the target data 121B as the value decreases.
  • the numerical range of the first determination criterion may be configured by a range greater than or equal to the first threshold and a range less than the first threshold.
  • the numerical range of the second determination criterion may be configured by a range equal to or greater than the second threshold and a range equal to or smaller than the third threshold smaller than the second threshold.
  • the first feature and the second feature may be appropriately determined according to the embodiment.
• for example, when the target data 121B is sound data, the first feature may correspond to machine noise being included and the second feature to machine noise not being included.
• likewise, when the target data 121B is numerical data or text data related to biometric data, the first feature may correspond to the subject being healthy and the second feature to the subject being abnormal. The same applies to the other examples.
  • in step S102, the control unit 11 sets the learned second discriminator 6B with reference to the second learning result data 229B. Subsequently, the control unit 11 inputs the target data 121B to the second discriminator 6B and executes the arithmetic processing of the second discriminator 6B. The control unit 11 thereby acquires, from the second discriminator 6B, the second output value corresponding to the result of determining the feature appearing in the target data 121B.
  • in step S103, the control unit 11 compares the second output value with the second threshold to determine whether the second output value is larger than the second threshold. If it is determined that the second output value is larger than the second threshold, the control unit 11 regards the second output value as included in the numerical range of the second determination criterion, proceeds to the next step S104, and determines that the first feature appears in the target data 121B. On the other hand, if it is determined that the second output value is smaller than the second threshold, the control unit 11 advances the process to the following step S105. The handling of the case where the second output value is equal to the second threshold may be selected as appropriate according to the embodiment.
  • in step S105, the control unit 11 compares the second output value with the third threshold to determine whether the second output value is smaller than the third threshold. If it is determined that the second output value is smaller than the third threshold, the control unit 11 regards the second output value as included in the numerical range of the second determination criterion, proceeds to the next step S106, and determines that the second feature appears in the target data 121B. On the other hand, if it is determined that the second output value is larger than the third threshold, the control unit 11 advances the process to the following step S107. The handling of the case where the second output value is equal to the third threshold may be determined as appropriate according to the embodiment.
  • in step S107, the control unit 11 sets the learned first discriminator 5B with reference to the first learning result data 224B. Subsequently, the control unit 11 inputs the target data 121B to the first discriminator 5B and executes the arithmetic processing of the first discriminator 5B. The control unit 11 thereby acquires, from the first discriminator 5B, the first output value corresponding to the result of determining the feature appearing in the target data 121B.
  • in step S108, the control unit 11 compares the acquired first output value with the first threshold to determine whether it is larger than the first threshold. If the first output value is larger than the first threshold, the control unit 11 determines that the first feature appears in the target data 121B; otherwise, the control unit 11 determines that the second feature appears in the target data 121B.
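The fall-back procedure of steps S102 through S108 can be summarized as a small decision function. This is a hedged sketch under assumed names: `second_disc` and `first_disc` stand in for the learned discriminators 6B and 5B, and an output exactly equal to a threshold is routed to the fall-back path here, although the disclosure leaves tie handling to the embodiment:

```python
def decide_feature(target_data, first_disc, second_disc,
                   t1: float, t2: float, t3: float) -> str:
    """Return "first" or "second" for the feature appearing in the data.

    The second discriminator is consulted first (steps S102-S106); if
    its output falls between the third and second thresholds, the
    decision is deferred to the first discriminator (steps S107-S108).
    """
    assert t3 < t2, "the third threshold must be smaller than the second"
    v2 = second_disc(target_data)                 # step S102
    if v2 > t2:                                   # steps S103-S104
        return "first"
    if v2 < t3:                                   # steps S105-S106
        return "second"
    v1 = first_disc(target_data)                  # step S107
    return "first" if v1 > t1 else "second"       # step S108
```

With t1 = 0.5, t2 = 0.8, and t3 = 0.2, a second output of 0.9 immediately yields the first feature, 0.1 yields the second feature, and 0.5 defers the decision to the first discriminator.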
  • in step S109, the control unit 11 operates as the output unit 113 and outputs the result of determining the feature included in the target data 121B.
  • the output format of the determined result is not particularly limited and may be appropriately selected according to the embodiment.
  • the control unit 11 may output the result of determining the characteristics included in the target data 121B to the output device 16 as it is.
  • the control unit 11 may execute a predetermined output process according to the determination result.
  • for example, when the target data 121B is sound data whose feature to be determined is whether mechanical noise is included, and it is determined that the target data 121B includes mechanical noise, the control unit 11 may, as the output process, transmit an e-mail alerting the portable terminal of the manager of the machine.
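The predetermined output process can be sketched as a dispatch on the determination result. This is purely illustrative: `output_result` and the `notify` callback are hypothetical names, with `notify` standing in for whatever alerting mechanism (such as an e-mail client addressing the manager's portable terminal) the embodiment uses:

```python
def output_result(result: str, notify) -> str:
    """Output processing corresponding to step S109.

    result is "first" when the first feature (e.g. mechanical noise
    being included) was determined to appear; notify is a
    caller-supplied callback, e.g. one that sends an alert e-mail.
    """
    if result == "first":
        notify("Mechanical noise detected: please inspect the machine.")
    return result
```

In a real system the callback would wrap the mail transfer client; here any callable accepting one string works.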
  • it is thus possible to prevent the reliability of the process for determining the feature appearing in the target data from being impaired.
  • the first output value of the first discriminator 5B and the second output value of the second discriminator 6B may each indicate the probability that a predetermined feature appears in the target data 121B.
  • in this case, the numerical range of the first determination criterion may be composed of a range equal to or greater than the first threshold and a range less than the first threshold, and the numerical range of the second determination criterion may be composed of a range equal to or greater than the second threshold.
  • in place of the processing in steps S103 to S106, the control unit 11 of the identification device 1B may compare the second output value acquired from the second discriminator 6B with the second threshold to determine whether the second output value is larger than the second threshold. If it is determined that the second output value is larger than the second threshold, the control unit 11 may regard the second output value as included in the numerical range of the second determination criterion and determine that the predetermined feature appears in the target data 121B. On the other hand, if it is determined that the second output value is smaller than the second threshold, the control unit 11 may regard the second output value as not included in the numerical range of the second determination criterion and proceed to step S107. The handling of the case where the second output value is equal to the second threshold may be selected as appropriate according to the embodiment.
  • the control unit 11 may acquire the first output value from the first discriminator 5B by inputting the target data 121B to the first discriminator 5B.
  • the control unit 11 may then compare the acquired first output value with the first threshold to determine whether it is larger than the first threshold. When it is determined that the first output value is larger than the first threshold, the control unit 11 may determine that the predetermined feature appears in the target data 121B. On the other hand, when it is determined that the first output value is smaller than the first threshold, the control unit 11 may determine that the predetermined feature does not appear in the target data 121B, or may determine that it is unknown whether the predetermined feature appears in the target data 121B. The handling of the case where the first output value is equal to the first threshold may be selected as appropriate according to the embodiment.
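The probability-based variant described above simplifies to a single second threshold with no lower band. As before, this is an illustrative sketch with assumed names, and an output exactly at a threshold is routed conservatively to the next check:

```python
def decide_feature_prob(target_data, first_disc, second_disc,
                        t1: float, t2: float) -> str:
    """Probability variant: each output value is the probability that
    the predetermined feature appears in the target data.

    The second criterion is the single range at or above t2; below it,
    the decision is deferred to the first discriminator, whose output
    below t1 may mean "absent" or "unknown" depending on the embodiment.
    """
    v2 = second_disc(target_data)
    if v2 > t2:
        return "present"
    v1 = first_disc(target_data)
    return "present" if v1 > t1 else "absent_or_unknown"
```

With t1 = 0.5 and t2 = 0.9, a second output of 0.95 suffices on its own, while 0.4 hands the decision to the first discriminator.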

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The reliability of a determination that uses a discriminator is prevented from being impaired even when the discrimination performance of the discriminator deteriorates due to re-learning or additional learning. An inspection device according to one aspect of the present invention holds both a first discriminator from before re-learning or additional learning and a second discriminator from after re-learning or additional learning. The numerical range of a second criterion for the second discriminator is set to be narrower than the numerical range of a first criterion for the first discriminator. When a second output value of the second discriminator satisfies the second criterion, the inspection device determines the acceptability of a product based on the second output value of the second discriminator. However, when the second output value of the second discriminator does not satisfy the second criterion, the inspection device determines the acceptability of the product based on a first output value of the first discriminator.
PCT/JP2019/010180 2018-03-13 2019-03-13 Inspection device, image discrimination device, discrimination device, inspection method, and inspection program WO2019176990A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-045862 2018-03-13
JP2018045862A JP6844563B2 (ja) Inspection device, image identification device, identification device, inspection method, and inspection program

Publications (1)

Publication Number Publication Date
WO2019176990A1 true WO2019176990A1 (fr) 2019-09-19

Family

ID=67906786

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010180 WO2019176990A1 (fr) Inspection device, image discrimination device, discrimination device, inspection method, and inspection program

Country Status (2)

Country Link
JP (1) JP6844563B2 (fr)
WO (1) WO2019176990A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7323177B2 (ja) * 2019-12-17 2023-08-08 株式会社 システムスクエア Inspection system, inspection device, learning device, and program
JP7016179B2 (ja) * 2020-02-21 2022-02-04 株式会社 システムスクエア Inspection device and program
JP7096296B2 (ja) * 2020-07-30 2022-07-05 楽天グループ株式会社 Information processing device, information processing method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011257805A (ja) * 2010-06-04 2011-12-22 Sony Corp Information processing device and method, and program
JP2015032119A (ja) * 2013-08-01 2015-02-16 セコム株式会社 Object detection device
JP2017058833A (ja) * 2015-09-15 2017-03-23 キヤノン株式会社 Object identification device, object identification method, and program
JP2018032071A (ja) * 2016-08-22 2018-03-01 株式会社クレスコ Verification device, verification method, and verification program


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI721632B (zh) * 2019-11-05 2021-03-11 新加坡商鴻運科股份有限公司 Product inspection threshold setting device and method, and computer-readable storage medium
US20220091576A1 (en) * 2020-09-24 2022-03-24 International Business Machines Corporation Detection of defect in edge device manufacturing by artificial intelligence
US11493901B2 (en) * 2020-09-24 2022-11-08 International Business Machines Corporation Detection of defect in edge device manufacturing by artificial intelligence
WO2022215446A1 (fr) * 2021-04-05 2022-10-13 パナソニックIpマネジメント株式会社 Image determination device, image determination method, and program
CN114228008A (zh) * 2021-12-20 2022-03-25 深圳市友联精诚塑胶制品有限公司 Plastic molding method and system
CN114228008B (zh) * 2021-12-20 2023-08-11 深圳市友联精诚塑胶制品有限公司 Plastic molding method and system

Also Published As

Publication number Publication date
JP6844563B2 (ja) 2021-03-17
JP2019159820A (ja) 2019-09-19

Similar Documents

Publication Publication Date Title
WO2019176990A1 (fr) Inspection device, image discrimination device, discrimination device, inspection method, and inspection program
US11715190B2 (en) Inspection system, image discrimination system, discrimination system, discriminator generation system, and learning data generation device
CN110619618A (zh) Surface defect detection method, apparatus, and electronic device
JP7059883B2 (ja) Learning device, image generation device, learning method, and learning program
JP2016085704A (ja) Information processing system, information processing device, information processing method, and program
CN113095438A (zh) Wafer defect classification method, device, system, electronic apparatus, and storage medium
CN113557536B (zh) Learning system, data generation device, data generation method, and storage medium
WO2019176989A1 (fr) Inspection system, discrimination system, and learning data generator
WO2019176988A1 (fr) Inspection system, identification system, and identification apparatus evaluation device
US20220405586A1 (en) Model generation apparatus, estimation apparatus, model generation method, and computer-readable storage medium storing a model generation program
CN112633461A (zh) Application assistance system and method, and computer-readable recording medium
JP2020135051A (ja) Defect inspection device, defect inspection method, defect inspection program, learning device, and trained model
JP2020042668A (ja) Inspection device and machine learning method
JP7059889B2 (ja) Learning device, image generation device, learning method, and learning program
US11120541B2 (en) Determination device and determining method thereof
JP7070308B2 (ja) Estimator generation device, inspection device, estimator generation method, and estimator generation program
CN113837173A (zh) Target object detection method, apparatus, computer device, and storage medium
JP2021174194A (ja) Learning data processing device, learning device, learning data processing method, and program
US20230274409A1 (en) Method for automatic quality inspection of an aeronautical part
US20230005120A1 (en) Computer and Visual Inspection Method
JP7459696B2 (ja) Anomaly detection system, learning device, anomaly detection program, learning program, anomaly detection method, and learning method
WO2022215446A1 (fr) Image determination device, image determination method, and program
US20240193460A1 (en) Data processing method and data processing apparatus
JP2010191564A (ja) Characteristic analysis method and device, characteristic classification method and device, program for causing a computer to execute these methods, and computer-readable recording medium recording the program
US20210004954A1 (en) Neural network-type image processing device, appearance inspection apparatus and appearance inspection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19767335

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19767335

Country of ref document: EP

Kind code of ref document: A1