US20220084182A1 - Control method for controlling system and system
- Publication number
- US20220084182A1 (U.S. application Ser. No. 17/533,907)
- Authority
- US
- United States
- Prior art keywords
- workpiece
- image data
- imaging apparatus
- learning model
- imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/93—Detection standards; Calibrating baseline adjustment, drift correction
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/0265—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G05B19/41875—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by quality surveillance of production
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H04N5/2353—
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32193—Ann, neural base quality management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10144—Varying exposure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Abstract
A control method for controlling a system including an imaging apparatus and a processing apparatus having a learning model to which image data is input includes capturing a first workpiece using the imaging apparatus set to a first imaging condition, thereby obtaining first image data, performing machine learning on the learning model using the first image data as supervised data, obtaining second image data using the imaging apparatus set to the first imaging condition, inputting the second image data to the trained learning model and making an estimation regarding a second workpiece based on the second image data, in a case where an accuracy of the estimation is lower than a predetermined value, obtaining third image data using the imaging apparatus set to a second imaging condition different from the first imaging condition, and performing machine learning on the learning model using the third image data as the supervised data.
Description
- This application is a Continuation of International Patent Application No. PCT/JP2020/020449, filed May 25, 2020, which claims the benefit of Japanese Patent Application No. 2019-101750, filed May 30, 2019, both of which are hereby incorporated by reference herein in their entirety.
- The present invention relates to a control method for controlling a system, and to a system.
- A system that recognizes a product (a workpiece) produced in a factory is known. One example is an inspection system in which, in the external appearance inspection of a workpiece, the presence or absence of a defect is determined by a machine based on image data acquired by an imaging apparatus, instead of by a person visually checking the image data. Japanese Patent Application Laid-Open No. 2018-164272 discusses an inspection system in which an imaging apparatus captures an image of a workpiece, the image data obtained by the imaging apparatus is input to a processing apparatus including artificial intelligence, and the processing apparatus inspects for a defect.
- In a case where an inspection is performed using artificial intelligence, as in the inspection system discussed in Japanese Patent Application Laid-Open No. 2018-164272, a trained model is generated. For example, in the external appearance inspection of a workpiece produced in a factory, a workpiece moving down a production line may be inspected. Because the workpiece is moving at high speed, if the shutter speed is too slow, the workpiece in the image data may be blurred. If a trained model is created based on such image data, the accuracy with which the artificial intelligence recognizes a workpiece may decrease. The accuracy of the recognition of a workpiece is, for example, the accuracy with which the presence or absence of a defect is distinguished. It is also possible to obtain image data by increasing the shutter speed. However, a fast shutter speed requires a predetermined amount of light, which may limit the image capturing location. Japanese Patent Application Laid-Open No. 2018-164272 does not consider such a decrease in the accuracy of the recognition of a workpiece due to the imaging condition under which the image data for generating a trained model is obtained.
- PTL 1: Japanese Patent Application Laid-Open No. 2018-164272
- According to an aspect of the present invention, a control method is provided for controlling a system including an imaging apparatus and a processing apparatus having a learning model to which image data captured by the imaging apparatus is input. The control method includes capturing a first workpiece using the imaging apparatus set to a first imaging condition while changing a position of the first workpiece relative to the imaging apparatus, thereby obtaining first image data, performing machine learning on the learning model using the first image data as supervised data, capturing a second workpiece different from the first workpiece using the imaging apparatus set to the first imaging condition while changing a position of the second workpiece relative to the imaging apparatus, thereby obtaining second image data, inputting the second image data to the trained learning model and making an estimation regarding the second workpiece based on the second image data, in a case where an accuracy of the estimation is lower than a predetermined value, capturing a third workpiece using the imaging apparatus set to a second imaging condition different from the first imaging condition while changing a position of the third workpiece relative to the imaging apparatus, thereby obtaining third image data, and performing machine learning on the learning model using the third image data as the supervised data.
- According to another aspect of the present invention, a system includes an imaging apparatus, and a processing apparatus having a learning model to which image data captured by the imaging apparatus is input, the processing apparatus being configured to perform machine learning on the learning model using, as supervised data, first image data obtained by capturing a first workpiece using the imaging apparatus set to a first imaging condition while changing a position of the first workpiece relative to the imaging apparatus, input, to the trained learning model, second image data obtained by capturing a second workpiece different from the first workpiece using the imaging apparatus set to the first imaging condition while changing a position of the second workpiece relative to the imaging apparatus, and make an estimation regarding the second workpiece based on the second image data, and, in a case where an accuracy of the estimation is lower than a predetermined value, perform machine learning on the learning model using, as the supervised data, third image data obtained by capturing a third workpiece using the imaging apparatus set to a second imaging condition different from the first imaging condition while changing a position of the third workpiece relative to the imaging apparatus.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIG. 1 is a diagram illustrating an overview of a system.
- FIG. 2 is a diagram illustrating a processing flow of the system.
- FIGS. 3A and 3B are diagrams illustrating a learning model used in the system.
- FIGS. 4A to 4D are diagrams illustrating a concept of a method for generating a trained model.
- FIG. 5 is a diagram illustrating a processing flow of the method for generating a trained model.
- The exemplary embodiments illustrated below merely make the technical idea of the present invention specific, and do not limit the present invention. The sizes of and the positional relationships between members illustrated in the drawings are occasionally exaggerated to clarify the description. In the following description, similar components are designated by the same reference numbers and are not described repeatedly.
- FIG. 1 illustrates the basic configuration of an inspection system as an example of a system. The system according to the present exemplary embodiment can be applied to various systems in addition to the inspection system. Examples of the various systems include a system that identifies whether a particular object is present in image data, and an automatic sorting system in a delivery center.
- With reference to the processing flow illustrated in FIG. 2, the inspection system according to the present exemplary embodiment will be described below.
- First, in step S1 in FIG. 2, the presence or absence of a workpiece (a target object) in a predetermined range is detected using a sensor 10. The sensor 10 is, for example, a sensor for detecting a workpiece moving at high speed on a production line. As the sensor 10, for example, an infrared sensor is used. If the sensor 10 detects a workpiece in the predetermined range, the sensor 10 outputs a signal to a trigger generation circuit 20. In step S2 in FIG. 2, the trigger generation circuit 20 generates an image capturing trigger signal based on the signal from the sensor 10.
trigger generation circuit 20 is composed of a logic circuit such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). Thetrigger generation circuit 20 performs hardware processing on the signal input from thesensor 10 and transmits a trigger signal for capturing an image that is generated by the hardware processing to animaging apparatus 30. Then, in step S3 inFIG. 2 , the workpiece is captured using theimaging apparatus 30. - According to the present exemplary embodiment, the
trigger generation circuit 20 is composed of a logic circuit and performs parallel processing by hardware processing. Then, a signal from thetrigger generation circuit 20 is input to theimaging apparatus 30 without performing software processing. Thus, unnecessary delay is less likely to occur than in software processing in which processing is sequentially performed. It is desirable that thetrigger generation circuit 20 should transmit the trigger signal to theimaging apparatus 30 by wire. - The
imaging apparatus 30 includes a lens unit, an image sensor, a signal processing unit that processes a signal output from the image sensor, an output unit that outputs image data generated by the signal processing unit, and an input unit to which the trigger signal is input. - If the trigger signal is input to the input unit, the
imaging apparatus 30 starts capturing an image. The lens unit is provided to be detachable from the imaging apparatus. Thus, an appropriate lens unit can be selected according to the size of the target object or the image capturing scene. The signal processing unit generates image data based on a signal output from the image sensor. The output unit outputs the image data generated by the signal processing unit. - The image sensor is an element in which photoelectric conversion units are arranged in an array, and for example, is a complementary metal-oxide-semiconductor (CMOS) sensor. The image sensor may be a rolling shutter image sensor in which an exposure period starts and ends at different times in each row, or may be a global electronic shutter image sensor in which an exposure period starts and ends at the same times in all the rows.
- In the present exemplary embodiment, it is assumed that the system is used to inspect a defect in a product (a workpiece) produced on a production line. Thus, to capture a workpiece moving at high speed with higher accuracy, it is desirable to use the global electronic shutter image sensor.
- In step S4 in
FIG. 2 , the image data output from theimaging apparatus 30 is input to aprocessing apparatus 40 and estimated by theprocessing apparatus 40. Theprocessing apparatus 40 makes an estimation regarding the target object in the image data. In the estimation regarding the target object in the image data, a process according to the purpose of the system is performed. A description will be given below of, as the estimation process, an external appearance inspection process for determining whether the workpiece as the image capturing target has a defect. Besides, for example, in the case of the system that identifies whether a particular object is present in image data, the identifying process corresponds to the estimation process. In the case of the automatic sorting system, the process of distinguishing a workpiece according to the size of the workpiece corresponds to the estimation process. It is desirable that theimaging apparatus 30 should transmit the image data to theprocessing apparatus 40 by wire. - The
processing apparatus 40 has a trained model. Using the trained model, theprocessing apparatus 40 determines whether the workpiece has a defect. A graphics processing unit (GPU) 42 can make efficient calculations by performing parallel processing on more data. Thus, in a case where learning is performed multiple times using a learning model as in deep learning, the execution of processing by theGPU 42 is effective. In the present exemplary embodiment, theGPU 42 is used in addition to a central processing unit (CPU) 41 in the processing of theprocessing apparatus 40. Specifically, in a case where a learning program including a learning model is executed, theCPU 41 and theGPU 42 cooperatively make calculations, thereby performing learning. In the processing of theprocessing apparatus 40, only theCPU 41 or theGPU 42 may make calculations. - Each of the
CPU 41 and theGPU 42 includes a memory, and the image data output from theimaging apparatus 30 is held in these memories. As described above, thetrigger generation circuit 20 inputs trigger signals to theimaging apparatus 30 at the same clock time. Thus, pieces of image data at the same clock time are held in the memories of theCPU 41 and theGPU 42. Theprocessing apparatus 40 may include a main memory different from the memory of theCPU 41 and the memory of theGPU 42. In this case, the pieces of image data are held in the main memory. Then, the pieces of image data held in the main memory are written to the memory of theCPU 41 and the memory of theGPU 42 as needed. - The
GPU 42 accesses the pieces of image data held in the memories and processes the pieces of image data in parallel. Using the trained model, theGPU 42 determines whether the workpiece has a defect. TheGPU 42 is more suitable for performing an enormous typical calculation process than theCPU 41 is, and theGPU 42 can quickly perform the process of determining the presence or absence of a defect based on image data of the workpiece. - In step S5 in
FIG. 2 , based on the image data acquired by theimaging apparatus 30, theprocessing apparatus 40 determines whether a defect is present in the region of the image data. The determination result of theprocessing apparatus 40 is output to a programmable logic controller (PLC) 50. If the final determination result indicates that the workpiece has a defect, then in step S6 inFIG. 2 , thePLC 50 inputs a signal for operation control to arobot 60. Therobot 60 switches the operation of moving the workpiece and moves the workpiece that has a defect according to the determination from the production line. - When the
processing apparatus 40 outputs the determination result to thePLC 50, signal transmission at high speed is not required. Thus, a signal can be transferred by wire or wirelessly based on a general-purpose standard such as Ethernet. -
FIGS. 3A and 3B are diagrams illustrating artificial intelligence (AI) that determines a defect in a workpiece in theGPU 42 of theprocessing apparatus 40. -
FIG. 3A is a conceptual diagram of a learning phase. Alearning model 420 has an algorithm for a defect determination, and supervised data is input to thelearning model 420. The supervised data is pieces of image data obtained by capturing a workpiece as a non-defective product, a workpiece including a first defect, and a workpiece including a second defect different from the first defect moved down the production line. That is, as the supervised data, L pieces ofnon-defective product images 411, M pieces offirst defect images 412, and N pieces ofsecond defect images 413 are input. A trainedmodel 430 is obtained by AI performing learning so that the algorithm of thelearning model 420 is an algorithm with higher accuracy. - As a specific algorithm of machine learning, a nearest neighbor algorithm, a Naive Bayes algorithm, a decision tree, or a support-vector machine may be used. Alternatively, deep learning may be used in which AI itself generates a feature amount to be learned and a connection weight coefficient, using a neural network. For example, as a model for deep learning, a convolutional neural network (CNN) model may be used.
-
FIG. 3B is a conceptual diagram of an estimation phase. If aworkpiece image 440 is input to the trainedmodel 430 constructed in the learning phase, adetermination result 450 of the presence or absence of a defect is output from the trainedmodel 430. This process using the trainedmodel 430 is executed by theGPU 42. Specifically, theworkpiece image 440 is image data obtained by theimaging apparatus 30 capturing a workpiece moving down the production line. - Next, with reference to
FIGS. 4 and 5 , a method for generating trained data will be described. - As described above, workpieces used to generate supervised data are a workpiece as a non-defective product, a workpiece including the first defect, and a workpiece including the second defect, and the supervised data is pieces of image data obtained by capturing these workpieces.
- In the present exemplary embodiment, the shutter speed is changed between a first imaging condition and a second imaging condition. Specifically, the step of creating a trained model using, as the supervised data, the pieces of image data obtained by capturing the above workpieces at a certain shutter speed is repeatedly performed while changing the shutter speed until the accuracy of the estimation based on the trained model is a predetermined accuracy or higher. For example, as illustrated in
FIG. 4B , if the shutter speed is too slow, a defect in a workpiece becomes blurred, and therefore, the presence or absence of a defect cannot be distinguished. Thus, even if these pieces of image data are input as a supervised model to the learning model, it is difficult to recognize the presence or absence of a defect based on a constructed trained model. If the shutter speed is too fast, then as illustrated inFIG. 4C , a defect can be determined, but the amount of light of illumination needs to be increased, and versatility decreases. For example, an image capturing location may be limited, such as a case where the system is unavailable in a case where an inspection needs to be performed in a dark place. Additionally, an amount of light greater than or equal to a certain amount is required, and therefore, expensive illumination may be required. In the present exemplary embodiment, a trained model is generated using, as the supervised data, the pieces of image data obtained by capturing these workpieces under a certain imaging condition. If the determined accuracy of the trained model is lower than a predetermined value, a trained model is generated using, as the supervised data, pieces of image data captured by changing the imaging condition. Then, the generation of a supervised data and the determination of the accuracy of the trained model using the generated supervised data are repeatedly performed until the accuracy is greater than or equal to the predetermined value. Then, a trained model the accuracy of which is greater than or equal to the predetermined value is used in the inspection system. This can heighten the accuracy of an estimation process regarding a workpiece in image data. - A method for generating the trained
model 430 will be specifically described below. - First, in step S1 in
FIG. 5 , the shutter speed of theimaging apparatus 30 is set to a predetermined shutter speed (a first shutter speed). It is desirable that the first shutter speed should be as slow as possible. As described above, if the shutter speed is fast, versatility decreases. In step S4 where the accuracy of a trained model based on supervised data generated at the first shutter speed is determined, if it is determined that the accuracy is a predetermined accuracy, this may lead to excessive performance. The first shutter speed is made slow, whereby it is possible to prevent excessive performance. For example, the first shutter speed is set to 8 ms or more and 20 ms or less. - Next, in step S2 in
FIG. 5 , the production line on which workpieces are placed is activated, and pieces of image data are created by capturing a plurality of workpieces while changing the relative positions of the workpieces and theimaging apparatus 30, thereby creating a trained model using the pieces of image data as supervised data. At this time, as the plurality of workpieces, at least a workpiece including a defect 1 and a workpiece as a non-defective product are used. If the plurality of workpieces further includes a workpiece including adefect 2 different from the defect 1, it is possible to improve the accuracy of a quality determination. - As the supervised data, pieces of image data obtained by capturing the workpieces while changing the XY coordinates of the workpieces or pieces of image data obtained by capturing the workpieces while rotating the workpieces may be further used. These pieces of image data are input as the supervised data to the learning model, whereby it is possible to make a quality determination without decreasing the accuracy even in a case where the workpieces are shifted from predetermined positions.
- Next, in step S3 in
FIG. 5 , a plurality of workpieces is captured using theimaging apparatus 30 set to the first shutter speed. Then, in step S4 inFIG. 5 , a quality determination is made on obtained pieces of image data, thereby determining whether the accuracy of the trained model is a predetermined accuracy. - If the accuracy determined in step S4 is lower than the predetermined accuracy, the processing returns to step S1. In step S1, the shutter speed is adjusted. In a case where workpieces are captured after the processing returns to step S1, the same workpieces as or different workpieces from the workpieces captured at the first shutter speed may be used. If the accuracy determined in step S4 is the predetermined accuracy or higher, this trained model is constructed by setting the shutter speed of the
imaging apparatus 30 to a corresponding shutter speed, and a recognition process on a plurality of workpieces is performed using the constructed trained model. If the accuracy determined in step S4 is lower than the predetermined accuracy, the shutter speed is changed to a second shutter speed, and steps S1 to S4 are repeatedly performed until the accuracy is the predetermined accuracy or higher. - In the present exemplary embodiment, pieces of image data captured at a shutter speed that can maintain a certain recognition accuracy are input as supervised data, thereby constructing a trained model. This can heighten the accuracy of a recognition process on a workpiece.
- The first and second imaging conditions may be differentiated by changing a known condition, instead of changing the shutter speed.
- Although the
processing apparatus 40 generates a trained model inFIG. 1 , a generated trained model may be input to theprocessing apparatus 40. For example, image data captured by theimaging apparatus 30 may be transmitted to an information terminal, the information terminal may generate a trained model, and a trained model having a predetermined accuracy or higher may be input to theprocessing apparatus 40. The information terminal is, for example, a computer such as a personal computer. - The present invention is not limited to the above exemplary embodiments, and can be changed and modified in various ways without departing from the spirit and the scope of the present invention. Thus, the following claims are appended to publicize the scope of the present invention.
- Based on a control method for controlling a system and a system according to the present invention, it is possible to improve the accuracy of an estimation regarding a workpiece in image data.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Claims (20)
1. A control method for controlling a system including an imaging apparatus and a processing apparatus having a learning model to which image data captured by the imaging apparatus is input, the control method comprising:
capturing a first workpiece using the imaging apparatus set to a first imaging condition while changing a position of the first workpiece relative to the imaging apparatus, thereby obtaining first image data;
performing machine learning on the learning model using the first image data as supervised data;
capturing a second workpiece different from the first workpiece using the imaging apparatus set to the first imaging condition while changing a position of the second workpiece relative to the imaging apparatus, thereby obtaining second image data;
inputting the second image data to the trained learning model and making an estimation regarding the second workpiece based on the second image data;
in a case where an accuracy of the estimation is lower than a predetermined value, capturing a third workpiece using the imaging apparatus set to a second imaging condition different from the first imaging condition while changing a position of the third workpiece relative to the imaging apparatus, thereby obtaining third image data; and
performing machine learning on the learning model using the third image data as the supervised data.
2. The control method according to claim 1, wherein the imaging apparatus includes a global electronic shutter image sensor.
3. The control method according to claim 1, wherein the first and second imaging conditions are shutter speeds.
4. The control method according to claim 3, wherein the shutter speed in the first imaging condition is slower than the shutter speed in the second imaging condition.
5. The control method according to claim 1, wherein in the estimation, the learning model outputs information indicating whether the second workpiece in the input second image data has a defect.
6. The control method according to claim 5, wherein, as the supervised data, image data of a workpiece as a non-defective product and image data of a workpiece including a defect are used.
7. The control method according to claim 6, wherein, as the image data of the workpiece including the defect, image data of a workpiece including a first defect and image data of a workpiece including a second defect different from the first defect are used.
8. The control method according to claim 1, further comprising:
after the machine learning is performed on the learning model using the third image data obtained by the imaging apparatus set to the second imaging condition, capturing a fourth workpiece different from the third workpiece using the imaging apparatus set to the second imaging condition while changing a position of the fourth workpiece relative to the imaging apparatus, thereby obtaining fourth image data; and
inputting the fourth image data to the trained learning model and making an estimation regarding the fourth workpiece based on the fourth image data.
9. The control method according to claim 1,
wherein the processing apparatus includes a graphics processing unit (GPU), and
wherein the GPU makes the estimation.
10. The control method according to claim 9,
wherein the GPU includes a memory, and
wherein the memory holds the image data output from the imaging apparatus.
11. The control method according to claim 1,
wherein the system includes a sensor and a trigger generation circuit configured to transmit an image capturing trigger signal to the imaging apparatus,
wherein, in the obtaining of the first image data, in a case where the sensor detects the first workpiece in a predetermined range, the sensor outputs a signal to the trigger generation circuit, and
wherein the imaging apparatus captures the first workpiece based on the image capturing trigger signal output based on the signal.
12. The control method according to claim 11, wherein the trigger generation circuit is composed of a logic circuit and transmits the image capturing trigger signal to the imaging apparatus without performing software processing.
13. The control method according to claim 1, wherein the learning model uses a nearest neighbor algorithm, a Naive Bayes algorithm, a decision tree, a support-vector machine, or a neural network.
14. A system comprising:
an imaging apparatus; and
a processing apparatus having a learning model to which image data captured by the imaging apparatus is input, the processing apparatus configured to:
perform machine learning on the learning model using, as supervised data, first image data obtained by capturing a first workpiece using the imaging apparatus set to a first imaging condition while changing a position of the first workpiece relative to the imaging apparatus;
input, to the trained learning model, second image data obtained by capturing a second workpiece different from the first workpiece using the imaging apparatus set to the first imaging condition while changing a position of the second workpiece relative to the imaging apparatus and make an estimation regarding the second workpiece based on the second image data; and
in a case where an accuracy of the estimation is lower than a predetermined value, perform machine learning on the learning model using, as the supervised data, third image data obtained by capturing a third workpiece using the imaging apparatus set to a second imaging condition different from the first imaging condition while changing a position of the third workpiece relative to the imaging apparatus.
15. The system according to claim 14, wherein the first and second imaging conditions are shutter speeds.
16. The system according to claim 14, further comprising:
a sensor; and
a trigger generation circuit configured to transmit an image capturing trigger signal to the imaging apparatus,
wherein, in a case where the sensor detects that the first workpiece is present in a predetermined range, the sensor outputs a signal to the trigger generation circuit, and
wherein the imaging apparatus captures the first workpiece based on the image capturing trigger signal output from the trigger generation circuit based on the signal.
17. The system according to claim 16, wherein the trained learning model outputs information indicating whether the second workpiece in the input second image data has a defect.
18. The system according to claim 17, further comprising a robot,
wherein the robot moves the second workpiece that has a defect according to a determination of the trained learning model.
19. The system according to claim 17, further comprising a programmable logic controller (PLC) to which information indicating whether the second workpiece in the second image data has a defect is transmitted from the trained learning model,
wherein the processing apparatus and the PLC are connected together wirelessly, and
wherein the sensor and the trigger generation circuit are connected together by wire.
20. The system according to claim 17, further comprising a PLC to which information indicating whether the second workpiece in the second image data has a defect is transmitted from the trained learning model,
wherein a transfer speed of a signal from the processing apparatus to the PLC is slower than a transfer speed of a signal from the sensor to the trigger generation circuit.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-101750 | 2019-05-30 | ||
JP2019101750A JP7267841B2 (en) | 2019-05-30 | 2019-05-30 | System control method and system |
PCT/JP2020/020449 WO2020241540A1 (en) | 2019-05-30 | 2020-05-25 | System control method and system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/020449 Continuation WO2020241540A1 (en) | 2019-05-30 | 2020-05-25 | System control method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220084182A1 (en) | 2022-03-17 |
Family
ID=73553494
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/533,907 Pending US20220084182A1 (en) | 2019-05-30 | 2021-11-23 | Control method for controlling system and system |
Country Status (3)
Country | Link |
---|---|
US (1) | US20220084182A1 (en) |
JP (1) | JP7267841B2 (en) |
WO (1) | WO2020241540A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11783474B1 (en) * | 2021-11-15 | 2023-10-10 | Changzhou Microintelligence Co., Ltd. | Defective picture generation method and apparatus applied to industrial quality inspection |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024024283A1 (en) * | 2022-07-29 | 2024-02-01 | 株式会社Jvcケンウッド | Image recognition assistance device, method, and program |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050276459A1 (en) * | 2004-06-09 | 2005-12-15 | Andrew Eames | Method and apparatus for configuring and testing a machine vision detector |
US20050275834A1 (en) * | 2004-06-09 | 2005-12-15 | Silver William M | Method and apparatus for locating objects |
US20050275728A1 (en) * | 2004-06-09 | 2005-12-15 | Mirtich Brian V | Method for setting parameters of a vision detector using production line information |
US20090147120A1 (en) * | 2007-12-07 | 2009-06-11 | Seiko Epson Corporation | Image sensor, image taking apparatus, and state inspection system |
US20130307977A1 (en) * | 2011-02-15 | 2013-11-21 | Datalogic Ip Tech S.R.L. | Method for image acquisition |
US20160335778A1 (en) * | 2015-04-13 | 2016-11-17 | Gerard Dirk Smits | Machine vision for ego-motion, segmenting, and classifying objects |
US20180211373A1 (en) * | 2017-01-20 | 2018-07-26 | Aquifi, Inc. | Systems and methods for defect detection |
US20180343385A1 (en) * | 2017-05-25 | 2018-11-29 | Canon Kabushiki Kaisha | Image capturing apparatus, system, and method |
US20180348146A1 (en) * | 2017-05-31 | 2018-12-06 | Keyence Corporation | Image Inspection Apparatus |
US20190210159A1 (en) * | 2016-06-28 | 2019-07-11 | Hitachi, Ltd. | Welding monitoring system |
US20190293409A1 (en) * | 2018-03-22 | 2019-09-26 | Keyence Corporation | Image Processing Apparatus |
US20200334801A1 (en) * | 2017-12-06 | 2020-10-22 | Nec Corporation | Learning device, inspection system, learning method, inspection method, and program |
US20210289604A1 (en) * | 2016-11-11 | 2021-09-16 | Omron Corporation | Illumination device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5923723B2 (en) | 2011-06-02 | 2016-05-25 | パナソニックIpマネジメント株式会社 | Person attribute estimation system, person attribute estimation apparatus, and person attribute estimation method |
JP6809891B2 (en) | 2016-12-15 | 2021-01-06 | 株式会社Fuji | Image processing system and image processing method |
JP6668278B2 (en) | 2017-02-20 | 2020-03-18 | 株式会社日立ハイテク | Sample observation device and sample observation method |
2019
- 2019-05-30 JP JP2019101750A patent/JP7267841B2/en active Active
2020
- 2020-05-25 WO PCT/JP2020/020449 patent/WO2020241540A1/en active Application Filing
2021
- 2021-11-23 US US17/533,907 patent/US20220084182A1/en active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11783474B1 (en) * | 2021-11-15 | 2023-10-10 | Changzhou Microintelligence Co., Ltd. | Defective picture generation method and apparatus applied to industrial quality inspection |
US20230326010A1 (en) * | 2021-11-15 | 2023-10-12 | Changzhou Microintelligence Co., Ltd. | Defective picture generation method and apparatus applied to industrial quality inspection |
Also Published As
Publication number | Publication date |
---|---|
WO2020241540A1 (en) | 2020-12-03 |
JP2020198471A (en) | 2020-12-10 |
JP7267841B2 (en) | 2023-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220084182A1 (en) | Control method for controlling system and system | |
US10776911B2 (en) | Information processing apparatus, identification system, setting method, and program | |
US10040199B2 (en) | Apparatus and method for determining work to be picked | |
JP7074460B2 (en) | Image inspection equipment and methods | |
JP6955211B2 (en) | Identification device, identification method and program | |
US10699400B2 (en) | Image processing apparatus, image processing method, and storage medium | |
KR101941585B1 (en) | Embedded system for examination based on artificial intelligence thereof | |
JP2011163766A (en) | Image processing method and image processing system | |
US11521312B2 (en) | Image processing apparatus, image processing method, and storage medium | |
CN111213045A (en) | Automatic defect classification | |
US20220284567A1 (en) | Teacher data generation method, trained learning model, and system | |
US20220067917A1 (en) | Imaging system | |
JP7058324B2 (en) | Inspection equipment, inspection methods, learning methods, and programs | |
JP2021174456A (en) | Abnormality determination method and abnormality determination device | |
WO2021181749A1 (en) | Learning device, image inspection device, learned parameter, learning method, and image inspection method | |
JP2022055953A (en) | Defect classification device, defect classification method and program | |
JP2021179321A (en) | Status management method, program, and status management system | |
WO2020213194A1 (en) | Display control system and display control method | |
KR20200088682A (en) | Electronic apparatus and controlling method thereof | |
JP2021061546A (en) | Imaging apparatus, control method of the same, and program | |
US20240005477A1 (en) | Index selection device, information processing device, information processing system, inspection device, inspection system, index selection method, and index selection program | |
CN111050088B (en) | Mechanism to calibrate imaging brightness of camera for detecting die defects | |
US11631194B2 (en) | Image processing apparatus that performs recognition processing, control method thereof, and storage medium | |
JP5778685B2 (en) | System and method for alignment and inspection of ball grid array devices | |
TWI777696B (en) | Defect detection method and defect detection system using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OHYA, TAKERU;SHINGAI, SATORU;REEL/FRAME:058414/0527 Effective date: 20211115 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |