WO2021221621A1 - Adjustment of manufacturing parameters based on inspection


Info

Publication number
WO2021221621A1
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
defects
images
inspection
parameter
Application number
PCT/US2020/030346
Other languages
English (en)
Inventor
Xiaoliang Zhu
Subrata Kumar KUNDU
Naveen Kumar Bangalore Ramaiah
Original Assignee
Hitachi America, Ltd.
Application filed by Hitachi America, Ltd.
Priority to PCT/US2020/030346
Publication of WO2021221621A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8883Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges involving the calculation of gauges, generating models
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/31From computer integrated manufacturing till monitoring
    • G05B2219/31447Process error event detection and continuous process image detection, storage
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35502Display picture, image of place of error
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • G05B23/0205Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0259Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterized by the response to fault detection
    • G05B23/0275Fault isolation and identification, e.g. classify fault; estimate cause or root of failure
    • G05B23/0281Quantitative, e.g. mathematical distance; Clustering; Neural networks; Statistical analysis
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • This disclosure relates to the technical fields of inspecting products and controlling manufacturing equipment.
  • a vehicle suspension system may include a suspension strut that includes a suspension rod with an attached piston, an oil seal, absorber oil, and a structural cylinder. Vibrations and shocks from the road may be transferred via the suspension rod (moving in and out of the structural cylinder) and dissipated in the absorber oil.
  • defects on the surface of the suspension rod such as small dents, nodules and/or scratches, may damage the oil seal, which can lead to oil leakage, noise and eventual failure of the suspension system.
  • the surface quality of the suspension rod is inspected before assembling the suspension rod with the other strut components.
  • the most common technique for inspecting the surface of a suspension rod is visual inspection by workers.
  • this technique may be subjective, may have low accuracy, may have a long cycle time, and may have a large error rate.
  • Some implementations include a computing device that is configured to receive inspection information including a plurality of images of surfaces of inspected products including defects on the surfaces of the inspected products.
  • the computing device may recognize, from the plurality of images, a pattern in the defects for determining a cause of the defects and an associated adjustment to a manufacturing parameter.
  • the computing device may send information related to the associated adjustment to the manufacturing parameter to cause, at least in part, at least one control signal to be sent to control at least one manufacturing parameter based on the determined associated adjustment.
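The three steps summarized above (receive inspection images, recognize a defect pattern, cause a control signal to be sent) can be sketched as a simple pipeline. The callables below are hypothetical stand-ins for the machine-learning models and controller interfaces described later in the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Adjustment:
    parameter: str   # e.g., a machining parameter such as machine speed
    delta: float     # recommended change to that parameter

def process_inspection(images, recognize_pattern, map_to_adjustment, send_control):
    """End-to-end flow: images in, control signal out.

    recognize_pattern, map_to_adjustment, and send_control are
    illustrative placeholders, not interfaces defined by the patent.
    """
    pattern = recognize_pattern(images)       # e.g., recurring nodules
    adjustment = map_to_adjustment(pattern)   # cause -> parameter adjustment
    send_control(adjustment)                  # at least one control signal
    return adjustment
```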
  • FIG. 1 illustrates an example architecture of a system able to perform inspection of products and adjustment of manufacturing parameters according to some examples.
  • FIG. 2 illustrates an example arrangement for inspecting products according to some implementations.
  • FIG. 3 is a diagram illustrating an example process for performing inspection of products and adjusting manufacturing parameters according to some implementations.
  • FIGS. 4 and 5 are flow diagrams illustrating an example process for performing inspection of products and adjusting manufacturing parameters according to some implementations.
  • FIG. 6 is a flow diagram illustrating a defect determining process for identifying defects based on received images according to some implementations.
  • FIG. 7 illustrates an example distribution of nodule and dent sizes and an example distribution 702 of nodule diameter vs. height according to some implementations.
  • FIGS. 8A and 8B illustrate example data structures for maintaining inspection information according to some implementations.
  • FIG. 9 illustrates an overview of a framework for building the image processing MLM according to some implementations.
  • FIG. 10 is a flow diagram illustrating an example process for preparing input and output images according to some implementations.
  • FIG. 11 illustrates an example framework and process for execution of the image processing MLM according to some implementations.
  • FIG. 12 illustrates an example graph showing a plot of training loss and validation loss according to some implementations.
  • FIG. 13 illustrates an example of calculating defect size accuracy according to some implementations.
  • FIG. 14 illustrates an example histogram showing inspection results according to some implementations.
  • FIG. 15 is a flow diagram illustrating an example process performed following completion of inspection according to some implementations.
  • FIG. 16 illustrates an example user interface for viewing inspection information according to some implementations.
  • FIG. 17 is a flow diagram illustrating an example process for adjusting one or more manufacturing parameters according to some implementations.
  • FIG. 18 illustrates an example architecture of an artificial neural network (ANN) that may be used as the adjustment MLM according to some implementations.
  • FIG. 19 illustrates a graph showing examples of different learning rates for finding a global minimum according to some implementations.
  • FIG. 20 is a flow diagram illustrating an example process for determining defect causes and adjusting manufacturing parameters according to some implementations.
  • FIG. 21 illustrates a table as an example data structure of an experimentally obtained training dataset according to some implementations.
  • FIG. 22 illustrates select example components of the service computing device(s) that may be used to implement at least some of the functionality of the systems described herein.
  • Some implementations herein are directed to techniques and arrangements for adjusting a manufacturing parameter based on surface quality inspection. Examples may include a novel adaptive learning rate updating algorithm for determining appropriate parameter adjustments.
  • the system herein may be capable of classifying and counting different types of defects and based on detected defect patterns, identifying root causes of the defects and determining one or more manufacturing parameters to adjust for eliminating the defects by changing or otherwise adjusting the one or more manufacturing parameters.
  • the adjustments to the manufacturing parameters may include one or more of changing a machining parameter, such as machine speed, tool type, or the like, changing a coating process, finishing process, heat treatment process, cleaning process, etc., changing a material used, changing equipment used, changing a conveying, transportation or handling process, or the like.
  • a high intensity light may be used for illuminating the product surface.
  • a product inspection support or other conveying and support system may be motorized with one or more motors, such as for product rotation and axial movement to fully automate the inspection process.
  • a line or area scan camera may be used to capture surface images of the product.
  • a robotic arm or other manipulation and conveying system may be provided for moving products into and out of the inspection support.
  • the system may include at least one computing device that may execute an inspection program that configures the computing device to coordinate the inspection support and the camera for image capturing.
  • the computing device may capture and analyze images of the product and may provide an indication to an operator of the condition of a product.
  • the computing device may activate a light indicator based on the detected condition, e.g., a red light may indicate a product that does not pass, and a green light may indicate a product that does pass.
  • the computing device or another computing device may determine one or more defect patterns based on the inspection results.
  • the inspection results from multiple products may be sent to a service computing device over a network where an analysis may be performed.
  • the service computing device may execute one or more artificial neural networks (ANNs) or other machine-learning models (MLMs) using an adaptive learning rate algorithm disclosed herein.
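The patent's specific adaptive learning rate updating algorithm is not reproduced here; the sketch below is a generic "bold driver" style scheme that conveys the general idea, growing the step size while the loss keeps falling and cutting it back on overshoot (compare the learning rate curves of FIG. 19):

```python
def adaptive_gradient_descent(grad, loss, x0, lr=0.1, steps=100,
                              grow=1.05, shrink=0.5):
    """Gradient descent with a simple adaptive learning rate.

    This is an illustrative sketch, not the algorithm disclosed in the
    patent: an accepted step (loss did not increase) grows the learning
    rate; a rejected step (loss increased) discards the move and shrinks it.
    """
    x = x0
    prev = loss(x)
    for _ in range(steps):
        x_new = x - lr * grad(x)
        cur = loss(x_new)
        if cur <= prev:          # improvement: accept step, speed up
            x, prev = x_new, cur
            lr *= grow
        else:                    # overshoot: reject step, slow down
            lr *= shrink
    return x
```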
  • the results of the analysis may be used, e.g., by a user device or a control computing device to control one or more machine controllers for adjusting one or more parameters of machining processes and/or for determining one or more design, material or equipment modifications, or for making other changes to the manufacturing procedures.
  • a user at a user device may employ a web portal for remote monitoring and data access.
  • Implementations disclosed herein may be used for product surface quality inspection, classifying defects, counting defects and performing analysis to determine a machining process or other manufacturing process that is an underlying cause of the detected defects.
  • the system may determine appropriate corrective action, such as adjusting one or more manufacturing parameters.
  • the examples disclosed herein may be expanded to other shapes and types of products and other manufacturing areas where defect size estimation, system control automation, and/or remote monitoring is applicable.
  • some implementations are described in the environment of determining the surface quality of a product.
  • implementations herein are not limited to the particular examples provided, and may be extended to other types of machined products, other machining techniques, other types of machine tools, other types of computer-aided manufacturing (CAM) systems, and so forth, as will be apparent to those of skill in the art in light of the disclosure herein.
  • FIG. 1 illustrates an example architecture of a system 100 able to perform inspection of products and adjustment of manufacturing parameters according to some examples.
  • the system 100 includes one or more service computing devices 102 that are able to communicate with, or otherwise coupled to one or more control computing devices 104, such as through one or more networks 106. Further, the service computing device(s) 102 are able to communicate over the one or more networks 106 with one or more user computing devices 108.
  • the service computing device(s) 102 may include one or more servers, personal computing devices, workstations, or any of various other computing devices.
  • the control computing device(s) 104 and the user device(s) 108 may be any of various types of computing devices, as discussed additionally below.
  • the service computing device(s) 102 may include one or more servers that may be embodied in any number of ways.
  • the programs, other functional components, and at least a portion of data storage of the service computing device(s) 102 may be implemented on at least one server, such as in a cluster of servers, a server farm, a data center, a cloud-hosted computing service, and so forth, although other computer architectures may additionally or alternatively be used.
  • the service computing device(s) 102 may include one or more of personal computers, desktops, laptops, or any other computing device with sufficient processing, storage, and communication capabilities. Additional details of the service computing device(s) 102 are discussed below with respect to FIG. 22.
  • the one or more networks 106 may include any suitable network, including a wide area network, such as the Internet; a local area network (LAN), such as an intranet; a wireless network, such as a cellular network, a local wireless network, such as Wi-Fi, and/or short-range wireless communications, such as BLUETOOTH®; a wired network including Fibre Channel, fiber optics, Ethernet, or any other such network, a direct wired connection, or any combination thereof. Accordingly, the one or more networks 106 may include both wired and/or wireless communication technologies. Components used for such communications can depend at least in part upon the type of network, the environment selected, or both. Protocols for communicating over such networks are well known and will not be discussed herein in detail. Implementations herein are not limited to any particular type of network as the network(s) 106.
  • Each control computing device 104 may be any suitable type of computing device such as a server, desktop, laptop, tablet computing device, and/or any other type of computing device able to perform the functions described herein.
  • An operator 110 may be associated with the control computing device 104 and may further be associated with an inspection station 112 and one or more manufacturing devices 114.
  • the manufacturing device(s) 114 may be used to produce a plurality of parts, items or other products of manufacture referred to herein as products 116.
  • the manufacturing devices 114 may include one or more tool controllers 118, such as a first tool controller 118(1), a second tool controller 118(2), a third tool controller 118(3), ... and so forth.
  • each tool controller 118 may control one or more manufacturing parameters that may be controllable or otherwise adjustable for manufacturing the products 116.
  • the products 116 may be provided to an inspection station 112 for inspection of the products 116 for quality control purposes, such as to ensure that the products 116 meet one or more design specifications, which may include specified surface quality.
  • the surface finish or other surface quality may be inspected to check for irregularities such as scratches, dents, or raised nodules on the surface of the products 116.
  • the inspection station 112 may include an inspection support 120 that may receive a product 116 for inspection. As discussed additionally below, the inspection support 120 may be configured to move or otherwise reorient the product 116 to enable inspection of an entire area of interest of the product 116, such as the cylindrical surface of the product 116.
  • the inspection station 112 may include a manipulator or other conveyor controller 122 that may control a manipulator or other conveyor 124 for moving the products 116 onto and off of the inspection support 120.
  • the manipulator/conveyor 124 may include a robotic arm although implementations herein are not limited to any particular type of manipulator or other conveyor 124.
  • the inspection station may include a camera 126 and a light source 128.
  • the light source 128 may be projected against the surface of the product 116 to improve the ability of the camera 126 to detect any irregularities or other defects in the surface of the product 116 while the inspection support 120 is controlled by a support controller 130 to control the position of the surface of the product 116 relative to the camera 126 and the light source 128.
  • the images captured by the camera 126 may be processed and analyzed by the control computing device 104 and when a particular product 116 is determined to have a defect, an indicator 132 may be activated to provide a signal to the operator 110.
  • the manipulator or other conveyor 124 may transfer the product into one of a plurality of bins 134.
  • bins 134 may include a first bin 134(1) for products that passed, a second bin 134(2) for products that have dented surfaces, a third bin 134(3) for products that have nodules on their surfaces, and a fourth bin 134(4) for products that have scratched surfaces.
  • Examples of nodules may include small bumps, such as raised rounded lumps of material that extend upward from the surface of the product 116.
  • the nodules herein may be between 10 and 200 micrometers in length and width.
  • examples of dents may include pits, hollows, indentations, or the like that extend into the surface of the product 116.
  • the dents herein may be from 70 micrometers to over 1 mm in length and width.
  • examples of scratches may include furrows, grooves, scores, and other elongated marks formed into the surface of the product 116.
  • a scratch may have a length that is more than twice the width of the scratch.
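As a rough illustration, the geometric ranges quoted above (nodules raised and roughly 10 to 200 micrometers across, dents recessed and 70 micrometers to over 1 mm across, scratches more than twice as long as they are wide) can be turned into a rule-of-thumb classifier. This is only a sketch based on those ranges, not the image processing MLM 142:

```python
def classify_defect(length_um, width_um, height_um):
    """Classify a measured surface feature by its geometry.

    height_um > 0 means raised above the surface (candidate nodule);
    height_um < 0 means extending into the surface (candidate dent).
    Thresholds mirror the ranges quoted in the text and are illustrative.
    """
    if length_um > 2 * width_um:
        return "scratch"               # elongated mark
    if height_um > 0 and 10 <= max(length_um, width_um) <= 200:
        return "nodule"                # raised rounded lump
    if height_um < 0 and max(length_um, width_um) >= 70:
        return "dent"                  # pit, hollow, or indentation
    return "pass"                      # within tolerance / unclassified
```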
  • the control computing device 104 may include an inspection program 140 that may receive images from the camera 126 for each product 116 being inspected.
  • the inspection program 140 may process the images for identifying defects based on the images.
  • the inspection program may employ an image processing machine-learning model (MLM) 142, such as a trained neural network or other suitable machine-learning model, for processing the images.
  • the inspection program 140 may make a decision as to whether the particular product 116 passes or fails the inspection.
  • the inspection program 140 may decide the type of defect that caused the product to not pass inspection such as due to the product being scratched, having nodules, or being dented.
  • the inspection program 140 may be configured to send a control signal to the manipulator/conveyor controller 122 for ensuring that the inspected product 116 is deposited into the correct bin 134 following inspection.
  • the inspection program 140 may send a control signal to the indicator 132 to provide a visual indication to the operator 110 as to whether the product 116 passed or failed the inspection.
  • the inspection program 140 may present, on a display (not shown in FIG. 1) of the control computing device 104, a graphical user interface (GUI) that provides details of the inspection of each product 116 and/or the overall results of the inspections of multiple products 116.
  • the inspection program 140 may be configured to send inspection information 144 to the service computing device 102.
  • the inspection program 140 may employ one or more application programming interfaces (APIs) to communicate with the service computing device 102 to send the inspection information 144 to the service computing device 102 to enable analysis of the inspection information 144 as discussed additionally below.
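The inspection information 144 could, for example, be serialized as JSON before being posted through such an API. The field names below are illustrative assumptions; the patent does not specify a schema, only that one or more APIs carry the inspection information to the service computing device 102:

```python
import json

def build_inspection_payload(product_id, defects):
    """Assemble inspection information for one product as a JSON string,
    ready to be sent over any HTTP client. The record layout here is a
    hypothetical example, not a format defined by the patent.
    """
    record = {
        "product_id": product_id,
        "passed": len(defects) == 0,
        "defect_count": len(defects),
        "defects": defects,   # e.g., [{"type": "nodule", "size_um": 55}]
    }
    return json.dumps(record)
```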
  • the control computing device 104 may include a machine control program 146.
  • the machine control program 146 may be executed to cause the control computing device 104 to send control signals to the manufacturing device 114 such as for controlling the tool controllers 118 and for changing or otherwise adjusting various manufacturing parameters of the tool controllers 118.
  • the control computing device 104 may include a web application 148 that may be accessed by the user computing device 108 for enabling interaction with the inspection program 140 and/or the machine control program 146.
  • the user computing device 108 may include one or more user application(s) 150 which may include a browser in some examples and/or a dedicated application in other examples.
  • Each user device 108 may be any suitable type of computing device such as a desktop, laptop, tablet computing device, mobile device, smart phone, wearable device, terminal, and/or any other type of computing device able to send data over a network.
  • a user 152 may be associated with the user device 108 such as through a respective user account, user login credentials, or the like.
  • the user device 108 may be configured to communicate with the service computing device(s) 102 through the one or more networks 106 through any suitable type of communication connection for accessing a web application 154 on the service computing device(s) 102.
  • each user device 108 may include one or more user applications 150 that may execute on the user device 108, such as for communicating with the web application 148 on the control computing device 104 and/or the web application 154 on the service computing device 102.
  • the application 150 may include a browser or may operate through a browser, while in other cases, the application 150 may include any other type of application having communication functionality enabling communication with the web applications 148 and 154 or other applications on the service computing device(s) 102 or the control computing device 104.
  • the service computing device(s) 102 may execute an analysis program 156, which may configure the service computing device 102 to input the inspection information 144 (e.g., captured images showing defects) into one or more adjustment machine-learning models (MLMs) 160.
  • the adjustment MLM(s) 160 may be used for identifying patterns in the defects detected by the inspection program 140 and for determining one or more recommended adjustments 162 to manufacturing parameters of the manufacturing machine(s) 114 and/or other design or equipment recommendations, material recommendations, handling or transportation recommendations and so forth.
  • the service computing devices 102 may include a machine-learning model (MLM) building program 164 that may be used for configuring, training, testing, validating, retraining and otherwise building the adjustment MLM(s) 160.
  • MLM building program 164 may also be used to build, train, test, validate and retrain the image processing MLM 142.
  • a different computing device and/or model building program may be used for this function.
  • the user 152 may use the web application 154 on the service computing device 102 to view the results of the analysis program such as for obtaining the recommended adjustments 162 from the service computing device 102.
  • the web application 154 may provide a GUI that the user 152 may use to view the inspection information 144 and the recommended adjustments 162.
  • the user 152 may use the web application 154 for configuring the analysis program 156 and/or the adjustment MLM(s) 160.
  • the user may access the web application 148 for accessing the machine control program 146, or may access the machine control program 146 directly, depending on the configuration of the machine control program 146. In either event, the user 152 may apply one or more parameter adjustments 170 to the machine control program 146. This may result in the machine control program 146 sending one or more parameter control signals 172 to the manufacturing device 114.
  • the parameter control signals 172 may control at least one operation parameter of one or more of the tool controllers 118.
  • the operation parameters may be adjusted to attempt to correct one or more defects detected by the inspection program 140, as discussed additionally below.
  • in some cases, the recommended adjustments 162 are sent to the user computing device 108.
  • the recommended adjustments 162 may be sent directly to the machine control program 146, such as via one or more APIs.
  • the machine control program 146 receives the recommended adjustments 162 and may send corresponding parameter control signals 172 to one or more of the tool controllers 118 of the manufacturing devices 114.
  • the system 100 may operate essentially autonomously for improving the quality of the products 116.
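That autonomous loop, with recommended adjustments arriving and being dispatched as parameter control signals 172 to the matching tool controllers 118, might be sketched as follows; the controller callables are hypothetical stand-ins for the real tool controller interfaces:

```python
def apply_adjustments(recommended, tool_controllers):
    """Dispatch each recommended parameter adjustment to its controller.

    recommended: dict mapping parameter name -> new value.
    tool_controllers: dict mapping parameter name -> callable that
    accepts the new value (an illustrative stand-in for sending a
    parameter control signal to a tool controller).
    Returns the list of parameters that were actually dispatched.
    """
    applied = []
    for name, value in recommended.items():
        if name in tool_controllers:
            tool_controllers[name](value)   # send the control signal
            applied.append(name)
    return applied
```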
  • the control computing device 104 and the service computing device 102 may be the same computing device which may perform all the functions described for both computing devices.
  • the user computing device 108 and the service computing device 102 may be the same computing device which may perform all the functions described for both devices.
  • the control computing device 104 may perform all the functions described herein, and the service computing device 102 and the user computing device 108 are not included.
  • FIG. 2 illustrates an example arrangement 200 for inspecting products 116 according to some implementations.
  • the inspection support 120 includes a pair of rollers 202 and 204 for supporting a product 116 during inspection.
  • the rollers 202, 204 may be driven by an electric motor 206, or the like, for causing rotation of the rollers 202, 204.
  • the motor(s) 206 may be controlled by the support controller 130 under control of the inspection program 140, such as for rotating the product 116 and for advancing the product along its central axis 214 to enable portions of the product 116 to be inspected successively.
  • a first motor 206 may control rotation of the product 116
  • a second motor 206 may control axial advancement of the product 116 during inspection for inspecting different sections of the product 116.
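The coordinated rotate-then-advance sequencing described above might look like the following, where the callables stand in for the support controller 130 and camera 126 interfaces (they are illustrative, not APIs defined by the patent):

```python
def scan_product(rotate, advance, capture, sections, lines_per_rev):
    """Scan a cylindrical product section by section.

    For each axial section, rotate the product through one full turn
    while capturing scan lines, then advance it along its central axis
    214 to the next section.
    """
    images = []
    for section in range(sections):
        for _ in range(lines_per_rev):
            rotate(1)                  # one rotation increment
            images.append(capture())   # one scan line from the camera
        if section < sections - 1:
            advance()                  # move to the next axial section
    return images
```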
  • a robotic arm 216 is included as the manipulator and/or conveyor 124.
  • the robotic arm 216 may be controlled to place a product 116 onto the inspection support 120 for inspection and may thereafter remove the product 116 from the inspection support 120 and place the product 116 into the appropriate bin 134 as discussed above with respect to FIG. 1.
  • another robotic arm (not shown) having the camera 126 and/or light 128 installed thereon may be controlled to scan the product surface for image acquisition.
  • the light source 128 may illuminate the surface 220 of the product 116.
  • the light source 128 may be a high intensity LED, laser or the like.
  • the camera 126 may be directed toward the illuminated surface 220 of the product 116 for capturing images of the surface 220 of the product 116.
  • the camera 126 may be directed toward the surface 220 at such an angle that the light 222 reflected off the surface 220 does not reflect directly toward the camera 126, which may be referred to as “dark field illumination”.
  • the camera may be positioned based on other configurations, e.g., to receive reflected light directly, referred to as “bright field illumination”.
  • the camera 126 may be a line scan camera or the like.
  • when there are no defects on the surface 220 of the product 116, a mostly dark image will be recorded by the camera 126, as indicated at 224, which indicates a surface finish of sufficient quality as shown by an example closeup view 226 of the surface 220.
  • on the other hand, as illustrated in the example at 230, if there are defects on the surface 220 of the product 116, such as scratches, nodules, or dents, the defects cause the reflected light to scatter, as indicated at 232.
  • the scattered light 232 is received by the camera 126 and recorded as one or more captured images 234 that may include one or more bright areas 236.
  • the size and shape of the bright area(s) 236 may be related to the defect size, shape, and type.
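A minimal way to locate and size such bright areas is thresholding followed by connected-component labeling. The flood-fill sketch below is illustrative only, since the described system processes the images with the image processing MLM 142:

```python
def bright_regions(image, threshold=128):
    """Return the pixel count of each connected bright region.

    image is a 2-D list of grayscale values; bright pixels on a dark
    background correspond to light scattered by a defect, so each
    region's size is a rough proxy for defect size.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, count = [(y, x)], 0
                seen[y][x] = True
                while stack:            # flood fill one region
                    cy, cx = stack.pop()
                    count += 1
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(count)
    return sizes
```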
  • a rotation speed of the product 116 may be based on product diameter or other product dimension of the product 116, as well as camera line rate and field of view of the camera 126.
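For a line scan camera, that relationship can be made concrete: if each captured line should advance the surface by one pixel, the surface speed equals the line rate times the pixel size, and the rotation speed follows from the circumference. The formula below is a geometric sketch under that one-pixel-per-line assumption, not a setting specified by the patent:

```python
import math

def rotation_speed_rpm(diameter_mm, line_rate_hz, pixel_size_mm):
    """Rotation speed so each camera line advances the surface one pixel.

    surface speed [mm/s] = line_rate * pixel_size; one revolution covers
    pi * diameter [mm]; convert revolutions per second to RPM.
    """
    surface_speed = line_rate_hz * pixel_size_mm      # mm per second
    rev_per_sec = surface_speed / (math.pi * diameter_mm)
    return rev_per_sec * 60.0
```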
  • the camera position may be controlled by a three axis linear stage 240 and a two axis goniometer stage 242 to obtain superior image quality.
  • the surface 220 of the product 116 may be moved relative to the light source 128 and the camera 126, such as toward or away from the light source, to expose successively different portions of the product surface 220 around the circumference of the product 116.
  • the product 116 may be advanced in the axial direction and the light source 128 and camera 126 may begin scanning a next section of the product surface 220 by again rotating the product 116 about its axis 214.
  • the product 116 may be sufficiently short in length to be able to be scanned in one rotation of the product 116.
  • multiple cameras 126 and light sources 128 may be employed to scan the product 116 in a single rotation while capturing the entire surface 220 of the product 116.
  • the examples herein are not limited to using a camera and visible light for detecting the defects in the product 116.
  • alternative examples may include an eddy current sensor instead of a camera and light for detecting variations in a magnetic field applied adjacent to the surface 220 of the product 116, such as by an electromagnet or the like.
  • ultrasound may be used for detecting surface defects such as may be generated and/or detected by a piezoelectric transducer, capacitive micromachined ultrasonic transducers (CMUTs), combinations thereof, or the like.
  • FIGS. 3-6, 9-11, 15, 17 and 20 include flow diagrams illustrating example processes according to some implementations.
  • the processes are illustrated as collections of blocks in logical flow diagrams, which represent a sequence of operations, some of which can be implemented in hardware, software or a combination thereof.
  • the blocks may represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, program the processors to perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures and the like that perform particular functions or implement particular data types. The order in which the blocks are described should not be construed as a limitation.
  • FIG. 3 is a diagram illustrating an example process 300 for performing inspection of products and adjusting manufacturing parameters according to some implementations.
  • the process 300 may be executed at least partially by at least one control computing device 104 executing the inspection program 140 and machine control program 146, and further by at least one service computing device 102 or other suitable computing device executing the analysis program 156.
  • the control computing device 104 may perform image processing to detect defects based on the inspection.
  • the control computing device may send the defect information to the service computing device 102, allowing a user of the user computing device 108 to access the surface quality information (e.g., defect types, sizes, and counts) remotely.
  • the analysis program 156 may determine defect patterns (type, density, orientation, etc.) using the adjustment MLM(s) 160. For example, errors in a lathing process may generate streak-like, large and dense nodule defects, while errors in a polishing process may generate smaller nodules having less density. Furthermore, scratches may be caused by a cleaning process and dent defects may be generated during transportation, conveying and handling. With such defect pattern information, the root causes of the defects may be determined and the corresponding machine process, handling process or other manufacturing parameter may be further determined by the adjustment MLM 160 which, according to some examples herein, may include an adaptive learning rate algorithm for fast and accurate determination of recommended adjustments to machine parameters or other manufacturing parameters.
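As a rough illustration of the pattern-to-root-cause mapping described above, a lookup could mirror the stated rules. The function, thresholds, and density values here are hypothetical stand-ins for the adjustment MLM 160's learned behavior, not part of the source:

```python
def root_cause(defect_type: str, size_um: float, density: float) -> str:
    """Heuristic lookup mirroring the defect patterns described in the text."""
    if defect_type == "nodule":
        # Large, dense nodules point to the lathing process;
        # smaller, sparser nodules point to polishing.
        return "lathing" if size_um > 60 and density > 0.5 else "polishing"
    if defect_type == "scratch":
        return "cleaning"
    if defect_type == "dent":
        return "transport/handling"
    return "unknown"
```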
  • control computing device 104 may control the manipulator or other conveyor for placing a next product 116 for inspection.
  • the manipulator/conveyor provides the product 116 to the inspection support 120 for inspection.
  • control computing device 104 may control the light source for illuminating the product 116 to be inspected.
  • control computing device 104 may control the camera to acquire captured images of the product 116.
  • control computing device 104 may control the inspection support for rotating the product 116 to conduct the inspection of the product 116.
  • control computing device 104 may process the images received from the camera to identify defects in the product 116.
  • control computing device 104 may send a control signal and/or present the inspection information.
  • control computing device 104 may send a control signal to the indicator 132, such as to cause the indicator 132 to present a visual indication or other indication as to whether the particular product 116 passed or failed the inspection.
  • control computing device 104 may display the defect types, number, and other inspection information on a display of the control computing device 104.
  • control computing device 104 may send the inspection information to the analysis program at the service computing device 102.
  • the service computing device 102 may execute the analysis program to apply the adjustment machine-learning model and adaptive learning to the inspection information to determine one or more recommended adjustments to the manufacturing parameters.
  • the control computing device 104 may send one or more control signals to one or more tool controllers 118(1)-118(M) to control respective manufacturing parameters based on the output of the analysis program.
  • the control computing device 104 may receive the recommended adjustments directly from the service computing device 102.
  • the user computing device (not shown in FIG. 3) may send the recommended adjustments to the control computing device 104, which may cause the control computing device 104 to send the control signals to the tool controllers 118.
  • the tool controllers 118(1)-118(M) may each control one or more respective manufacturing processes 324(1)-324(M).
  • FIGS. 4 and 5 are flow diagrams illustrating an example process 400 for performing inspection of products and adjusting manufacturing parameters according to some implementations.
  • the process 400 may be executed at least partially by at least one control computing device 104 executing the inspection program 140 and machine control program 146, and further by at least one service computing device 102 or other suitable computing device executing the analysis program 156.
  • the control computing device 104 may initiate inspection of the products.
  • an operator may begin execution of the inspection program on the control computing device 104.
  • the inspection program may establish communication with the motors, camera, light source, robotic arm, conveyor, or the like as discussed additionally below.
  • if the software is unable to communicate with one or more of the components, an error code may be presented on the display of the control computing device 104.
  • the control computing device 104 may determine whether there is a connection to the camera and light source. If so, the process goes to 406. If not, the process goes to 408.
  • control computing device 104 may determine whether there is a connection to the manipulator and/or conveyor. If so, the process goes to 410. If not, the process goes to 408.
  • [0073] At 408, if the control computing device 104 is unable to communicate with any of the components of the system, the control computing device 104 may send an error code for presentation on a display associated with the control computing device 104.
  • the control computing device 104 may receive inspection parameters for the product to be inspected.
  • the inspection parameters may include the outer diameter of the product, the number of sections of the product to be inspected, and so forth.
  • the control computing device 104 may determine whether there is a product on the conveyor. For example, an optical sensor, proximity sensor, or the like may provide an indication to the control computing device as to whether there is a product present for transfer from the conveyor to the inspection support. If so, the process goes to 416. If not, the process goes to 414.
  • an optical sensor, proximity sensor, or the like may provide an indication to the control computing device as to whether there is a product present for transfer from the conveyor to the inspection support. If so, the process goes to 416. If not, the process goes to 414.
  • control computing device 104 may send a control signal to move the conveyor to the next position. The process may then return to block 412 to repeat block 412.
  • control computing device 104 may send a control signal to the manipulator to move the product to the inspection support.
  • a robotic arm may serve as the manipulator for moving the product to the inspection support.
  • the control computing device 104 may determine whether the product is on the inspection support. For example, an optical sensor, proximity sensor or the like may provide an indication to the control computing device 104 as to whether a product has been transferred to the inspection support. If not, the process may go to 420. If so, the process proceeds to 422.
  • an optical sensor, proximity sensor or the like may provide an indication to the control computing device 104 as to whether a product has been transferred to the inspection support. If not, the process may go to 420. If so, the process proceeds to 422.
  • the control computing device 104 may wait for a time T1 and may then repeat block 418 to determine whether the product is on the inspection support.
  • the time T1 may be less than one second, one second, several seconds, etc.
  • the control computing device 104 may turn on a first motor and the light source to begin the inspection of the first section of the product. For example, the control computing device 104 may control the speed of the rotation of the product based at least in part on the diameter of the product and the image capture abilities of the camera.
  • [0081] At 424, the control computing device 104 may determine whether the motor speed is correct. For example, the motor speed may be controlled based on the indicated outer diameter of the product. The motor speed may be detected by a closed loop feedback with the first motor, or by a sensor associated with the driveshaft or the product. If the motor speed is correct, the process goes to 428. If not, the process goes to 426.
  • the control computing device 104 may wait for a time T2 and then may repeat block 424.
  • the time T2 may be less than one second, one second, several seconds, etc.
  • the control computing device 104 may perform image capture of the product surface using the camera.
  • the images may be generally dark unless there is a defect on the surface of the product.
  • the captured images may be received by the control computing device 104 from the camera as they are captured.
  • the camera may be configured to send the captured images to the control computing device 104 in one or more batches.
  • the control computing device 104 may determine whether image capture is complete. For example, based on the motor speed and the indicated diameter of the product, the control computing device may determine whether the product has been rotated through at least one full revolution of its circumference. Furthermore, the determination of whether the image capture is complete may include determining whether there are multiple sections of the product to be inspected and whether this is the last section of the product. If image capture is complete, the process goes to tab A at FIG. 5. If not, the process returns to 428 to continue to perform image capture.
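The completeness check at block 428 can be sketched as a frame-count calculation over one full circumference. All parameter names and the example values are illustrative assumptions:

```python
import math

def frames_for_full_revolution(diameter_mm: float, pixel_size_mm: float,
                               rows_per_frame: int) -> int:
    """Number of image frames needed to cover one full circumference,
    assuming each captured row images one pixel-width strip of surface."""
    rows_needed = math.pi * diameter_mm / pixel_size_mm
    return math.ceil(rows_needed / rows_per_frame)
```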
  • FIG. 5 is a flow diagram including a continuation of the process 400 according to some implementations .
  • the control computing device 104 may determine whether the last section of the product has been inspected. If so, the process goes to 510. If not, the process goes to 504.
  • [0087] At 504, when there is still at least one more section of the product to be scanned, the control computing device 104 may turn on a second motor to advance the product to scan the next section of the product. For example, the motor may advance the product axially on the inspection support so that the next section is illuminated by the light source.
  • the control computing device 104 may determine whether the position is correct to continue scanning. For example, the control computing device 104 may receive input from one or more sensors associated with the inspection support and/or the product on the inspection support to determine whether the product has been advanced axially with respect to the light source and camera so that the next section of the product is illuminated by the light source. If the position is correct, the process may go to 509. If the position is not correct, the process may go to 508.
  • the control computing device 104 may wait for a time T3 and then repeat block 506.
  • the time T3 may be less than one second, one second, several seconds, etc.
  • control computing device 104 may turn off the second motor, and then return to tab B of FIG. 4 to continue the image capture as discussed above at block 424.
  • control computing device 104 may return the support to a position for receiving the next product.
  • control computing device 104 may turn off the first and second motors if either of them are still on.
  • control computing device 104 may process and analyze the captured images for the product being inspected.
  • control computing device 104 may determine the condition of the product, e.g., whether the product is defective and if so the type of defect.
  • control computing device 104 may determine whether the product passed inspection. If so, process goes to 522. If not, the process goes to 520.
  • control computing device 104 may send a control signal to the manipulator to move the product to a bin corresponding to the reason for the failure.
  • Example defects discussed above with respect to FIG. 1 may include surface scratches, nodules on the surface, or dents in the surface.
  • control computing device 104 may send a control signal to cause the manipulator to move the product to a bin for receiving products that passed inspection.
  • control computing device 104 may determine whether the manipulator and/or other conveyor is ready to continue with inspection of a next product. If not, the process goes to 526. If so, the process goes to tab C of FIG. 4 to advance the conveyor to obtain the next product for inspection.
  • the control computing device 104 may wait for a time T4 and then repeat block 524.
  • the time T4 may be less than one second, one second, several seconds, etc.
  • at least one of the control computing device 104 or the service computing device 102 may determine a defect pattern based on a plurality of defects detected in a plurality of the products 116.
  • the service computing device 102 may execute the analysis program and the adjustment MLM to determine one or more adjustments to the manufacturing processes used for manufacturing the products 116.
  • the control computing device 104 may send an instruction or a control signal based on the one or more determined adjustments.
  • the control computing device may include a control program that may directly control the manufacturing processes by communication with one or more tool controllers as discussed above.
  • the control computing device 104 or the service computing device 102 may send an instruction or other communication to the user computing device 108 such as for instructing the user to make one or more adjustments to the manufacturing parameters.
  • FIG. 6 is a flow diagram illustrating a defect determining process 600 for identifying defects based on received images according to some implementations.
  • the process 600 may be executed by the control computing device 104 or other suitable computing device executing the inspection program 140.
  • the camera 126 may be operated to start capturing image frames.
  • the whole product surface may be divided into certain image frames, and based on product diameter, each image frame may include a fixed number of rows. The number of frames may be determined by camera memory, camera resolution, line speed, etc., for high quality image capture and transfer speed.
  • implementations herein provide multiple image processing techniques.
  • a first image processing technique herein may process the images without employing the image processing MLM 142, while a second image processing technique may employ the image processing MLM 142.
  • the image processing MLM 142 may include a convolutional neural network (CNN), but implementations herein are not limited to such.
  • the image processing speed is a consideration to enable a real time online inspection system.
  • the first image processing technique may be approximately three times faster than the second image processing technique that employs the image processing MLM 142. Nevertheless, this result may be highly dependent on the background intensity and quality of the captured image frames. For example, when the background mean intensity differs substantially from an optimal value (e.g., as illustrated in FIG. 9 below), the defect detection performed using the first image processing technique may be inaccurate.
  • the implementations herein may first determine whether the image is suitable for processing using the first technique or the second technique and may then apply the appropriate technique based on the determined background intensity of the captured image. In particular, when the captured image frame is determined to be poor, implementations herein may employ the image processing MLM 142 during image processing.
  • control computing device 104 may receive captured image frames from the camera of the inspection system.
  • the control computing device 104 may check the background intensity of the received image frame. For example, if the background intensity varies substantially, e.g., from dark to light in the image, then the background intensity is categorized as “poor”, and the process may go to 612 to use the image processing MLM to improve clarity of the image before further processing. On the other hand, if the background intensity is relatively uniform across the image, then the background intensity may be classified as “good”, and the process may proceed to 606 for processing the image without use of the image processing MLM.
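The routing decision at 604 could be sketched as a simple uniformity test; the use of standard deviation and the threshold value are assumptions, not the patent's stated criterion:

```python
import numpy as np

def background_is_good(img: np.ndarray, max_std: float = 12.0) -> bool:
    """Classify the background intensity as 'good' (relatively uniform across
    the frame) or 'poor' (varies substantially, e.g., dark to light),
    to select between the non-MLM and MLM processing paths."""
    return float(np.std(img.astype(float))) <= max_std
```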
  • control computing device 104 may perform fast defect detection on the received image to identify potential defect candidates in the image.
  • the control computing device 104 may apply a Gaussian blur to the image.
  • the Gaussian blur (various kernel size, e.g., 3x3, 5x5, etc.) may be applied to the image frame to smooth the image and remove noise.
  • control computing device 104 may add weights to sharpen the image.
  • the smoothed version of the image may be subtracted from the original image in a weighted manner so that the values of a constant area remain constant.
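The blur-then-weighted-subtraction steps above amount to an unsharp-mask operation. A minimal NumPy sketch, assuming an 8-bit grayscale frame and an illustrative 3x3 Gaussian kernel (in practice OpenCV's blur functions and various kernel sizes could be used, as the text notes):

```python
import numpy as np

def unsharp_mask(img: np.ndarray, weight: float = 1.5) -> np.ndarray:
    """Smooth with a 3x3 Gaussian, then subtract the smoothed image in a
    weighted manner so that constant areas keep their values."""
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    padded = np.pad(img.astype(float), 1, mode="edge")
    blurred = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            blurred += kernel[dy, dx] * padded[dy:dy + img.shape[0],
                                               dx:dx + img.shape[1]]
    # (1 + w) * img - w * blurred leaves constant regions unchanged
    sharpened = (1.0 + weight) * img - weight * blurred
    return np.clip(sharpened, 0, 255).astype(np.uint8)
```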
  • control computing device 104 may use the trained image processing MLM 142 to clarify defects in the image and even out the background intensity. Additional details of training and using the image processing MLM 142 are discussed below.
  • control computing device 104 may perform fast defect detection on the received image to identify potential defect candidates in the image.
  • blocks 608 and 610 may be executed following blocks 612 and 614.
  • the control computing device 104 may perform edge detection on the image to identify edges of the candidate defects in the image.
  • the edge detection may include determining the edges of bright areas in the image frame.
  • the control computing device 104 may perform a morphological transform on the image.
  • the morphological transform may be used to connect possible black pixels within white areas.
  • control computing device 104 may determine contours within the image.
  • control computing device 104 may merge and/or discard candidate defects.
  • bright areas may be labeled with rectangles, such as by using OPENCV functions, or the like. Further, labeled areas that have only one or two bright pixels may be treated as noise and may be discarded. Furthermore, labeled areas that lie within other labeled areas may be merged together.
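The merge/discard step can be sketched on plain bounding rectangles (x, y, w, h). In practice the rectangles would come from OpenCV contour and bounding-box functions, which are assumed rather than shown here:

```python
def contains(outer, inner):
    """True if rectangle `inner` lies entirely within `outer`."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def merge_and_discard(rects, min_area=3):
    """Drop one- or two-pixel labels as noise; fold nested labels into the
    enclosing label, keeping only the outermost rectangles."""
    rects = [r for r in rects if r[2] * r[3] >= min_area]
    kept = []
    for r in rects:
        if not any(contains(other, r) for other in rects if other != r):
            kept.append(r)
    return kept
```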
  • the control computing device 104 may perform height estimation and size calculation of the remaining labeled candidate areas. For example, the total defect sizes and numbers in the image may be determined.
  • the aspect ratio and size information may be used to classify the defects. As one example, if the defect aspect ratio is larger than 2, the defect may be classified as a scratch.
  • this technique may not be accurate for classifying nodules and dents because of their similar circular shapes. To overcome this problem, the sizes of a large number of nodule defects and dent defects were measured under a microscope to empirically determine the relative size distributions of dents and nodules.
  • the control computing device 104 may determine whether the aspect ratio of an identified defect is greater than or equal to 2. If so, the process goes to 634. If not, the process goes to 628.
  • control computing device 104 may determine whether the size of the outer diameter of the defect is less than or equal to 85 micrometers. If so, the process goes to 630. If not, the process goes to 632.
  • the control computing device 104 may label the defect as a nodule.
  • control computing device 104 may label the defect as a dent.
  • the control computing device 104 may label the defect as a scratch.
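The decision at blocks 626-634 can be sketched as a small classifier, with the thresholds taken directly from the text:

```python
def classify_defect(aspect_ratio: float, outer_diameter_um: float) -> str:
    """Label a defect per the rules above: elongated shapes are scratches,
    and round shapes are split into nodules vs. dents at 85 micrometers."""
    if aspect_ratio >= 2:
        return "scratch"
    if outer_diameter_um <= 85:
        return "nodule"
    return "dent"
```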
  • the control computing device 104 may output the image and a comma separated values (.CSV) file for the image.
  • the product condition and the defect information may be saved in .CSV files, e.g., as discussed additionally below with respect to FIG. 8.
  • FIG. 7 illustrates an example distribution 700 of nodule and dent sizes and an example distribution 702 of nodule diameter vs. height according to some implementations.
  • the data for the distributions was obtained by measuring defects using a microscope.
  • nodule defects are indicated at 704, and the nodule sizes were determined to generally range from 2 micrometers to 90 micrometers with a Gaussian distribution.
  • dent sizes were determined to generally range from 70 micrometers to 300 micrometers.
  • a size threshold 708 of 85 micrometers may be used in some examples herein to distinguish nodules from dents with an error rate of approximately 1.5 percent.
  • the measured nodule heights are graphed against the corresponding diameters to provide a calibration curve 708, which may be expressed as a function of X and Y as indicated at 710. Accordingly, using this relationship between height and diameter, implementations herein may estimate the height of a nodule based on the diameter determined from a captured image. In particular, the nodule heights can be determined using the calibration curve 708 and corresponding function 710 indicating the relationship between nodule diameter and height.
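A hedged sketch of how such a calibration might be applied: the measurement pairs and the functional form (a simple least-squares polynomial fit) are assumptions for illustration; the actual function 710 is not specified in the text:

```python
import numpy as np

# Hypothetical microscope measurements: (diameter_um, height_um) pairs
diameters = np.array([10.0, 20.0, 40.0, 60.0, 80.0])
heights = np.array([2.0, 4.0, 8.0, 12.0, 16.0])

# Fit a calibration curve height = f(diameter); degree 1 chosen for the sketch
coeffs = np.polyfit(diameters, heights, deg=1)

def estimate_height(diameter_um: float) -> float:
    """Estimate nodule height from the diameter measured in a captured image."""
    return float(np.polyval(coeffs, diameter_um))
```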
  • FIGS. 8 A and 8B illustrate example data structures for maintaining inspection information according to some implementations.
  • FIG. 8A includes a data structure 800 for maintaining inspection information for individual products.
  • the data structure 800 includes a product identifier (ID) 802, a condition 804 of the product determined during the inspection, a time 806 of the inspection, a length 808 of the product, a diameter 810 of the product, and a cycle time 812 for the inspection.
  • FIG. 8B includes a data structure 820 for presenting aggregated defect information determined for different sections of a plurality of inspected products.
  • the data structure 820 includes section column 822, which includes first, second, third, and fourth sections.
  • the data structure 820 further includes columns for various identified defects such as nodules less than 30 micrometers 824, nodules less than 60 micrometers
  • the inspection information such as that contained in the data structures 800 and 820, and/or other inspection information, may be provided to the service computing device 102 for analysis as discussed additionally below.
  • FIG. 9 illustrates an overview of a framework 900 for building the image processing MLM 142 according to some implementations.
  • the image processing process may be performed without using the image processing MLM 142 and may be effective when the captured image frames have a constant background intensity.
  • the image processing MLM 142 may be employed to even out the background intensity.
  • a dataset 904 may initially be generated. For example, a plurality of captured images 902 may be received as indicated at 906. In addition, the images 902 may be subsequently labeled as indicated at 908 to generate one or more data sets 904 that may be used for training and testing the image processing MLM 142. For example, following training, if the error rate of the output image is less than a threshold error rate, the training is done, and the trained MLM may then be deployed for use as discussed above with respect to FIG. 6
  • the images 910 may be labeled as discussed below with respect to FIG. 10.
  • the labeled images 910 may represent the information that the MLM 142 will be trained to learn, e.g., locations, sizes and/or other characteristics of defects.
  • the image processing MLM 142 is a neural network
  • an architecture including types and number of layers of the neural network, as well as the structure of their interconnectivities may be defined as indicated at 912.
  • the dataset 904 consisting of both the captured images 902 and the labeled images 910 may be input into the neural network during training.
  • the output of the image processing MLM 142 is compared with the label image and the resulting error is projected backwards through the MLM 142. Based on this result, one or more weights of the MLM 142 may be determined and adjusted to reduce the overall error as indicated at 916. As indicated at 918, when the training is completed, the weights may be provided to the control computing device 104 for use by the control computing device 104 as discussed above, e.g., with respect to FIG. 6.
  • the accuracy and performance of the image processing MLM 142 may be evaluated with a set of test images to determine whether the results are sufficiently accurate, and the image processing MLM 142 may be periodically retrained such as based on using additional training data to improve the accuracy of the image processing MLM 142.
  • FIG. 10 is a flow diagram illustrating an example process 1000 for preparing input and output images according to some implementations.
  • a labeled version of each of the training set images may be created for training the image processing MLM 142. Since many defects may appear very small in the image, e.g., occupying only a few pixels, the image processing MLM 142 may make a decision for each pixel in an image as to whether the pixel corresponds to a defect or not. To accomplish this goal, a binary label image may be generated, on which every defect pixel is white and every background pixel is black.
  • a binary threshold is applied to the image to mark the bright defect clearly.
  • a sharpening operation may first be applied to increase the contrast of the image before applying the binary threshold. By first sharpening the image, less bright defects can be found on the resulting binary image, at the cost of a very noisy detection of the bright defects.
  • the two techniques may be combined to get both advantages of clearly found bright defects and the detection of less bright defects by simply adding or otherwise combining the two binary images for creating a labeled image. Then a region of interest (ROI) may be identified to crop the labeled image for use as an MLM output training image for MLM output training.
  • ROI region of interest
  • the same ROI on the captured image may be cropped and then the brightness of the cropped image may be varied by multiplying by a random factor between 0.25 and 4, while the corresponding labeled image remains the same.
  • the MLM 142 can learn to rely not only on brightness, but also on the shape and appearance of the defects.
  • the quality of the output may be nearly the same for different input dimensions. This circumstance allows for cropping of the training images into smaller parts and, by doing so, increases the size of the training dataset.
  • the cropping of a small image part of a training image, e.g., 128 x 128 pixels
  • the output dimensions of the last layer should fit the image input size. Since most defects may be only a few pixels in size, there are no pooling layers in the MLM, since small defects may be lost after downsampling the feature maps. Accordingly, the size of the feature maps may be the same all over the MLM 142. In addition, in some examples, there may be no fully connected layers, which maintains the spatial information of the feature maps. In some examples, such as in the case of a convolutional neural network, the mentioned constraints may lead to an architecture of the MLM 142 which may consist only of convolutional layers followed by ReLU (Rectified Linear Unit) layers as non-linear activation functions.
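These constraints (stride-1 convolutions with "same" padding, no pooling, no fully connected layers) can be sanity-checked with a small sketch. The layer counts, filter counts, and kernel sizes below are illustrative assumptions, not the patent's actual architecture:

```python
# Illustrative layer specs: (num_filters, kernel_size); early layers use few
# filters with large kernels, deeper layers use more filters with smaller
# kernels, per the design guidance in the text.
LAYERS = [(8, 9), (16, 7), (32, 5), (64, 3)]

def feature_map_size(input_hw, layers):
    """Trace the spatial size through conv layers with 'same' padding.

    With stride 1 and padding of (kernel - 1) // 2 on each side, every layer
    preserves the spatial dimensions, so small defects are never lost to
    downsampling and the output can match the input pixel-for-pixel.
    """
    h, w = input_hw
    for _, kernel in layers:
        pad = (kernel - 1) // 2
        h = h + 2 * pad - kernel + 1  # conv output height, stride 1
        w = w + 2 * pad - kernel + 1  # conv output width, stride 1
    return (h, w)
```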
  • the process 1000 of FIG. 10 may be executed by a service computing device 102 or other suitable computing device executing a model building program such as the model building program 164 discussed above with respect to FIG. 1.
  • the computing device may receive a training image.
  • the computing device may sharpen the image such as by applying contrast weightings to the image or through any of various other techniques.
  • the computing device may apply a binary threshold limit to the image such that lighter areas are made white and darker areas are made black.
  • the computing device may also apply a binary threshold limit to the image without first sharpening the image.
  • the computing device may combine the images from 1006 and 1008 to create a labeled image.
  • the computing device may crop a region of interest of the labeled image to a specified pixel size such as 128 by 128 pixels or any other suitable cropped image size.
  • the region of interest may be randomly selected.
  • the computing device may provide the cropped image as a training MLM output image.
  • the computing device may determine the area of the image cropped at 1012 and may crop the original image to the same region of interest as in the output image provided at 1014.
  • the computing device may vary the brightness of the image.
  • the brightness may be varied using any suitable technique such as by multiplying a random factor between 0.25 and 4.
  • the cropped image is provided as an MLM training input image.
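The steps above (threshold, sharpen-then-threshold, combine, crop the same ROI, vary input brightness) can be sketched in NumPy. The threshold value, the crude contrast-boost used as the "sharpening" step, and the crop size are illustrative assumptions:

```python
import numpy as np

def make_label_image(img: np.ndarray, thresh: int = 128) -> np.ndarray:
    """Combine a plain binary threshold with a sharpened-then-thresholded
    version, so both bright and less bright defects end up white (1)."""
    plain = (img > thresh).astype(np.uint8)
    # Crude contrast boost standing in for the sharpening operation
    sharpened = np.clip(img.astype(int) * 2 - int(img.mean()), 0, 255)
    boosted = (sharpened > thresh).astype(np.uint8)
    return np.maximum(plain, boosted)  # union of the two binary images

def crop_pair(img, label, y, x, size=128, rng=None):
    """Crop the same ROI from image and label; vary input brightness only,
    by a random factor between 0.25 and 4, leaving the label unchanged."""
    rng = rng or np.random.default_rng(0)
    factor = rng.uniform(0.25, 4.0)
    inp = np.clip(img[y:y + size, x:x + size] * factor, 0, 255)
    return inp.astype(np.uint8), label[y:y + size, x:x + size]
```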
  • while a technique for generating training images for training the image processing MLM 142 is described above, numerous variations or alternative techniques will be apparent to those of skill in the art having the benefit of the disclosure herein.
  • FIG. 11 illustrates an example framework and process 1100 for execution of the image processing MLM 142 according to some implementations.
  • the image processing MLM 142 may be a convolutional neural network with ReLU layers after the convolutional layers.
  • the degrees of freedom in the MLM of FIG. 11 may include the number of layers, the number of filters of each layer, and the filter kernel size of each layer. These parameters may be set in such a way that the first convolutional layers have a few filters and a large kernel size, which can learn rough, high-level features. Subsequently, going deeper into the neural network, the number of filters may be increased and the filter kernel size may be reduced to cover more complex lower-level features.
  • an image may be input into the image processing MLM.
• the image processing MLM may apply the convolutional layers 1106(1) through 1106(L) for determining a uniform background and identifying areas corresponding to defects that are not part of the background.
  • the convolutional layers 1106 may be configured such that the kernel size decreases and the number of filters increases during advancement from layer 1106(1) to layer 1106(L).
• the image processing MLM may output an enhanced image with improved background uniformity and/or improved contrast of any defects with the background.
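The layer-sizing heuristic described above (few filters with large kernels early, more filters with smaller kernels deeper) can be sketched as a configuration generator. The starting values below are illustrative assumptions, not taken from the disclosure:

```python
def conv_layer_plan(num_layers=4, first_filters=8, first_kernel=9, min_kernel=3):
    """Return per-layer settings: filter count grows, kernel size shrinks."""
    plan = []
    filters, kernel = first_filters, first_kernel
    for layer in range(1, num_layers + 1):
        plan.append({"layer": layer, "filters": filters, "kernel": kernel})
        filters *= 2                           # more filters for lower-level features
        kernel = max(min_kernel, kernel - 2)   # smaller kernels deeper in the network
    return plan
```

Such a plan could then be consumed by whatever deep-learning framework builds the actual convolutional stack.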
  • FIG. 12 illustrates an example graph 1200 showing a plot of training loss and validation loss according to some implementations.
• training may include a process of optimizing the weights of the MLM 142 with respect to a loss function. This may include calculating the difference between an output image that is output by the MLM 142 and the ground truth represented by the corresponding labeled image.
• the Sigmoid-Cross-Entropy loss E may be used for calculating the loss, which is given by the following formula:
  E = -(1/N) * Σ_{n=1}^{N} [ p̂_n log(p_n) + (1 - p̂_n) log(1 - p_n) ]
• N denotes the number of pixels of the image
• p̂_n ∈ {0, 1} represents the label value of the nth pixel
• p_n represents the predicted value of the nth pixel.
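The per-pixel sigmoid cross-entropy loss can be sketched directly from its definition. This NumPy sketch uses my own argument names (`labels` for the ground-truth values, `preds` for the predicted values) and adds a small `eps` clip for numerical stability, which the formula itself does not mention:

```python
import numpy as np

def sigmoid_cross_entropy_loss(labels, preds, eps=1e-7):
    """E = -(1/N) * sum_n [ t_n*log(p_n) + (1 - t_n)*log(1 - p_n) ]."""
    p = np.clip(np.asarray(preds, dtype=np.float64), eps, 1.0 - eps)
    t = np.asarray(labels, dtype=np.float64)
    return float(-np.mean(t * np.log(p) + (1.0 - t) * np.log(1.0 - p)))
```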
  • Stochastic Gradient Descent or other suitable algorithm may be employed.
  • the dataset may be divided into three groups, e.g., a training dataset, a validation dataset, and a testing dataset.
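A simple way to make the three-way split is an index-based partition; the particular fractions used here are an assumption for illustration, as the disclosure does not specify ratios:

```python
def split_dataset(items, train_frac=0.7, val_frac=0.15):
    """Split items into training, validation, and testing subsets (in order)."""
    n = len(items)
    i = int(n * train_frac)                # end of the training subset
    j = int(n * (train_frac + val_frac))   # end of the validation subset
    return items[:i], items[i:j], items[j:]
```

In practice the items would typically be shuffled before splitting so each subset is representative.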
  • a forward pass processing input through the MLM 142
  • a backward pass in which the error is propagated back in order to adjust the weights of the MLM 142.
• during validation, only the forward pass is applied.
  • the validation images have no influence on the MLM 142, but the loss of the validation images may be compared to the loss of the training images to determine whether the MLM 142 is overfitting. For example, overfitting may occur when the loss of the validation dataset is much higher than the loss for the training dataset.
  • the graph 1200 of FIG. 12 illustrates an example of the training loss 1202 and validation loss 1204 over the epochs (one epoch equals one pass through the whole dataset).
  • the training loss 1202 fluctuates substantially with respect to the validation loss 1204 in some earlier epochs, but the fluctuations are reduced in later epochs, e.g., after 20 epochs.
• FIG. 13 illustrates an example 1300 of calculating defect size accuracy according to some implementations.
• empirical determinations of structural parameters (number of filters, kernel size, etc.)
• hyperparameters (learning rate, batch size, etc.)
• an additional method can be used to quantify the quality of the MLM result, which may be expressed as the average of width and height accuracy (referred to herein as the ACC value).
  • the final defect size accuracy may be determined by bounding boxes after applying a contour determination function.
  • the size of each resulting rectangle of the output image may be compared to the corresponding rectangle of the corresponding labeled image. Since nearly no false positives occur, any lingering false positives do not affect the result. On the other hand, missed defect detections are considered with a size accuracy of zero for the underlying defect.
  • FIG. 13 illustrates details of examples for calculating the size accuracy and shows a labeled image 1302 including three white rectangles 1304 representative of a defect.
• Three example output images are illustrated as output images 1306(1), 1306(2) and 1306(3). Since most of the defects tend to have a circular or rounded shape, implementations herein may consider the accuracy of the diameter prediction (“diameter” herein may include a maximum length and maximum width when a shape is not actually round). Accordingly, rather than comparing the numbers of pixels within the bounding boxes, it may be considered sufficient to compare the widths and heights of the bounding boxes (which may be the same as the diameter in most cases).
• both the width and height of the bounding box(es) on the output image 1306 may be compared with the corresponding dimensions of the bounding box(es) of the labeled image 1302, and their accuracies averaged.
  • This mean ACC value represents the accuracy of the diameter or defect, respectively.
  • the ACC value of every defect in the training image may be averaged to calculate the size accuracy for the whole image, and based on this, the best architecture for the MLM 142 may be selected.
• the first output image 1306(1) includes three white rectangles 1308 which match the three white rectangles 1304 of the labeled image 1302. Accordingly, as shown at 1310, the ACC may be calculated to be “1” in this example since the output image matches the labeled image 1302, and therefore the height and width of the defect match.
  • the second output image 1306(2) includes two white rectangles 1312. In this example, as indicated at 1314, the heights of the defects match but the width of the output image defect is only half of the width of the labeled image 1302. Accordingly, the ACC of the second output image may be calculated to be “2/3” as shown at 1314.
  • the third output image 1306(3) includes one white rectangle 1316. In this example, the defect detected in the output image 1306(3) is only half the height and half the width of the defect in the labeled image 1302. Accordingly, the ACC of the third output image may be calculated to be “1/2”, as shown at 1318.
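The per-defect size accuracy described above can be sketched as follows. This is a simplified sketch that assumes output boxes have already been matched to label boxes (missed detections passed as `None` score zero), and assumes a `(width, height)` box format; both are my assumptions for illustration:

```python
def size_accuracy(label_boxes, matched_output_boxes):
    """Average diameter (width/height) accuracy over all labeled defects."""
    accs = []
    for (lw, lh), match in zip(label_boxes, matched_output_boxes):
        if match is None:          # missed detection: size accuracy of zero
            accs.append(0.0)
            continue
        ow, oh = match
        w_acc = min(ow, lw) / max(ow, lw)    # width accuracy (<= 1)
        h_acc = min(oh, lh) / max(oh, lh)    # height accuracy (<= 1)
        accs.append((w_acc + h_acc) / 2.0)   # mean ACC for this defect
    return sum(accs) / len(accs)
```

Note the patent's own worked examples may aggregate the per-dimension accuracies differently; this sketch shows only the general width/height comparison idea.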
  • FIG. 14 illustrates an example histogram 1400 showing inspection results according to some implementations. For example, after images from an entire product surface are processed for detecting defects, all the defect information may be classified, sized, and counted to generate the histogram 1400.
  • the histogram 1400 includes detailed inspection information, such as type, size, and count of the nodule defects, as shown at 1402, and scratch defects as shown at 1404, for an inspected product. An additional histogram (not shown) may be generated if there are any dent defects detected.
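Classifying, sizing, and counting defects into histogram bins might look like the sketch below; the `(type, size)` tuple format and the 10-unit bin width are assumptions made for illustration:

```python
from collections import Counter

def defect_histogram(defects, bin_width=10):
    """defects: iterable of (type, size) pairs; returns counts per type/size bin."""
    hist = Counter()
    for dtype, size in defects:
        lo = int(size // bin_width) * bin_width   # lower edge of the size bin
        hist[(dtype, f"{lo}-{lo + bin_width}")] += 1
    return dict(hist)
```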
  • FIG. 15 is a flow diagram illustrating an example process 1500 performed following completion of inspection according to some implementations.
  • the process 1500 may be performed in part by the control computing device 104 and in part by the service computing device 102 discussed above with respect to FIG. 1.
  • the control computing device 104 may determine or otherwise receive the inspection results such as through execution of the inspection program.
• the control computing device may present the defect type and count information on the local display, such as by presentation of the histogram discussed above with respect to FIG. 14 and/or presentation of a data structure based on comma-separated value (CSV) information, e.g., as discussed above with respect to FIG. 8.
• the control computing device 104 may apply rules to determine pass/fail information based on the inspection results.
  • the inspection program may be configured to apply one or more decision-making rules for determining whether the particular product has passed or failed the inspection based on the defect types, number of defects detected, etc.
  • the decision-making rules may be applied to determine a primary defect type that caused the failure decision (i.e., nodules, dents, or scratches).
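A minimal sketch of such decision-making rules, assuming per-type defect-count limits (the limit values and the "most over its limit" tie-break are hypothetical choices, not taken from the disclosure):

```python
def apply_rules(defect_counts, limits):
    """Return (passed, primary_defect_type).

    Fails when any defect count exceeds its limit; the primary defect type
    is taken as the type that exceeds its limit by the largest margin."""
    over = {t: c - limits.get(t, 0)
            for t, c in defect_counts.items() if c > limits.get(t, 0)}
    if not over:
        return True, None
    return False, max(over, key=over.get)
```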
• the control computing device 104 may send a control signal to activate a local indicator light or other indicator 132 based on the inspection results and the rule determination.
  • the indicator 132 mentioned above may be activated to provide a light signal or other indication of the determined product condition.
• the control computing device 104 may send a control signal to move the product to an appropriate bin based on the inspection results and the rule determination.
  • control computing device 104 may send the inspection results to the service computing device 102.
  • the control computing device 104 may send the raw image data, processed images, and .CSV files containing all the defect information as the inspection information 144 discussed above.
  • the service computing device 102 may provide the inspection results via a web portal, such as through a web application or the like.
• a user computing device may access the web portal to obtain and display the defect information on the user computing device.
  • the control computing device 104 may include a web application that may allow the user to access the inspection information through the control computing device 104.
  • the service computing device 102 may analyze aggregated inspection information for a plurality of products from a same batch, same location, same time period, or the like.
• the service computing device 102 may include one or more folders or other storage locations in which the inspection information may be stored.
  • the inspection information may be saved to a specific folder based on a location of the inspection system, a product type, a date of the inspection, and so forth.
• the service computing device 102 may execute the analysis program and an adjustment machine-learning model for determining one or more adjustments to make to a manufacturing process based on analysis of the defects identified in the inspection information.
  • control computing device 104 may receive one or more recommended parameter adjustments from the service computing device 102 or, alternatively, from the user computing device 108.
  • control computing device 104 may send one or more control signals to adjust one or more manufacturing parameters controlled by the control computing device 104.
  • the control signals may be sent based on the received recommended parameter adjustments for attempting to prevent the defects identified in the inspection information from occurring in subsequently manufactured parts.
  • FIG. 16 illustrates an example user interface 1600 for viewing inspection information according to some implementations.
  • the user interface 1600 may be presented by a web application, webpage, dedicated application, or the like.
• in the left part 1602 of the user interface 1600, a user may be presented with graphics and/or virtual controls that enable the user to perform a plurality of different operations, such as selecting a region 1604, selecting a facility location 1606, selecting a product 1608, viewing product specifications 1610, viewing historical inspection information 1612, or viewing one or more webcams 1614, such as webcams associated with an inspection station, a manufacturing station, or the like.
  • the user may access the historical inspection information 1612 to view the aggregated inspection information such as ratio of good parts to bad parts, types of and numbers of defects detected for each part, and so forth.
  • the right side 1616 of the user interface 1600 may include two tabs, namely an online tab 1618 and an offline tab 1620.
  • the online tab 1618 may be selected to cause the user interface 1600 to present the current product inspection results, such as a histogram showing a number of nodules and scratches in the most recently inspected product, as indicated at 1622, as well as one or more data structures indicating recent inspection information as indicated at 1624.
  • the online tab 1618 may be used to view numerous other types of information such as a daily production goal, the inspection system status, the operator currently on duty and their contact information and/or a webcam view showing a live view of the inspection system (not shown in FIG. 16).
  • the offline tab 1620 may be selected by the user to obtain additional options for information, such as viewing product inspection results based on date, product type, etc., and viewing product surface images (raw and/or processed images) and so forth. Numerous other variations will be apparent to those of skill in the art having the benefit of the disclosure herein.
  • FIG. 17 is a flow diagram illustrating an example process 1700 for adjusting one or more manufacturing parameters according to some implementations.
  • the process 1700 may be executed by the service computing device 102 or other suitable computing device such as by executing the analysis program 156.
  • the computing device may receive defect information.
  • the service computing device 102 may receive inspection information 144 that may include information about defects in a plurality of products inspected at the inspection station as discussed above.
  • the computing device may determine nodule counts, sizes, and locations for a plurality of the inspected products.
  • the computing device may determine dent counts, sizes, and locations for a plurality of the inspected products.
  • the computing device may determine scratch counts, sizes, and locations for a plurality of the inspected products.
  • the computing device may determine one or more nodule patterns from the nodule counts, sizes, and locations of the plurality of inspected products and may determine one or more adjustments to one or more manufacturing parameters based on the determined pattern(s). For instance, the computing device may match the pattern with one or more of a plurality of known patterns such as a first pattern 1712(1), a second pattern 1712(2), a third pattern 1712(3), a fourth pattern 1712(4), ..., each of which may have an associated parameter adjustment such as a first parameter adjustment 1714(1), a second parameter adjustment 1714(2), a third parameter adjustment 1714(3), a fourth parameter adjustment 1714(4), ..., and so forth.
  • the computing device may recommend a parameter adjustment 1714.
  • the pattern may be a “good” pattern in which case the parameter adjustment recommendation may be a material recommendation, an equipment recommendation, a conveying, transportation or handling recommendation, or the like, or no adjustment at all.
  • the determination of patterns and corresponding parameter adjustments may be performed by using the one or more adjustment MLMs 160. Details of building and using the one or more adjustment MLMs 160 are discussed additionally below.
• the computing device may determine one or more dent patterns from the dent counts, sizes, and locations of the plurality of inspected products and may determine one or more adjustments to one or more manufacturing parameters based on the determined pattern(s). For instance, the computing device may match the pattern with one or more of a plurality of known patterns such as a first pattern 1718(1), a second pattern 1718(2), a third pattern 1718(3), a fourth pattern 1718(4), ..., each of which may have an associated parameter adjustment such as a first parameter adjustment 1720(1), a second parameter adjustment 1720(2), a third parameter adjustment 1720(3), a fourth parameter adjustment 1720(4), ..., and so forth.
  • the computing device may recommend a parameter adjustment.
  • the determination of patterns and corresponding parameter adjustments may be performed by using the one or more adjustment MLMs 160. Details of building and using the one or more adjustment MLMs 160 are discussed additionally below.
• the computing device may determine one or more scratch patterns from the scratch counts, sizes, and locations of the plurality of inspected products and may determine one or more adjustments to one or more manufacturing parameters based on the determined pattern(s).
  • the computing device may match the pattern with one or more of a plurality of known patterns such as a first pattern 1724(1), a second pattern 1724(2), a third pattern 1724(3), a fourth pattern 1724(4), ..., each of which may have an associated parameter adjustment such as a first parameter adjustment 1726(1), a second parameter adjustment 1726(2), a third parameter adjustment 1726(3), a fourth parameter adjustment 1726(4), ..., and so forth.
  • the computing device may recommend a parameter adjustment.
  • the determination of patterns and corresponding parameter adjustments may be performed by using the one or more adjustment MLMs 160. Details of building and using the one or more adjustment MLMs 160 are discussed additionally below.
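At its simplest, the pattern-to-adjustment matching of FIG. 17 reduces to a lookup keyed by the recognized pattern; in the disclosure this mapping may instead be learned by the adjustment MLM 160. The pattern names and adjustment strings below are placeholders invented for illustration:

```python
# Hypothetical pattern-to-adjustment table (placeholder names and values).
PARAMETER_ADJUSTMENTS = {
    "nodules_aligned_with_furrows": "reduce lathing feed speed",
    "dense_fine_nodules": "change polishing grit level",
    "circumferential_scratches": "adjust cleaning process",
}

def recommend_adjustment(pattern):
    """Return the parameter adjustment for a recognized defect pattern, if any."""
    return PARAMETER_ADJUSTMENTS.get(pattern, "no adjustment recommended")
```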
  • FIG. 18 illustrates an example architecture of an artificial neural network (ANN) 1800 that may be used as the adjustment MLM 160 according to some implementations.
  • the ANN 1800 may employ adaptive learning, as discussed additionally below.
  • the ANN 1800 includes a plurality of artificial neurons able to communicate with each other via the artificial neural network 1800.
• Examples of inputs 1804 that may be input into the ANN 1800 may include defect type X1, defect size X2, defect density X3, defect pattern X4, and so forth.
• Each input 1804 may have a weight 1806 associated with it, such as weights W1, W2, W3, and W4, respectively.
  • the weights 1806 may be determined as discussed additionally below.
• the outputs 1808 of the ANN 1800 may be recommended parameter adjustments, such as speed change Y1, feeding change Y2, temperature change Y3, and so forth.
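The weighted inputs and outputs described above correspond to a standard feed-forward computation. A single dense layer with sigmoid activation is sketched below; the layer shape and choice of activation are assumptions, as the disclosure does not specify them:

```python
import numpy as np

def ann_forward(x, W, b):
    """One dense layer: y = sigmoid(W @ x + b).

    x: input features (e.g., defect type, size, density, pattern codes);
    y: output activations (e.g., recommended speed/feeding/temperature changes)."""
    z = W @ x + b
    return 1.0 / (1.0 + np.exp(-z))
```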
  • FIG. 19 illustrates a graph 1900 showing examples of different learning rates for finding a global minimum according to some implementations.
  • examples herein include an adaptive learning rate that configures the artificial neural network to achieve higher accuracy and a higher convergence rate as compared to conventional techniques that use a constant learning rate, time-based decay, step-based decay, adaptive gradient algorithm (AdaGrad), or root mean square propagation (RMSprop). While AdaGrad and RMSprop may sometimes be referred to as having an adaptive learning rate, these algorithms suffer such problems as monotonically decreasing learning rate or some hyperparameter that must be determined in advance.
• the following equations may be used according to some examples herein:
  EQ(1): E = sqrt( (1/N) * Σ_{i=1}^{N} (D_i - T_i)^2 )
  EQ(2): W_{t+1} = W_t - η * (∂E/∂W_t)
• the first equation EQ(1) shows the root mean square error (RMSE) function used in the artificial neural network of MLM 160, where D_i is the network output from a training set and T_i is the targeted output.
• the second equation EQ(2) shows the weight update function for weights W_t based on a constant learning rate η.
  • the learning rate controls how quickly the neural network model is adapted to the problem. Smaller learning rates require more training epochs given the smaller changes made to the weights for each update, whereas larger learning rates result in rapid changes and require fewer training epochs. However, a learning rate that is too large can cause the model to converge too quickly or may cause the model to oscillate to a suboptimal solution, whereas a learning rate that is too small can cause the process to get stuck.
  • FIG. 19 includes examples of a constant learning rate 1902, an overly large learning rate 1904, and the adaptive learning rate 1906 described herein.
  • the constant learning rate may converge at a global minimum convergence 1908, but may not be as accurate as the ANN configured using the adaptive learning rate 1906 herein.
  • the overly large learning rate 1904 merely provides large oscillations and fails to converge accurately.
  • the adaptive learning rate 1906 converges progressively toward the global minimum convergence 1908 through multiple increasingly accurate iterations 1910 from a starting point 1912.
• adaptive learning rate algorithms typically also suffer from issues such as a monotonically decreasing learning rate (e.g., AdaGrad) or predetermination of some other parameter related to the learning rate (e.g., a factor in RMSprop).
  • the adaptive learning algorithm herein only uses one predetermined parameter, which is usually between 0 and 1, and overcomes the issues suffered by AdaGrad.
• the adaptive learning rate algorithm herein updates an initial learning rate by dividing it by the ratio of the current error over the previous error. Because the learning rate is defined by the ratio of errors in two consecutive steps, each weight to be determined may have its own learning rate, leading to accurate and rapid training of the neural network.
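One plausible reading of this error-ratio update is sketched below: the learning rate is rescaled each step by the ratio of consecutive errors, so a rising error shrinks the step and a falling error grows it, alongside the standard gradient-descent weight update. The exact scaling formula is an assumption, since the disclosure describes it only in words:

```python
def update_learning_rate(lr, prev_error, curr_error):
    """Adaptive per-weight learning rate: divide by the current/previous
    error ratio (one possible interpretation of the described algorithm)."""
    return lr / (curr_error / prev_error)  # equivalently lr * prev_error / curr_error

def weight_step(w, grad, lr):
    """Gradient-descent weight update: W_{t+1} = W_t - lr * dE/dW_t."""
    return w - lr * grad
```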
  • each defect count, size, and location may be analyzed to get its pattern, e.g., as discussed above with respect to FIG. 17. For example, within a frame window, the size and count of nodules can be used to determine the nodule pattern, which may then be used to locate the root cause, discussed below with respect to FIG. 20.
  • FIG. 20 is a flow diagram illustrating an example process 2000 for determining defect causes and adjusting manufacturing parameters according to some implementations.
  • the manufacturing steps prior to inspection for a product are illustrated.
  • a raw material is received at 2002 and lathing is performed at 2004 to turn the product to an initial desired diameter.
  • the product may be polished using a first grit level, and at 2008, the product may be further polished using a second grit level.
  • the product may be cleaned prior to electroplating at 2012.
  • the plated product is cleaned following the plating, and at 2016, the product is conveyed, transported, or the like for inspection at 2018.
  • the product may be determined to be properly finished and passed inspection.
  • nodules may be found during inspection.
  • at 2024 scratches or dents may be found during inspection.
• when nodules are found, this may typically be an indication of an improper machining process, and the pattern, size, density, and position of the nodules may be checked. For example, if there is a tooling issue during lathing, the surface of the product before electroplating may be very rough and, thus, the surface after electroplating may be extremely nodular, with nodules aligned along the lathing furrows, e.g., as shown by first nodule pattern 2028. By checking the nodule size, density, and alignment properties, the lathing issue can be identified by the adjustment MLM 160 and an adjustment recommended.
  • a parameter adjustment signal 2030 may be sent to the lathing machine, such as for changing the lathing tool bit, changing the spindle rotation speed, changing the feed speed, or the like.
  • grinding and polishing issues can also be identified based on various nodule sizes and densities. For example, a second nodule pattern 2032 may correspond to improper polishing at 2006, while a third nodule pattern 2034 may correspond to improper polishing at 2008.
  • the orientation and size of the dents or scratches may be determined.
  • dent defects may typically be caused by the conveying and/or transportation process or improper manual handling. If dent defects are detected, the root cause of the dents can thus be identified and a recommendation may be sent to a user, operator, or the like so that corresponding corrective action can be made.
• scratches may be more likely caused by the cleaning process but may also be caused by transportation in some cases. Based on the scratch orientations, the root cause of the scratch may also be predicted. For example, if scratches on a product surface are circumferential, the scratches are more likely caused by the cleaning process at 2014. Alternatively, axial scratches may be more likely caused by the conveying and/or transportation process. After identifying this defect pattern, one or more parameter adjustments 2038 may be made to the related manufacturing process to eliminate the defects.
  • FIG. 21 illustrates a table 2100 as an example data structure of an experimentally obtained training dataset 2102 according to some implementations.
  • the adjustment MLM 160 may be trained in a manner similar to that discussed above for the image processing MLM 142. For example, a large number of images of various defects having known root causes may be used for training the MLM 160 for identifying a root cause of a defect and selecting a parameter to adjust for correcting or otherwise adjusting the root cause.
  • experimentally obtained data may be used as an input-output training dataset 2102 used for training the adjustment MLM 160, such as in the case that the adjustment MLM 160 is an artificial neural network or other suitable type of machine learning model.
  • the data structure 2100 includes input parameters 2104 and output parameters 2106 for a plurality of adjustment trials.
  • the input parameters 2104 include trial number 2108, defect type 2110, size 2112, density 2114, and pattern 2116.
  • the output parameters 2106 include change in speed 2118, change in feeding 2120, and change in temperature 2122.
• integers (e.g., between 0 and 10) are assigned to each of the defect types 2110, ranges of sizes 2112, ranges of densities 2114, and patterns 2116; however, different valuation conventions may be used in other examples.
• while particular input parameters 2104 and output parameters 2106 are shown in this example, numerous other or different parameters may be included in other examples, as described elsewhere herein or as will be apparent to those of skill in the art having the benefit of the disclosure herein.
  • this defect information may be saved for use by the model building program 164, as discussed above.
  • various machining parameters such as feeding speed, rotation speed, temperature, etc., may be checked and adjusted by the operators to minimize or eliminate the defects on the products. Together with the corresponding surface defect data, these machining parameter adjustments may be used as the output parameters 2106 for the corresponding input parameters 2104 in the training dataset 2102. Alternatively, in other examples, the machining parameters may be changed first to generate corresponding surface defects.
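Encoding one trial row of the table into integer features might look like the sketch below. The type codes and bin widths are hypothetical, chosen only to illustrate the 0-10 integer convention described above:

```python
# Hypothetical integer codes for the categorical defect-type input parameter.
DEFECT_TYPE_CODES = {"nodule": 0, "dent": 1, "scratch": 2}

def encode_trial(defect_type, size_um, density_per_cm2, pattern_code):
    """Map one trial's defect observations to integer input features (0-10)."""
    size_bin = min(int(size_um // 10), 10)            # assume 10 um per size bin
    density_bin = min(int(density_per_cm2 // 5), 10)  # assume 5 defects/cm^2 per bin
    return [DEFECT_TYPE_CODES[defect_type], size_bin, density_bin, pattern_code]
```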
  • a large number of input-output training datasets 2102 can be generated for training, validating, and testing various MLM configurations, such as various ANN architectures with different layers, different numbers of neurons, and so forth.
• the architecture with the best performance, i.e., the smallest output error (e.g., absolute percentage error between predicted values and experimental values), may be chosen as the adjustment MLM 160 that is deployed.
  • the determination of the causes may be performed by providing the defect information as input to an adjustment MLM 160, such as the artificial neural network with the adaptive learning rate updating algorithm discussed above with respect to FIGS. 18 and 19.
  • the MLM 160 may be used for determination of the adjustments to the manufacturing parameters, such as feeding speed, rotation speed, etc.
• the ANN can converge faster with fewer training steps as compared to networks using currently available learning rate updating algorithms.
  • FIG. 22 illustrates select example components of the service computing device(s) 102 that may be used to implement at least some of the functionality of the systems described herein.
  • the service computing device(s) 102 may include one or more servers or other types of computing devices that may be embodied in any number of ways. For instance, in the case of a server, the programs, other functional components, and data may be implemented on a single server, multiple servers, a cluster of servers, a server farm or data center, a cloud-hosted computing service, and so forth, although other computer architectures may additionally or alternatively be used. Multiple service computing device(s) 102 may be located together or separately, and organized, for example, as virtual servers, server banks, and/or server farms.
  • the described functionality may be provided by the servers of a single entity or enterprise, or may be provided by the servers and/or services of multiple different entities or enterprises.
  • the service computing device(s) 102 includes, or may have associated therewith, one or more processors 2202, one or more computer-readable media 2204, and one or more communication interfaces 2206.
  • Each processor 2202 may be a single processing unit or a number of processing units, and may include single or multiple computing units, or multiple processing cores.
  • the processor(s) 2202 can be implemented as one or more central processing units, microprocessors, microcomputers, microcontrollers, digital signal processors, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processor(s) 2202 may include one or more hardware processors and/or logic circuits of any suitable type specifically programmed or configured to execute the algorithms and processes described herein.
  • the processor(s) 2202 may be configured to fetch and execute computer-readable instructions stored in the computer-readable media 2204, which may program the processor(s) 2202 to perform the functions described herein.
  • the computer-readable media 2204 may include volatile and nonvolatile memory and/or removable and non-removable media implemented in any type of technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data.
  • the computer-readable media 2204 may include, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, optical storage, solid state storage, magnetic tape, magnetic disk storage, storage arrays, network attached storage, storage area networks, cloud storage, or any other medium that can be used to store the desired information and that can be accessed by a computing device.
  • the computer-readable media 2204 may be a tangible non-transitory medium to the extent that, when mentioned, non-transitory computer-readable media exclude media such as energy, carrier signals, electromagnetic waves, and/or signals per se. In some cases, the computer-readable media 2204 may be at the same location as the service computing device 102, while in other examples, the computer-readable media 2204 may be partially remote from the service computing device 102.
  • the computer-readable media 2204 may be used to store any number of functional components that are executable by the processor(s) 2202.
  • these functional components comprise instructions or programs that are executable by the processor(s) 2202 and that, when executed, specifically program the processor(s) 2202 to perform the actions attributed herein to the service computing device(s) 102.
  • Functional components stored in the computer-readable media 2204 may include the analysis program 156, the web application 154, and the model building program 164, each of which may include one or more computer programs, applications, executable code, or portions thereof. Further, while these programs are illustrated together in this example, during use, some or all of these programs may be executed on separate service computing device(s) 102.
  • The computer-readable media 2204 may store data, data structures, and other information used for performing the functions and services described herein.
  • The computer-readable media 2204 may store the inspection information 144, the adjustment MLM(s) 160, and the recommended adjustments 162.
  • The service computing device 102 may also include or maintain other programs and data 2208, which may include various programs, drivers, etc., and the data used or generated by the functional components.
  • The service computing device 102 may include many other logical, programmatic, and physical components, of which those described above are merely examples that are related to the discussion herein.
  • The one or more communication interfaces 2206 may include one or more software and hardware components for enabling communication with various other devices, such as over the one or more network(s) 106.
  • The communication interface(s) 2206 may enable communication through one or more of a LAN, the Internet, cable networks, cellular networks, wireless networks (e.g., Wi-Fi), and wired networks (e.g., Fibre Channel, fiber optic, Ethernet), direct connections, as well as close-range communications such as BLUETOOTH®, and the like, as additionally enumerated elsewhere herein.
  • The service computing device 102 may include one or more input/output (I/O) devices 2210, such as a keyboard, mouse, display, touchscreen, or any of various other I/O devices as is known in the art.
  • The control computing device 104 and the user computing device 108 may have similar hardware configurations to those discussed above for the service computing device(s) 102, but with different functional components and data, such as discussed above with respect to FIG. 1. Further, the control computing device 104 and the user computing device 108 may include many other logical, programmatic, and physical components, of which those described above are merely examples that are related to the discussion herein.
  • Various instructions, methods, and techniques described herein may be considered in the general context of computer-executable instructions, such as computer programs and applications stored on computer-readable media and executed by the processor(s) herein.
  • The terms "program" and "application" may be used interchangeably, and may include instructions, routines, modules, objects, components, data structures, executable code, etc., for performing particular tasks or implementing particular data types.
  • These programs, applications, and the like may be executed as native code or may be downloaded and executed, such as in a virtual machine or other just-in-time compilation execution environment.
  • The functionality of the programs and applications may be combined or distributed as desired in various implementations.
  • An implementation of these programs, applications, and techniques may be stored on computer storage media or transmitted across some form of communication media.
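The bullets above describe functional components (analysis program 156, model building program 164) and data (inspection information 144, adjustment MLM(s) 160, recommended adjustments 162) stored together in the computer-readable media of a service computing device. The following sketch is purely illustrative: the class layout, field names, and rule logic are invented, and only the reference numerals in the comments come from the text.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical layout of the stored data and one functional component on a
# service computing device; numerals in comments follow the description.
@dataclass
class ServiceComputingDevice:
    inspection_information: List[dict] = field(default_factory=list)      # 144
    adjustment_models: Dict[str, Callable] = field(default_factory=dict)  # 160 (adjustment MLMs)
    recommended_adjustments: List[dict] = field(default_factory=list)     # 162

    def run_analysis(self) -> List[dict]:
        # Analysis program (156): apply each adjustment model to the
        # accumulated inspection records and collect the recommendations.
        self.recommended_adjustments = [
            model(record)
            for record in self.inspection_information
            for model in self.adjustment_models.values()
        ]
        return self.recommended_adjustments

device = ServiceComputingDevice()
device.inspection_information.append({"defect": "scratch", "count": 7})
# Toy stand-in for a trained adjustment model: slow the line when many defects appear.
device.adjustment_models["speed"] = lambda rec: {
    "parameter": "line_speed",
    "delta": -0.1 if rec["count"] > 5 else 0.0,
}
print(device.run_analysis())  # [{'parameter': 'line_speed', 'delta': -0.1}]
```

In practice the model building program (164) would train and periodically replace the entries of `adjustment_models`, while the web application (154) could expose `recommended_adjustments` to users; as the text notes, these components may also run on separate service computing devices.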

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

In some examples, a computing device may be configured to receive inspection information including a plurality of images of surfaces of inspected products, the images including defects on the surfaces of the inspected products. The computing device may recognize, from the plurality of images, a pattern in the defects to determine a cause of the defects and an associated adjustment to a manufacturing parameter. The computing device may send information related to the associated adjustment to the manufacturing parameter to cause, at least in part, at least one control signal to be sent for controlling at least one manufacturing parameter based on the determined associated adjustment.
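The claimed flow in the abstract (receive inspection images, recognize a pattern in the defects, infer a cause and an associated parameter adjustment, and trigger a control signal) can be sketched as below. Everything here is an assumption for illustration: the defect records are taken as already extracted from the images, the pattern rules and parameter names are invented, and the simple rule table stands in for whatever recognition method (e.g., a machine-learning model) an implementation would actually use.

```python
from collections import Counter
from typing import Dict, List

# Invented rule table: defect pattern -> (probable cause, parameter, adjustment)
PATTERN_RULES = {
    "repeating_streak": ("roller contamination", "roller_pressure", -0.05),
    "edge_cluster": ("misaligned guide", "guide_offset_mm", +0.2),
}

def recognize_pattern(defects: List[Dict]) -> str:
    # Toy recognizer: if at least half the defects share one x-position,
    # treat them as a repeating streak; otherwise as an edge cluster.
    xs = Counter(d["x"] for d in defects)
    if xs.most_common(1)[0][1] >= len(defects) // 2:
        return "repeating_streak"
    return "edge_cluster"

def determine_adjustment(defects: List[Dict]) -> Dict:
    # Map the recognized pattern to a cause and an associated adjustment.
    cause, parameter, delta = PATTERN_RULES[recognize_pattern(defects)]
    return {"cause": cause, "parameter": parameter, "delta": delta}

def send_control_signal(adjustment: Dict) -> str:
    # Stand-in for forwarding the adjustment to the control computing
    # device, which would issue the actual control signal.
    return f"SET {adjustment['parameter']} {adjustment['delta']:+}"

# Defects detected at the same x-position across the surface suggest a
# repeating streak from a rotating component.
defects = [{"x": 120, "y": y} for y in (10, 50, 90)] + [{"x": 300, "y": 40}]
adjustment = determine_adjustment(defects)
print(send_control_signal(adjustment))  # SET roller_pressure -0.05
```

The point of the sketch is the separation the abstract implies: pattern recognition and cause determination can live on a service computing device, while the control signal itself is issued closer to the machinery.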
PCT/US2020/030346 2020-04-29 2020-04-29 Adjustment of manufacturing parameters based on inspection WO2021221621A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/030346 WO2021221621A1 (fr) Adjustment of manufacturing parameters based on inspection


Publications (1)

Publication Number Publication Date
WO2021221621A1 (fr) 2021-11-04

Family

Family ID: 78373795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/030346 WO2021221621A1 (fr) 2020-04-29 2020-04-29 Adjustment of manufacturing parameters based on inspection

Country Status (1)

Country Link
WO (1) WO2021221621A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050002560A1 (en) * 2003-05-29 2005-01-06 Nidek Co., Ltd. Defect inspection apparatus
US20100245562A1 (en) * 2009-03-26 2010-09-30 Yujin Instec. Co., Ltd. Method and device for detecting shape of sheet roll
US20180071987A1 (en) * 2015-03-12 2018-03-15 Nikon Corporation Apparatus for manufacturing three dimensional shaped object, and method for manufacturing structure
US20200005511A1 (en) * 2018-06-28 2020-01-02 Adobe Inc. Determining Image Handle Locations


Similar Documents

Publication Publication Date Title
Rifai et al. Evaluation of turned and milled surfaces roughness using convolutional neural network
Sassi et al. A smart monitoring system for automatic welding defect detection
Chauhan et al. Fault detection and classification in automated assembly machines using machine vision
CN113592845A Defect detection method and apparatus for battery coating, and storage medium
Sacco et al. Automated fiber placement defects: Automated inspection and characterization
US20240160194A1 (en) System and method for manufacturing quality control using automated visual inspection
CN109840900B Online fault detection system and detection method for intelligent manufacturing workshops
US10636133B2 (en) Automated optical inspection (AOI) image classification method, system and computer-readable media
Garfo et al. Defect detection on 3d print products and in concrete structures using image processing and convolution neural network
Pernkopf et al. Visual inspection of machined metallic high-precision surfaces
JP2021515885A Method, apparatus, system, and program for setting illumination conditions, and storage medium
KR20210020065A System and method for finding and classifying patterns in an image with a vision system
Saeedi et al. Measurement and inspection of electrical discharge machined steel surfaces using deep neural networks
KR102676508B1 Acquisition and inspection of images of ophthalmic lenses
Kim et al. Systematic deep transfer learning method based on a small image dataset for spaghetti-shape defect monitoring of fused deposition modeling
Kumar et al. Tool wear classification based on machined surface images using convolution neural networks
KR20210050168A Method for augmenting training data for a deep learning model, and image classification apparatus and method using deep learning
CN115082719 Wood quality grading method
EP4285315A1 Systems and methods for paint defect detection using machine learning
US20230196189A1 (en) Measurement method and apparatus for semiconductor features with increased throughput
García-Moreno A fast method for monitoring molten pool in infrared image streams using gravitational superpixels.
Guo et al. Machine vision-based intelligent manufacturing using a novel dual-template matching: a case study for lithium battery positioning
Li et al. RCA: YOLOv8-Based Surface Defects Detection on the Inner Wall of Cylindrical High-Precision Parts
Jung et al. Anomaly Candidate Extraction and Detection for automatic quality inspection of metal casting products using high-resolution images
CN117085969 Artificial intelligence industrial visual inspection method, apparatus, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933673

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933673

Country of ref document: EP

Kind code of ref document: A1