US20240176337A1 - Industrial quality monitoring system with pre-trained feature extraction - Google Patents


Info

Publication number
US20240176337A1
Authority
US
United States
Prior art keywords
classifier
article
measurements
station
captured
Prior art date
Legal status
Pending
Application number
US18/072,220
Inventor
Felipe CONDESSA
Devin Willmott
Ivan BATALOV
João D. SEMEDO
Wan-Yi Lin
Bahare AZARI
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to US18/072,220 (US20240176337A1)
Assigned to ROBERT BOSCH GMBH. Assignors: BATALOV, Ivan; SEMEDO, JOAO D.; Willmott, Devin; CONDESSA, FILIPE; AZARI, BAHARE; LIN, WAN-YI
Priority to DE102023211505.0A (DE102023211505A1)
Priority to CN202311611385.0A (CN118112995A)
Publication of US20240176337A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875 Total factory control characterised by quality surveillance of production
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/32 Operator till task planning
    • G05B2219/32368 Quality control

Definitions

  • the feature extractor 108 may generate a feature vector that is a single value.
  • the generated feature vector may be a positive number indicating a predicted time until the article fails.
  • the feature extractor 108 may generate a feature vector that is a member of the set {0, 1}, indicating whether the article is faulty or anomalous, for example.
  • the feature extractor 108 may generate a feature vector that indicates a type of anomaly or failure.
  • the generated feature vector may be a member of the set {0, a1, a2, a3, . . . , an} where 0, a1, a2, a3, . . . , an are each a type of anomaly or failure of the article.
  • the feature vector may include a plurality of values, with each of the plurality of values being a member of the set {0, 1}, indicating whether a particular feature is present.
  • the aggregator 110 receives a feature vector of an article that is generated by the feature extractor 108 and receives an encoding of time series data from the encoder 104 .
  • the manufacturing data 102 are measurements that may be taken by one or more sensors at one or more stations of a facility performing the manufacturing process. There may be a plurality of kinds or types of sensors measuring a plurality of characteristics of the article being manufactured.
  • the plurality of sensors may include cameras that capture imagery data; audio equipment that capture audio data; and sensors that capture data related to characteristics of the article, such as dimensions, strength, roughness, or temperature of the article during the manufacturing process.
  • the manufacturing data 102 may be provided to the encoder 104 , which may generate an encoding of the manufacturing data 102 .
  • the manufacturing data 102 represents a history of measurements of articles captured during a manufacturing process of the articles.
  • the manufacturing data 102 may include a sequence of measurements M related to articles captured at different stations prior to the station at which the sensor measurement(s) 106 is/are captured.
  • M1 is captured at station S1, M2 is captured at station S2, and so on.
  • manufacturing data 102 may include data [{M1, M2, M3, . . . , Mi-1}, {S1, S2, . . . , Si-1}, Tx] for each type Tx of an article where {S1, S2, . . . , Si-1} is a sequence of stations in the order at which an article is measured, with S1 being the first station in the sequence and Si-1 being the last station in the sequence.
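The record layout above can be sketched in Python; the dataclass and field names below are illustrative stand-ins (only M, S, and Tx come from the text):

```python
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class ManufacturingRecord:
    measurements: List[Sequence[float]]  # M1 .. Mi-1, in capture order
    stations: List[str]                  # S1 .. Si-1, with S1 first
    article_type: str                    # Tx

# Fabricated example history for one article type.
record = ManufacturingRecord(
    measurements=[[10.1, 5.2], [10.0, 5.3], [9.9, 5.2]],  # M1, M2, M3
    stations=["S1", "S2", "S3"],
    article_type="Tx",
)
assert len(record.measurements) == len(record.stations)
```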
  • the encoder 104 may generate an encoding of time series data related to the article X from the manufacturing data 102 .
  • the encoder 104 may generate an encoding representing the data [{M1, M2, M3, . . . , Mi-1}, {S1, S2, . . . , Si-1}, Tx].
  • in some embodiments, an encoding generated by encoder 104 is a one-dimensional vector that is an encoding of [{M1, M2, M3, . . . , Mi-1}, {S1, S2, . . . , Si-1}, Tx] used to predict the measurements Mi of the article X at station Si.
  • the encoder 104 may generate an encoding that is a one-dimensional vector representing data including ({Mi, Si}) where Mi are one or more predicted measurements of the article X at station Si.
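As a concrete illustration of mapping a variable-length history to a one-dimensional vector, the toy encoder below uses summary statistics; a real encoder 104 might instead be a learned model (e.g., a recurrent network), and every name and number here is hypothetical:

```python
import numpy as np

def encode_history(measurements, stations, article_type_id, dim=8):
    """Toy stand-in for encoder 104: map a history
    [{M1..Mi-1}, {S1..Si-1}, Tx] to a fixed-length 1-D vector using
    summary statistics (purely illustrative, not the patented design)."""
    m = np.asarray(measurements, dtype=float)          # shape (i-1, d)
    stats = np.concatenate([m.mean(axis=0), m.std(axis=0), m[-1]])
    header = np.array([len(stations), article_type_id], dtype=float)
    vec = np.concatenate([header, stats])
    out = np.zeros(dim)                                # fixed output size
    out[:min(dim, vec.size)] = vec[:dim]
    return out

E = encode_history([[10.1, 5.2], [10.0, 5.3]], ["S1", "S2"], article_type_id=0)
assert E.ndim == 1 and E.shape == (8,)
```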
  • the one or more sensors capturing the sensor measurement(s) 106 at a particular station may include one or more sensors that are not included in the one or more sensors that capture the history of sensor measurements.
  • stations in the sequence of stations {S1, S2, . . . , Si-1} may not use one or more sensors that a station Si uses.
  • station Si may use a camera that captures a 3-dimensional image (i.e., a measurement) of an article and the sequence of stations {S1, S2, . . . , Si-1} may use only sensors that capture one-dimensional data (i.e., a single value) or two-dimensional data (e.g., a 2-dimensional image).
  • the aggregator 110 may receive a feature vector Fi of an article X of type Tx based on sensor measurement data 106 related to the article X captured at a station Si.
  • the feature vector Fi may be a one-dimensional vector including one or more values.
  • the aggregator 110 may receive an encoding Ei-1 of time series data representing a history of sensor measurements of articles of type Tx captured at stations prior to Si.
  • the encoding Ei-1 may be a one-dimensional vector as described above.
  • the aggregator 110 may aggregate the feature vector Fi and the encoding Ei-1 to generate an output to be included in an input to a classifier 122 included in the machine learning system 120.
  • the feature vector Fi and the encoding Ei-1 are both one-dimensional vectors and the aggregator 110 generates a one-dimensional vector [Fi, Ei-1] that is a concatenation of Fi and Ei-1.
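The concatenation performed by the aggregator 110 can be sketched as plain vector concatenation; the numeric values below are made up for illustration:

```python
import numpy as np

def aggregate(feature_vec, encoding):
    """Aggregator 110 as simple concatenation [Fi, Ei-1] (one possible
    aggregation scheme; others are conceivable)."""
    return np.concatenate([np.ravel(feature_vec), np.ravel(encoding)])

F_i = np.array([0.7, 1.2, 0.1])      # feature vector from extractor 108
E_prev = np.array([2.0, 0.05, 9.9])  # encoding of the prior-station history
x = aggregate(F_i, E_prev)           # classifier 122 input
assert x.shape == (6,)
```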
  • the classifier 122 may be trained using a supervised learning technique.
  • the classifier 122 comprises a neural network, such as a convolutional neural network, for example.
  • the classifier 122 comprises a support vector machine.
  • the classifier 122 is trained by the machine learning system 120 .
  • the machine learning system 120 may be trained using training data including a plurality of training data pairs.
  • the plurality of training data pairs may be generated by the machine learning system.
  • Each of the plurality of training data pairs may include an input to the classifier 122 and an output from the classifier 122 , wherein the output is a predetermined output that the classifier 122 is being trained to produce when the classifier 122 is applied to the input.
  • Each input in each of the plurality of training data pairs may include an aggregation of 1) a feature vector Fi of an article of manufacture of type Tx based on one or more measurements related to the article captured at a particular station Si of a manufacturing process and 2) an encoding Ei-1 of time series data representing a history of measurements of articles of type Tx captured at a sequence of stations of the manufacturing process prior to the station Si.
  • the machine learning system 120 may train the classifier 122 by iteratively adjusting parameters of the classifier 122 to reduce an error in the outputs of the classifier 122 calculated when the inputs of the plurality of training data pairs are input into the classifier 122. In some embodiments, the error is reduced by minimizing a loss function.
  • the parameters of the classifier 122 may be iteratively adjusted during the minimizing of a loss function.
  • the classifier 122 comprises a neural network, such as a convolutional neural network, and the error is reduced by performing a backpropagation algorithm on the neural network.
  • the classifier 122 is a support vector machine.
  • the support vector machine is trained by optimizing an objective function including a loss term and a regularizing or normalizing term.
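As a minimal sketch of the iterative parameter adjustment described above, the toy example below trains a logistic-regression stand-in for classifier 122 by gradient descent on synthetic training pairs. The disclosure's classifier may instead be a neural network trained by backpropagation or a support vector machine; all data and hyperparameters here are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated training pairs: each input is an aggregated [Fi, Ei-1]
# vector, each output is a predetermined label (0 = nominal, 1 = faulty).
X = rng.normal(size=(64, 6))
y = (X[:, 0] + X[:, 3] > 0).astype(float)

# Iteratively adjust parameters (w, b) to reduce the logistic loss.
w, b = np.zeros(6), 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient of the loss w.r.t. w
    b -= lr * float(np.mean(p - y))          # gradient of the loss w.r.t. b

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = float(np.mean((p > 0.5) == y))         # training accuracy after fitting
```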
  • the machine learning system 120 may include a trained classifier 122 .
  • the machine learning system 120 may receive an aggregated pair related to an article, such as the aggregated pair [Fi, Ei-1] disclosed above.
  • the machine learning system 120 may provide the aggregated pair as input to the trained classifier 122 to generate an output that is a predicted class of the article.
  • the machine learning system 120 may output the classification generated by the trained classifier 122 as the predicted class 130.
  • the classifier 122 may generate a predicted class that is a single value.
  • the predicted class may be a positive number indicating a predicted time until the article fails.
  • the classifier 122 may be a binary classifier that generates a predicted class that is a member of the set {0, 1}, indicating whether the article is faulty or anomalous, for example.
  • the classifier 122 may generate a predicted class that indicates a type of anomaly or failure of the article.
  • the predicted class may be a member of the set {0, a1, a2, a3, . . . , an} where 0, a1, a2, a3, . . . , an are each a type of anomaly or failure of the article.
  • the predicted class output by the classifier 122 may be used by a controller in the manufacturing process of the article.
  • the machine learning system 120 may also be the controller.
  • a controller at a station S i in the manufacturing process may receive the predicted class and determine that the article is a failure (e.g., is defective) or an anomaly and direct the manufacturing process to automatically send the article to a station S i+1 to allow a human inspector to personally inspect the article.
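A hypothetical controller policy matching this example might look like the following; the station naming scheme and return format are assumptions for illustration, not part of the disclosure:

```python
def route_article(predicted_class, station_index):
    """Hypothetical controller at station Si: route a faulty/anomalous
    article (class 1) to station S(i+1) for human inspection; route a
    nominal article (class 0) onward through the normal flow."""
    next_station = f"S{station_index + 1}"
    if predicted_class == 1:
        return {"next_station": next_station, "action": "human_inspection"}
    return {"next_station": next_station, "action": "continue"}

decision = route_article(predicted_class=1, station_index=3)
assert decision["action"] == "human_inspection"
```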
  • FIG. 2 shows a block diagram of an example embodiment of a general computer system 200 .
  • the computer system 200 can include a set of instructions that can be executed to cause the computer system 200 to perform any one or more of the methods or computer-based functions disclosed herein.
  • the computer system 200 may include executable instructions to perform the function of encoder 104 , feature extractor 108 , aggregator 110 , machine learning system 120 , and classifier 122 .
  • the computer system 200 may be connected to other computer systems or peripheral devices via a network. Additionally, the computer system 200 may include or be included within other computing devices.
  • the computer system 200 may include one or more processors 202 .
  • the one or more processors 202 may include, for example, one or more central processing units (CPUs), one or more graphics processing units (GPUs), or both.
  • the computer system 200 may include a main memory 204 and a static memory 206 that can communicate with each other via a bus 208 .
  • the computer system 200 may further include a video display unit 210 , such as a liquid crystal display (LCD), a projection television display, a flat panel display, a plasma display, or a solid-state display.
  • the disk drive unit 216 may include one or more computer-readable media 222 in which one or more sets of instructions 224 , e.g., software, may be embedded.
  • the instructions 224 may embody one or more of the methods or functionalities, such as the methods or functionalities disclosed herein.
  • the instructions 224 may reside completely, or at least partially, within the main memory 204 , the static memory 206 , and/or within the processor 202 during execution by the computer system 200 .
  • the main memory 204 and the processor 202 also may include computer-readable media.
  • dedicated hardware implementations such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods or functionalities described herein.
  • Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems.
  • One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the system 200 may encompass software, firmware, and hardware implementations, or combinations thereof.
  • While the computer-readable medium is shown to be a single medium, the term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions.
  • the term “computer-readable medium” shall also include any medium that is capable of storing or encoding a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or functionalities disclosed herein.
  • the computer-readable media will be non-transitory media.
  • the computer-readable medium can include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories.
  • the computer-readable medium can be a random access memory or other volatile re-writable memory.
  • the computer-readable medium can include a magneto-optical or optical medium, such as a disk or tape, or another storage device, to capture carrier wave signals such as a signal communicated over a transmission medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

Methods and systems for classifying an article of manufacture are disclosed. A classifier is trained with training data including 1) a feature vector related to the article based on measurements related to the article captured at a particular station of a manufacturing process and 2) encoded time series data representing a history of measurements of articles of the same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the particular station.

Description

    TECHNICAL FIELD
  • The present disclosure relates to methods and systems for training a classifier to classify articles of manufacture. The trained classifier may be used to monitor articles of manufacture during a process of manufacturing the article.
  • BACKGROUND
  • During a process of manufacturing an article, measurements may be taken of the article being manufactured. The article may be an article of manufacture. The article may be a product being manufactured or a part of a product being manufactured. These measurements may be taken by one or more sensors at one or more stations of a facility performing the manufacturing process. There may be a plurality of kinds or types of sensors measuring a plurality of characteristics of the article being manufactured. For example, the plurality of sensors may include cameras that capture imagery data; audio equipment that capture audio data; and sensors that capture data related to dimensions, strength, roughness, or temperature of the article during the manufacturing process. One or more measurements captured at each station may be used, for example, to monitor the quality of an article during a manufacturing process. One or more measurements related to an article captured at a station in a sequence of stations of a manufacturing process may be input into a classifier to produce an output. The output may be a classification of the article that is indicative of whether an article is faulty or otherwise anomalous, for example.
  • SUMMARY
  • The output of the classifier in prior proposals is based only on the measurement(s) obtained from one or more sensors at a station of a manufacturing process. That is, the classifier does not consider the history of measurement data captured by one or more sensors at one or more stations of the manufacturing process prior to the station.
  • In one or more embodiments, the present disclosure describes classifying an article by applying a classifier to an input comprising an aggregation of a feature vector of an article with an encoding of time series data representing a history of measurements of articles of the same type as the article. The feature vector of the article may be extracted from measurement data captured at a station (e.g., location) in the manufacturing process. The time series data may comprise measurements captured at a sequence of prior stations of the manufacturing process. The present disclosure includes a description of embodiments related to manufacturing processes.
  • Some embodiments of methods for classifying an article of manufacture disclosed herein comprise receiving measurements related to an article of manufacture, the measurements being captured at a station in a manufacturing process; applying a feature extractor to the received measurements to generate a feature vector of the article; aggregating the feature vector of the article with encoded time series data representing a history of measurements of articles of the same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the station to generate an input to a classifier; and applying the classifier to the input to produce a classification of the article of manufacture.
  • Some embodiments of systems for classifying an article of manufacture disclosed herein comprise one or more processors; and one or more non-transitory memories communicatively connected to the one or more processors, the one or more memories including computer-executable instructions that when executed cause the system to perform one or more of the methods disclosed herein.
  • Some embodiments of methods for training a classifier to classify articles of manufacture disclosed herein comprise generating training data for the classifier, the training data including a plurality of training data pairs, wherein each of the plurality of training data pairs includes an input to the classifier and a predetermined output that the classifier is being trained to produce when the classifier is applied to the input, and wherein each input in the plurality of inputs in the plurality of training data pairs includes an aggregation of: a feature vector of an article of manufacture based on one or more measurements related to the article captured at a station of a manufacturing process; and an encoding of time series data representing a history of measurements of articles of the same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the station; and iteratively adjusting parameters of the classifier by reducing an error in the outputs of the classifier generated when the classifier is applied to each of the inputs in the training data pairs.
  • Some embodiments of systems for training a classifier disclosed herein comprise one or more processors; and one or more non-transitory memories communicatively connected to the one or more processors, the one or more memories including computer-executable instructions that when executed cause the system to perform one or more of the methods disclosed herein.
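The method steps summarized above (receive measurements, extract features, aggregate with the encoded history, classify) compose as sketched below; the three stand-in functions and all numeric values are illustrative assumptions, not the disclosed implementations:

```python
import numpy as np

def extract_features(measurements):
    """Stand-in for the feature extractor: normalize raw measurements."""
    m = np.ravel(np.asarray(measurements, dtype=float))
    return m / (np.linalg.norm(m) + 1e-9)

def encode_history(history):
    """Stand-in for the time-series encoder: per-dimension mean plus the
    most recent prior-station measurement."""
    h = np.asarray(history, dtype=float)
    return np.concatenate([h.mean(axis=0), h[-1]])

def classify(x, w):
    """Stand-in for the trained classifier: a fixed linear rule."""
    return int(x @ w > 0)

measurements = [10.1, 5.2, 0.3]                  # captured at station Si
history = [[10.0, 5.1, 0.3], [10.2, 5.3, 0.2]]   # prior stations S1..Si-1
x = np.concatenate([extract_features(measurements), encode_history(history)])
label = classify(x, np.ones_like(x))             # e.g., 0 = nominal, 1 = faulty
assert label in (0, 1)
```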
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 discloses an example system for classifying an article of manufacture in accordance with embodiments disclosed herein.
  • FIG. 2 discloses an example computing system for performing embodiments of methods disclosed herein.
  • DETAILED DESCRIPTION
  • Embodiments of the present disclosure are described herein. It is to be understood, however, that the disclosed embodiments are merely examples and other embodiments may take various and alternative forms. The figures are not necessarily to scale; some features could be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present embodiments. As those of ordinary skill in the art will understand, various features illustrated and described with reference to any one of the figures may be combined with features illustrated in one or more other figures to produce embodiments that are not explicitly illustrated or described. The combinations of features illustrated provide representative embodiments for typical applications. Various combinations and modifications of the features consistent with the teachings of this disclosure, however, could be desired for particular applications or implementations.
  • FIG. 1 discloses an example system for classifying an article of manufacture in accordance with embodiments disclosed herein. FIG. 1 discloses a feature extractor 108 that may receive one or more sensor measurements 106 related to the article of manufacture. In some embodiments, the sensor measurements 106 are measurements captured by one or more sensors at a particular station of a manufacturing process. Each of the one or more sensors may be, for example, a camera, an acoustic sensor, a pressure sensor, an ultrasound sensor, or spectroscopy equipment. The sensor measurement(s) 106 may vary depending on the particular embodiment being implemented. In some embodiments, the sensor measurement(s) 106 may be a single measurement, such as a dimension (e.g., length, height, width, weight, temperature, sound volume, pressure) related to the article. When the sensor measurement(s) 106 comprise a single measurement, the single measurement may be input to the feature extractor 108 to produce an output of the feature extractor 108. In some embodiments, the sensor measurement(s) 106 may be a plurality of measurements. For example, the sensor measurement(s) 106 may include a plurality of dimensions related to the article. In some embodiments, the sensor measurement(s) 106 may include one or more two-dimensional values, such as images.
  • When the sensor measurement(s) 106 include a plurality of values captured by a plurality of sensors at a particular time, the plurality of measurements may be aggregated to generate an input to the feature extractor 108. The aggregation of the plurality of measurements may include a concatenation of the plurality of measurements. For example, if the plurality of measurements includes a height H, a width W, and a length L, the measurements may be concatenated to generate an input comprising a one-dimensional vector having three elements (e.g., H, W, L). In some embodiments, sensor fusion techniques may be used to aggregate a plurality of sensor measurements into an input to the feature extractor 108. In some embodiments, the feature extractor 108 itself may apply sensor fusion techniques to aggregate a plurality of sensor measurements. The feature extractor 108 may generate a feature vector of the article that may be input to the aggregator 110. A feature vector generated by the feature extractor 108 may be a one-dimensional vector having a plurality of elements with each element being a feature of the article. For example, the feature vector may be of the form [f1, f2, f3, . . . fn] representing n features of the article.
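The concatenation described above can be sketched in a few lines of Python. This is a minimal illustration only; the function name `aggregate_measurements` and the sample values are hypothetical, not part of the disclosure:

```python
def aggregate_measurements(*measurements):
    """Concatenate scalar or sequence-valued sensor measurements into a
    single one-dimensional input vector for the feature extractor."""
    vector = []
    for m in measurements:
        if isinstance(m, (int, float)):
            # A scalar measurement contributes one element.
            vector.append(float(m))
        else:
            # A sequence (e.g., a flattened image) contributes all elements.
            vector.extend(float(v) for v in m)
    return vector

# A height H, width W, and length L concatenated into [H, W, L]:
input_vector = aggregate_measurements(10.0, 4.0, 25.0)
```

In a fused-sensor variant, one of the arguments could itself be a flattened two-dimensional measurement, which the same concatenation handles.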
  • In some embodiments, the feature extractor 108 may comprise an algorithm that receives a single sensor measurement as input and produces a feature vector comprising a single feature of the article that may be input to the aggregator 110. In some embodiments, the feature extractor 108 may be a trained machine learning model that is trained to receive an input representing the sensor measurement(s) 106 and to produce a feature vector of the article that may be input to the aggregator 110. For example, in some embodiments, the feature extractor 108 may be a neural network that may include one or more convolutional kernels or attention maps. In some embodiments, the feature extractor 108 may be a support vector machine.
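As one hedged illustration of a mapping from a measurement vector to a feature vector [f1, . . . , fn], the sketch below uses a single linear layer with invented weights; the disclosure itself contemplates trained models such as neural networks or support vector machines, for which this pure-Python class is only a stand-in:

```python
class LinearFeatureExtractor:
    """Toy stand-in for feature extractor 108: a single linear layer
    mapping an input measurement vector to a feature vector."""

    def __init__(self, weights, biases):
        self.weights = weights  # one row of weights per output feature
        self.biases = biases    # one bias per output feature

    def extract(self, x):
        return [sum(w * v for w, v in zip(row, x)) + b
                for row, b in zip(self.weights, self.biases)]

# Two features computed from a three-element measurement vector
# (weights and inputs are arbitrary example values):
fe = LinearFeatureExtractor(weights=[[1.0, 0.0, 0.0], [0.0, 0.5, 0.5]],
                            biases=[0.0, 1.0])
features = fe.extract([10.0, 4.0, 25.0])  # → [10.0, 15.5]
```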
  • In some embodiments, the feature extractor 108 may generate a feature vector that is a single value. For example, the generated feature vector may be a positive number indicating a predicted time until the article fails. In some embodiments, the feature extractor 108 may generate a feature vector that is a member of the set {0, 1}, indicating whether the article is faulty or anomalous, for example. In some embodiments, the feature extractor 108 may generate a feature vector that indicates a type of anomaly or failure. For example, the generated feature vector may be a member of the set {0, a1, a2, a3 . . . , an} where 0, a1, a2, a3 . . . , an are each a type of anomaly or failure of the article. In some embodiments, the feature vector may include a plurality of values, with each of the plurality of values being a member of the set {0, 1} indicating whether a particular feature is present.
  • In some embodiments, the aggregator 110 receives a feature vector of an article that is generated by the feature extractor 108 and receives an encoding of time series data from the encoder 104. In FIG. 1, the manufacturing data 102 are measurements that may be taken by one or more sensors at one or more stations of a facility performing the manufacturing process. There may be a plurality of kinds or types of sensors measuring a plurality of characteristics of the article being manufactured. For example, the plurality of sensors may include cameras that capture imagery data; audio equipment that captures audio data; and sensors that capture data related to characteristics of the article, such as dimensions, strength, roughness, or temperature of the article during the manufacturing process. The manufacturing data 102 may be provided to the encoder 104, which may generate an encoding of the manufacturing data 102.
  • The manufacturing data 102 represents a history of measurements of articles captured during a manufacturing process of the articles. The manufacturing data 102 may include a sequence of measurements M related to articles captured at different stations prior to the station at which the sensor measurement(s) 106 is/are captured. For example, measurements Mi−1 may be represented as Mi−1={m1, m2, m3, . . . , mn} with each measurement mi being captured at a station Si−1 at a time when the article being measured was at Si−1. Thus, for a particular article X, M1 is captured at station S1, M2 is captured at station S2, . . . , and Mi−1 is captured at station Si−1. Accordingly, manufacturing data 102 may include data [{M1, M2, M3, . . . , Mi−1}, {S1, S2, . . . , Si−1}, Tx] for each type Tx of an article, where {S1, S2, . . . , Si−1} is a sequence of stations in the order at which an article is measured, with S1 being the first station in the sequence and Si−1 being the last station in the sequence.
  • For each feature vector F generated by the feature extractor 108 based on sensor measurements 106 related to an article X captured at a station Si, the encoder 104 may generate an encoding of time series data related to the article X from the manufacturing data 102. For a feature vector F generated by the feature extractor 108 based on sensor measurement(s) 106 related to articles of type Tx at station Si, the encoder 104 may generate an encoding representing the data ({M1, M2, M3, . . . , Mi−1}, {S1, S2, . . . , Si−1}, and Tx) with each of the sensor measurements Mn being captured at a station Sn when an article of type Tx was at station Sn and with Tx being the type of the article X. In some embodiments, an encoding generated by the encoder 104 is a one-dimensional vector that is an encoding of [{M1, M2, M3, . . . , Mi−1}, {S1, S2, . . . , Si−1}, Tx] to predict the measurements Mi of the article X at station Si. In some embodiments, the encoder 104 may generate an encoding that is a one-dimensional vector representing data including ({Mi, Si}) where Mi are one or more predicted measurements of the article X at station Si.
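One minimal way to form such a one-dimensional encoding, shown here only as an assumed sketch (a learned encoder would typically replace this simple flattening), is to serialize the measurement history, the station sequence, and the article type into a single vector:

```python
def encode_history(measurement_history, stations, article_type_id):
    """Toy stand-in for encoder 104: flatten the prior measurements
    {M1, ..., M(i-1)}, the station sequence {S1, ..., S(i-1)}, and the
    article type Tx into a single one-dimensional encoding vector."""
    encoding = []
    for m in measurement_history:   # each M is a list of measurement values
        encoding.extend(float(v) for v in m)
    encoding.extend(float(s) for s in stations)
    encoding.append(float(article_type_id))
    return encoding

# Two prior stations with two measurements each, article type 7:
e = encode_history([[1.0, 2.0], [3.0, 4.0]], [1, 2], 7)
# → [1.0, 2.0, 3.0, 4.0, 1.0, 2.0, 7.0]
```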
  • The one or more sensors capturing the sensor measurement(s) 106 at a particular station may include one or more sensors that are not included in the one or more sensors that capture the history of sensor measurements. In some embodiments, stations in the sequence of stations {S1, S2, . . . , Si−1} may not use one or more sensors that a station Si uses. For example, station Si may use a camera that captures a 3-dimensional image (i.e., a measurement) of an article and the sequence of stations {S1, S2, . . . , Si−1} may use only sensors that capture one-dimensional data (i.e., a single value) or two-dimensional data (e.g., a 2-dimensional image).
  • The aggregator 110 may receive a feature vector Fi of an article X of type Tx based on sensor measurement data 106 related to the article X captured at a station Si. As disclosed above, the feature vector Fi may be a one-dimensional vector including one or more values. The aggregator 110 may receive an encoding Ei−1 of time series data representing a history of sensor measurements of articles of type Tx captured at stations prior to Si. In some embodiments, the encoding Ei−1 may be a one-dimensional vector as described above. The aggregator 110 may aggregate the feature vector Fi and the encoding Ei−1 to generate an output to be included in an input to a classifier 122 included in the machine learning system 120. In some embodiments, the feature vector Fi and the encoding Ei−1 are both one-dimensional vectors and the aggregator 110 generates a one-dimensional vector [Fi, Ei−1] that is a concatenation of Fi and Ei−1.
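In the one-dimensional case, the concatenation [Fi, Ei−1] performed by the aggregator 110 reduces to a simple join of the two vectors. The sketch below is illustrative only (the function name and sample values are invented):

```python
def aggregate(feature_vector, encoding):
    """Aggregator 110 as plain concatenation: the feature vector Fi
    followed by the encoding E(i-1), yielding the classifier input."""
    return list(feature_vector) + list(encoding)

classifier_input = aggregate([0.2, 0.8], [1.0, 2.0, 7.0])
# → [0.2, 0.8, 1.0, 2.0, 7.0]
```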
  • The classifier 122 may be trained using a supervised learning technique. In some embodiments, the classifier 122 comprises a neural network, such as a convolutional neural network, for example. In some embodiments, the classifier 122 comprises a support vector machine.
  • In some embodiments, the classifier 122 is trained by the machine learning system 120. The classifier 122 may be trained using training data including a plurality of training data pairs. The plurality of training data pairs may be generated by the machine learning system 120. Each of the plurality of training data pairs may include an input to the classifier 122 and an output from the classifier 122, wherein the output is a predetermined output that the classifier 122 is being trained to produce when the classifier 122 is applied to the input. Each input in each of the plurality of training data pairs may include an aggregation of 1) a feature vector Fi of an article of manufacture of type Tx based on one or more measurements related to the article captured at a particular station Si of a manufacturing process and 2) an encoding Ei−1 of time series data representing a history of measurements of articles of type Tx captured at a sequence of stations of the manufacturing process prior to the station Si. The machine learning system 120 may train the classifier 122 by iteratively adjusting parameters of the classifier 122 to reduce an error in the outputs of the classifier 122 calculated when the inputs of the plurality of training data pairs are input into the classifier 122. In some embodiments, the error is reduced by minimizing a loss function. For example, the parameters of the classifier 122 may be iteratively adjusted during the minimizing of a loss function. In some embodiments, the classifier 122 comprises a neural network, such as a convolutional neural network, and the error is reduced by performing a backpropagation algorithm on the neural network. In some embodiments, the classifier 122 is a support vector machine. In some embodiments, the support vector machine is trained by optimizing an objective function including a loss term and a regularizing or normalizing term.
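The training procedure just described, iteratively adjusting parameters to reduce a loss over (input, predetermined output) pairs, can be illustrated with a deliberately small sketch. This is not the disclosed classifier 122: it substitutes a plain linear model trained by stochastic gradient descent on a squared-error loss, with invented training data:

```python
def train_classifier(pairs, lr=0.1, epochs=200):
    """Iteratively adjust the parameters (w, b) of a linear classifier
    to reduce a squared-error loss over (input, target) training pairs.
    Targets are 0 or 1 (e.g., nominal vs. faulty)."""
    n = len(pairs[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in pairs:
            pred = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = pred - target  # gradient of 0.5 * err**2 w.r.t. pred
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Invented, separable training pairs: the second input element
# determines the class.
pairs = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
         ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]
w, b = train_classifier(pairs)

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0.5 else 0
```

A neural-network or support-vector-machine classifier would replace the linear model and loss, but the loop structure (forward pass, error, parameter update) is the same idea the paragraph describes.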
  • The machine learning system 120 may include a trained classifier 122. In some embodiments, the machine learning system 120 may receive an aggregated pair related to an article, such as the aggregated pair [Fi, Ei−1] disclosed above. The machine learning system 120 may provide the aggregated pair as input to the trained classifier 122 to generate an output that is a predicted class of the article. In some embodiments, the machine learning system 120 may output this predicted class as the predicted class 130.
  • In some embodiments, the classifier 122 may generate a predicted class that is a single value. For example, the predicted class may be a positive number indicating a predicted time until the article fails. In some embodiments, the classifier 122 may be a binary classifier that generates a predicted class that is a member of the set {0, 1}, indicating whether the article is faulty or anomalous, for example. In some embodiments, the classifier 122 may generate a predicted class that indicates a type of anomaly or failure of the article. For example, the predicted class may be a member of the set {0, a1, a2, a3 . . . , an} where 0, a1, a2, a3 . . . , an are each a type of anomaly or failure of the article.
  • In some embodiments, the predicted class output by the classifier 122 may be used by a controller in the manufacturing process of the article. In some embodiments, the machine learning system 120 may also be the controller. For example, a controller at a station Si in the manufacturing process may receive the predicted class and determine that the article is a failure (e.g., is defective) or an anomaly and direct the manufacturing process to automatically send the article to a station Si+1 to allow a human inspector to personally inspect the article.
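A controller decision of this kind can be sketched as a lookup from the predicted class to an action. The anomaly codes and actions below are hypothetical, since the disclosure specifies only that the class may be a member of a set {0, a1, . . . , an}; class 0 is assumed here to mean no anomaly:

```python
# Hypothetical mapping from predicted class to controller action.
ACTIONS = {0: "continue", 1: "manual_inspection", 2: "scrap"}

def route_article(predicted_class, next_station):
    """Return the station the article is sent to and the action taken
    there; unknown anomaly codes default to manual inspection."""
    action = ACTIONS.get(predicted_class, "manual_inspection")
    return (next_station, action)

station, action = route_article(1, next_station=5)
# → (5, "manual_inspection")
```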
  • FIG. 2 shows a block diagram of an example embodiment of a general computer system 200. The computer system 200 can include a set of instructions that can be executed to cause the computer system 200 to perform any one or more of the methods or computer-based functions disclosed herein. For example, the computer system 200 may include executable instructions to perform the function of encoder 104, feature extractor 108, aggregator 110, machine learning system 120, and classifier 122. The computer system 200 may be connected to other computer systems or peripheral devices via a network. Additionally, the computer system 200 may include or be included within other computing devices.
  • As illustrated in FIG. 2, the computer system 200 may include one or more processors 202. The one or more processors 202 may include, for example, one or more central processing units (CPUs), one or more graphics processing units (GPUs), or both. The computer system 200 may include a main memory 204 and a static memory 206 that can communicate with each other via a bus 208. As shown, the computer system 200 may further include a video display unit 210, such as a liquid crystal display (LCD), a projection television display, a flat panel display, a plasma display, or a solid-state display. Additionally, the computer system 200 may include an input device 212, such as a remote-control device having a wireless keypad, a keyboard, a microphone coupled to a speech recognition engine, a camera such as a video camera or still camera, or a cursor control device 214, such as a mouse device. The computer system 200 may also include a disk drive unit 216, a signal generation device 218, such as a speaker, and a network interface device 220. The network interface device 220 may enable the computer system 200 to communicate with other systems via a network 228. For example, the network interface device 220 may enable the machine learning system 120 to communicate with a database server (not shown) or a controller in a manufacturing system (not shown).
  • In some embodiments, as depicted in FIG. 2 , the disk drive unit 216 may include one or more computer-readable media 222 in which one or more sets of instructions 224, e.g., software, may be embedded. For example, the instructions 224 may embody one or more of the methods or functionalities, such as the methods or functionalities disclosed herein. In a particular embodiment, the instructions 224 may reside completely, or at least partially, within the main memory 204, the static memory 206, and/or within the processor 202 during execution by the computer system 200. The main memory 204 and the processor 202 also may include computer-readable media.
  • In some embodiments, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods or functionalities described herein. Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the system 200 may encompass software, firmware, and hardware implementations, or combinations thereof.
  • While the computer-readable medium is shown to be a single medium, the term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term “computer-readable medium” shall also include any medium that is capable of storing or encoding a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or functionalities disclosed herein.
  • In some embodiments, some or all of the computer-readable media will be non-transitory media. In a particular non-limiting, exemplary embodiment, the computer-readable medium can include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories. Further, the computer-readable medium can be a random access memory or other volatile re-writable memory. Additionally, the computer-readable medium can include a magneto-optical or optical medium, such as a disk or tape, or other storage device to capture carrier wave signals such as a signal communicated over a transmission medium.
  • While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms encompassed by the claims. The words used in the specification are words of description rather than limitation, and it is understood that various changes can be made without departing from the spirit and scope of the disclosure. As previously described, the features of various embodiments can be combined to form further embodiments of the invention that may not be explicitly described or illustrated. While various embodiments could have been described as providing advantages or being preferred over other embodiments or prior art implementations with respect to one or more desired characteristics, those of ordinary skill in the art recognize that one or more features or characteristics can be compromised to achieve desired overall system attributes, which depend on the specific application and implementation. These attributes can include, but are not limited to, strength, durability, marketability, appearance, packaging, size, serviceability, weight, manufacturability, ease of assembly, etc. As such, embodiments described as less desirable than other embodiments or prior art implementations with respect to one or more characteristics are not outside the scope of the disclosure and can be desirable for particular applications.

Claims (20)

What is claimed is:
1. A method for classifying an article of manufacture, comprising:
receiving measurements related to an article of manufacture, the measurements being captured at a first station in a manufacturing process;
applying a feature extractor to the received measurements to generate a feature vector of the article;
aggregating the feature vector of the article with encoded time series data representing a history of measurements of articles of a same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the first station to generate an input to a classifier; and
applying the classifier to the input to produce a predicted class of the article of manufacture.
2. A method according to claim 1, wherein the one or more measurements captured at the first station are captured by one or more sensors at the first station and wherein measurements in the history of measurements are captured by one or more sensors at each of the stations in the sequence of stations.
3. A method according to claim 1, wherein the classifier is a neural network.
4. A method according to claim 3, wherein the neural network is a convolutional neural network.
5. A method according to claim 1, wherein the classifier is a support vector machine.
6. A method according to claim 1, wherein the encoded time series data includes one or more predicted measurements of the article at the first station.
7. A system for classifying an article of manufacture, comprising:
one or more processors; and
one or more non-transitory memories communicatively connected to the one or more processors, the one or more memories including computer-executable instructions that when executed cause the system to perform the following functions:
receiving measurements related to an article of manufacture, the measurements being captured at a first station in a manufacturing process;
applying a feature extractor to the received measurements to generate a feature vector of the article;
aggregating the feature vector of the article with encoded time series data representing a history of measurements of articles of a same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the first station to generate an input to a classifier; and
applying the classifier to the input to produce a predicted class of the article of manufacture.
8. A system according to claim 7, further comprising:
one or more sensors that produce the sensor measurement data;
a feature extractor that outputs a feature vector of the article when the feature extractor is applied to the received measurements;
an aggregator that aggregates the feature vector with the encoded time series data to generate input data; and
a classifier that outputs the predicted class when the classifier is applied to the input data.
9. A method for training a classifier to classify articles of manufacture, comprising:
generating training data for the classifier, the training data including a plurality of training data pairs, wherein each of the plurality of training data pairs includes an input to the classifier and a predetermined output that the classifier is being trained to produce when the classifier is applied to the input, and wherein each input in the plurality of training data pairs includes an aggregation of:
a feature vector of an article of manufacture based on one or more measurements related to the article captured at a first station of a manufacturing process; and
encoded time series data representing a history of measurements of articles of a same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the first station; and
iteratively adjusting parameters of the classifier by reducing an error in the outputs of the classifier generated when the classifier is applied to each of the inputs in the training data pairs.
10. A method according to claim 9, wherein the one or more measurements captured at the first station are captured by one or more sensors at the first station and wherein measurements in the history of measurements are captured by one or more sensors at each of the stations in the sequence of stations.
11. A method according to claim 10, wherein the one or more sensors capturing the measurements at the first station include one or more sensors that are not included in the sensors that capture the history of measurements at the stations in the sequence of stations.
12. A method according to claim 9, wherein the feature extractor is a neural network.
13. A method according to claim 12, wherein the neural network is a convolutional neural network.
14. A method according to claim 9, wherein the feature extractor is a support vector machine.
15. A method according to claim 9, wherein the feature vector is a one-dimensional vector including a plurality of elements.
16. A method according to claim 9, wherein the encoded time series data includes one or more predicted measurements of the article at the first station.
17. A method according to claim 9, wherein the classifier is a neural network.
18. A method according to claim 17, wherein the neural network is a convolutional neural network.
19. A method according to claim 9, wherein the classifier is a support vector machine.
20. A system for training a classifier, comprising:
one or more processors; and
one or more non-transitory memories communicatively connected to the one or more processors, the one or more memories including computer-executable instructions that when executed cause the following functions to be performed:
generating training data for the classifier, the training data including a plurality of training data pairs, wherein each of the plurality of training data pairs includes an input to the classifier and a predetermined output that the classifier is being trained to produce when the classifier is applied to the input, and wherein each input in the plurality of training data pairs includes an aggregation of:
a feature vector of an article of manufacture based on one or more measurements related to the article captured at a first station of a manufacturing process; and
encoded time series data representing a history of measurements of articles of a same type as the article of manufacture captured at a sequence of stations of the manufacturing process prior to the first station; and
iteratively adjusting parameters of the classifier by reducing an error in the outputs of the classifier generated when the classifier is applied to each of the inputs in the training data pairs.
US18/072,220 2022-11-30 2022-11-30 Industrial quality monitoring system with pre-trained feature extraction Pending US20240176337A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US18/072,220 US20240176337A1 (en) 2022-11-30 2022-11-30 Industrial quality monitoring system with pre-trained feature extraction
DE102023211505.0A DE102023211505A1 (en) 2022-11-30 2023-11-20 Industrial quality monitoring system with pre-trained feature extraction
CN202311611385.0A CN118112995A (en) 2022-11-30 2023-11-29 Industrial quality monitoring system with pre-training feature extraction


Publications (1)

Publication Number Publication Date
US20240176337A1 true US20240176337A1 (en) 2024-05-30

Family

ID=91078859



Also Published As

Publication number Publication date
DE102023211505A1 (en) 2024-06-06
CN118112995A (en) 2024-05-31


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: ROBERT BOSCH GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CONDESSA, FILIPE;WILLMOTT, DEVIN;BATALOV, IVAN;AND OTHERS;SIGNING DATES FROM 20230105 TO 20230212;REEL/FRAME:063819/0774