WO2020110129A1 - Centralized analytics of multiple visual inspection appliances - Google Patents

Centralized analytics of multiple visual inspection appliances

Info

Publication number
WO2020110129A1
WO2020110129A1 (PCT/IL2019/051320)
Authority
WO
WIPO (PCT)
Prior art keywords
dcas
defects
analysis
vias
data
Prior art date
Application number
PCT/IL2019/051320
Other languages
English (en)
Inventor
Yonatan HYATT
Harel BOREN
Original Assignee
Inspekto A.M.V Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from IL263399A external-priority patent/IL263399B/en
Application filed by Inspekto A.M.V Ltd filed Critical Inspekto A.M.V Ltd
Priority to US17/297,572 (US20210398267A1)
Priority to DE112019005951.3T (DE112019005951T5)
Publication of WO2020110129A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4183Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32181Monitor production, assembly apparatus with multiple sensors
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2223/00Indexing scheme associated with group G05B23/00
    • G05B2223/02Indirect monitoring, e.g. monitoring production to detect faults of a system

Definitions

  • the present invention relates to visual inspection of items on a production line and more specifically to collection and analysis of data from multiple visual production line inspection appliances.
  • Inspection during production processes helps control the quality of products by identifying defects and then acting upon this detection, for example, by fixing the defect or discarding the defective part.
  • the process of defect detection is essential for quality assurance (QA), gating, and sorting on production lines, and is consequently useful in improving productivity, improving production processes and working procedures, reducing defect rates, and reducing re-work and waste.
  • Automated visual inspection methods are used in production lines to identify visually detectable anomalies that may have a functional or aesthetic impact on a manufactured part. Due to the underlying technologies that drive them, current visual inspection solutions for production lines are: (1) typically highly customized to a particular product and the particular QA, gating, or sorting task that is addressed; (2) very expensive; (3) very time consuming to set up; and (4) dependent on expert selection and integration of hardware, cameras, lighting and software components, and on expert maintenance of these throughout the lifetime of the inspection solution and the production line.
  • Embodiments of the invention overcome the drawbacks of the prior art by deploying simplified visual inspection systems that enable gathering of inspection data and determination of trends in the production plant.
  • Embodiments of the invention provide multiple automated visual inspection appliances (VIA) for a production plant and a centralized data collection and analytics server (DCAS) that gathers and analyzes data from the VIAs.
  • VIA automated visual inspection appliances
  • DCAS data collection and analytics server
  • the DCAS can then provide reports, dashboards and alerts to determine production trends in the manufacturing plant and thus improve the quality and productivity of the plant.
  • Each VIA can be easily and quickly installed for inspection without significant tailored integration.
  • the ease of setup and operation is enabled by a combination of machine learning and computer vision algorithms that dynamically adapt to assess the item to be inspected, the target area of inspection, and the characteristics of the surrounding environment affecting the inspection setup.
  • Each VIA comprises a flexible mounting assembly, a camera assembly which comprises an inspection camera and lighting source, and a controller wherein the inspection camera and lighting source are both connected to and controlled by the controller.
  • the VIAs are in wireless or wired communication with the DCAS.
  • defect-free samples of items to be inspected are first processed in a setup stage where the controller learns parameters of the items as captured in images by the camera assembly. In some embodiments no database of defects is used and only defect-free items are analyzed during the setup stage. Items to be inspected preferably comprise any item type, shape or material, set in any lighting environment. In the inspection stage, inspected items (manufactured items that are to be inspected for inspection tasks, such as defect detection, gating or sorting purposes) are imaged and the image data collected by the camera from each inspected item is processed by the controller.
  • the controller uses machine learning algorithms which may provide human-level analysis of defects in inspection images, preferably even with differing illumination conditions, different reflections, shading, varying location, shape tolerances, etc. This inspection data collected from the VIAs is sent to the DCAS for analysis.
  • a visual inspection data collection and analysis system comprises: a plurality of visual inspection appliances (VIA) configured to inspect and acquire visual inspection data relating to inspected items; and a data collection and analytics server (DCAS) configured to receive information comprising the visual inspection data from the multiple VIAs and to analyze the received information to form big data analysis.
  • VIAs are adapted for detecting defects or gating or counting the inspected items without the involvement of the DCAS.
  • the inspected items are different types of items.
  • the big data analysis comprises a combination of information related to different types of inspection items.
  • the DCAS further comprises a display and wherein the DCAS outputs the analysis to the display.
  • the acquired inspection data from each one of the multiple VIAs is selected from the group consisting of: image/s of the inspected item; record of decision by VIA whether an item has a defect; images of the defects; number of defects; records of deviations from good item samples which are not significant enough to be reported as defects but may indicate issues in the production line; item unique ID; plant work/job order; batch ID; personnel in charge of the production line or station; production tool ID; part name; part serial number; and a combination of the above.
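  • As a rough illustration only, the per-item data listed above could be packaged by a VIA into a single record before transmission to the DCAS. The following minimal Python sketch uses field names and types that are assumptions for illustration and are not defined in the source.

      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class InspectionRecord:
          """Hypothetical per-item payload a VIA might send to the DCAS."""
          item_unique_id: str                      # item unique ID (e.g., barcode or serial number)
          via_id: str                              # identifier of the reporting VIA
          is_defective: bool                       # VIA decision: does the item have a defect?
          defect_count: int = 0                    # number of defects detected
          item_images: List[bytes] = field(default_factory=list)    # image/s of the inspected item
          defect_images: List[bytes] = field(default_factory=list)  # images of the defects
          deviations: List[dict] = field(default_factory=list)      # deviations not significant enough to report as defects
          work_order: Optional[str] = None         # plant work/job order
          batch_id: Optional[str] = None           # batch ID
          operator: Optional[str] = None           # personnel in charge of the line or station
          production_tool_id: Optional[str] = None
          part_name: Optional[str] = None
          part_serial_number: Optional[str] = None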
  • the data is communicated from a VIA to the DCAS according to timing selected from the group consisting of: after inspection of each item by each VIA; after inspection of a configurable number of items per VIA; after a configurable period of time per VIA; based on a date schedule; based on a time of day schedule; and a combination of the above.
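  • The transmission timings listed above could be captured in a small per-VIA policy; the sketch below is illustrative only, and the policy keys and helper function are assumptions rather than anything specified in the source.

      import datetime

      # Hypothetical per-VIA transmission policy mirroring the timing options above.
      TRANSMIT_POLICY = {
          "every_item": False,      # send after inspection of each item
          "every_n_items": 50,      # send after a configurable number of items
          "every_seconds": 600,     # send after a configurable period of time
          "send_dates": [],         # optional date schedule, e.g. [datetime.date(2019, 12, 1)]
          "send_times": [datetime.time(hour=6), datetime.time(hour=18)],  # time-of-day schedule
      }

      def should_transmit(items_since_last: int, seconds_since_last: float,
                          now: datetime.datetime) -> bool:
          """Return True when any configured trigger for sending collected data fires."""
          p = TRANSMIT_POLICY
          if p["every_item"] and items_since_last >= 1:
              return True
          if p["every_n_items"] and items_since_last >= p["every_n_items"]:
              return True
          if p["every_seconds"] and seconds_since_last >= p["every_seconds"]:
              return True
          if now.date() in p["send_dates"]:
              return True
          # time-of-day check assumes the function is polled at least once a minute
          return any(now.hour == t.hour and now.minute == t.minute for t in p["send_times"])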
  • the analysis is selected from the group consisting of: root cause analysis of detected defects; predictive maintenance analysis - based on detecting trends in defects or deviations that are not defects; intensity of the defects - analysis of trends of increasing occurrences of defects per period of time; significance of the defects - analysis of trends of increasing effect of defects or deviations on the produced item; analysis of product deviations from ideal that are not defects but indicate a trend towards decreasing quality; analysis of defect shape, area and type of defect, optionally in the form of a defect "map"; cost of defect - i.e. the cost of discarded items or the cost of repair of items determined to be defective; product recall and/or latent product fault vs. defect and/or product deviation history analysis; supplier analysis comparing product raw material suppliers vs defects; and relationship analysis between different production stages of the same item.
  • the DCAS is adapted to issue reports based on received inspection data wherein the reports are selected from the group consisting of: % defects detected per item; defect report including images of item showing where defects were detected; % defects detected per manufacturing area; number of items inspected per period of time; personnel vs item defect report; % defects per shift; % defects per manufacturing type (e.g. casting lines vs molding lines); % defects per defect type; defect report per period of time and production area; and a combination of the above.
  • the reports are selected from the group consisting of: % defects detected per item; defect report including images of item showing where defects were detected; % defects detected per manufacturing area; number of items inspected per period of time; personnel vs item defect report; % defects per shift; % defects per manufacturing type (e.g. casting lines vs molding lines); % defects per defect type; defect report per period of time and production area; and a combination of the above.
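  • As one hedged example of how such reports might be computed from collected data, the sketch below aggregates per-item records into a percentage-of-defects figure per group (for example per shift or per manufacturing area); the record keys and sample values are illustrative assumptions, not part of the source.

      from collections import defaultdict

      def defect_percentage_by(records, key):
          """Group per-item records by `key` (e.g. 'shift' or 'manufacturing_area')
          and return the percentage of defective items in each group."""
          totals = defaultdict(int)
          defects = defaultdict(int)
          for record in records:
              totals[record[key]] += 1
              if record["is_defective"]:
                  defects[record[key]] += 1
          return {group: 100.0 * defects[group] / totals[group] for group in totals}

      # Example usage with made-up data:
      records = [
          {"shift": "day", "is_defective": False},
          {"shift": "day", "is_defective": True},
          {"shift": "night", "is_defective": False},
      ]
      print(defect_percentage_by(records, "shift"))  # {'day': 50.0, 'night': 0.0}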
  • the DCAS is adapted to initiate activity on one or more VIAs, the activity selected from the group consisting of: DCAS checks the operational status of one or more VIAs; DCAS checks the software version running on one or more VIAs; DCAS checks the security update status of one or more VIAs; DCAS accesses a real-time view of the inspection images from one or more VIAs; DCAS requests specific data from one or more VIAs; DCAS changes inspection or other settings of one or more VIAs; DCAS performs software upgrades to one or more VIAs; DCAS initiates inspection to be performed by one or more VIAs; DCAS changes the region of interest to be inspected by one or more VIAs; DCAS initiates re-inspection of previously inspected items; DCAS initiates re-inspection of previously inspected items with changed inspection parameters; and a combination of the above.
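  • The DCAS-initiated activities listed above imply a control channel from the DCAS to each VIA. The sketch below assumes, purely for illustration, that each VIA exposes a simple HTTP command endpoint; the endpoint path, command names and payload format are hypothetical and not specified in the source.

      from typing import Optional

      import requests  # assumes VIAs expose an HTTP control endpoint (an assumption, not stated in the source)

      # Hypothetical command names mirroring the activities listed above.
      COMMANDS = {
          "status", "software_version", "security_status", "live_view", "get_data",
          "change_settings", "upgrade", "start_inspection", "set_roi", "reinspect",
      }

      def send_via_command(via_address: str, command: str, params: Optional[dict] = None) -> dict:
          """Send one control command from the DCAS to a single VIA and return its reply."""
          if command not in COMMANDS:
              raise ValueError(f"unknown command: {command}")
          response = requests.post(f"http://{via_address}/api/command",
                                   json={"command": command, "params": params or {}},
                                   timeout=10)
          response.raise_for_status()
          return response.json()

      # e.g. change the region of interest inspected by one VIA:
      # send_via_command("10.0.0.17", "set_roi", {"x": 120, "y": 80, "width": 640, "height": 480})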
  • the DCAS is adapted to store the received visual inspection data.
  • the stored inspection data can be searched.
  • the DCAS is adapted for issuing alerts based on the analysis.
  • the DCAS is adapted to run 3rd party applications adapted to produce analyses and reports based on the stored inspection data.
  • the term "item" refers to a production item wherein production items may be different production stages of the same product or may be different products or different production stages of different products or the same item inspected from different angles. Items may be of any type, shape, size, material, or any other attribute and no example herein should be considered limiting.
  • the term "defect" may include, for example, a visible flaw on the surface of an item, an undesirable size, shape or color of the item or of parts of the item, an undesirable number of parts of the item, a wrong or missing assembly of its interfaces, a broken or burned part, an incorrect alignment of an item or parts of an item, and in general, any difference between a defect free sample and the inspected item.
  • a defect is a difference which would be evident to a human user between a defect free item (and/or group of defect free items) and a same-type inspected item.
  • a defect may include flaws which are visible only in enlarged or high resolution images, e.g., images obtained by microscopes or other specialized cameras.
  • Inspection of items as described herein should also be understood as inspection for purposes of defect detection, gating and/or sorting. Where one of these terms is used, e.g., "defect detection", this should be understood as referring to any one of the inspection tasks, such as defect detection, gating, or sorting.
  • a plant as used herein refers to a manufacturing environment which contains one or more production lines or production areas for manufacture, assembly, testing, packaging or any other type of industrial processing of items.
  • references herein are made, for simplicity, to "images", however it should be appreciated that the processes described herein may be carried out on image data other than or in addition to full images.
  • images also includes video captured by the cameras of the presently described system.
  • product stage should be understood to include any of an assembly stage (items are assembled into a product), manufacturing stage (items are subjected to a form of processing as part of product manufacture), and/or inspection stage (stages are actually different views or sections of the same product).
  • product stages are related to one another by their being production stages or aspects of a product.
  • item may be used to refer to a product stage.
  • a“product” may refer to a completed commercial product but may also refer to a manufactured item or part that is destined for integration into a product.
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
  • several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof.
  • selected steps of the invention could be implemented as a chip or a circuit.
  • selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
  • selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • "machine learning" or "artificial intelligence" refer to use of algorithms on a computing device that parse data, learn from this data, and then make a determination, where the determination is not deterministically replicable (such as with deterministically oriented software as known in the art).
  • regarding a "computing device", a "computer", or "mobile device", it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computer, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a "network" or a "computer network".
  • FIGS. 1A-1B are illustrative schematic drawings showing collection of data from automated visual inspection appliances on a production line according to at least some embodiments of the present invention.
  • FIG. 2 is a flow diagram showing a process for collection of data from automated visual inspection appliances on a production line according to at least some embodiments of the present invention.
  • the present invention in at least some embodiments is for a system comprising multiple automated visual inspection appliances (VIA) for a production plant and a centralized data collection and analytics server (DCAS) that gathers and analyzes data from the VIAs.
  • VIA automated visual inspection appliances
  • DCAS data collection and analytics server
  • an automated visual inspection system 100 comprises multiple visual inspection appliances (VIA) 110A, B,C, n in communication with a data collection and analytics server (DCAS) 150.
  • VIA visual inspection appliances
  • DCAS data collection and analytics server
  • VIA is preferably provided as an integrated appliance for use in a manufacturing environment or plant.
  • VIAs 110A, B,C, n are optionally installed in one plant or optionally multiple VIAs are installed in multiple plants.
  • Each VIA connects to DCAS 150 using wired or wireless communications protocols and methods as known in the art.
  • DCAS 150 is a computing device as defined above and may optionally comprise a server, distributed server, cloud computing environment, data cluster or any other suitable computing device.
  • DCAS 150 preferably comprises analysis engine 152, database (DB) 154, DCAS user interface (UI) 156, and notification engine 158.
  • Analysis engine 152 receives data from VIAs 110A, B, C and n and analyses the received data to output insights, recommendations, summaries, trends, alerts, and root cause analysis of defects all related to the items inspected and the production environment as described below. Analysis engine 152 optionally uses big data analysis methods.
  • DB 154 is a database (e.g., as known in the art) and stores data transmitted by VIAs 110A, B, C and n and also results and interim results of analysis by engine 152. DB 154 also stores configuration data defined in DCAS 150 for system 100 including VIA profiles.
  • a VIA profile includes information about each VIA in system 100 including but not limited to: unique identifier, name, physical mounting details, position in plant, plant geolocation, items inspected, reference images of items inspected, profiles of items inspected, inspection results and so forth.
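  • A VIA profile of this kind could be stored in DB 154 as a simple keyed record; the following sketch shows one possible shape, with field names and example values that are assumptions for illustration only.

      # Illustrative only; the source lists the profile contents but not a storage format.
      via_profile = {
          "unique_id": "via-0001",
          "name": "Line A - station 3",
          "mounting": {"assembly": "profile rail", "height_mm": 420},   # physical mounting details
          "position_in_plant": "building 2 / cell 7",
          "plant_geolocation": {"lat": 32.08, "lon": 34.78},
          "items_inspected": ["bracket-v2", "bracket-v3"],
          "reference_images": ["ref/bracket-v2/001.png"],               # reference images of items inspected
          "item_profiles": {"bracket-v2": {"roi": [0, 0, 640, 480]}},   # profiles of items inspected
          "inspection_results": [],                                     # filled as per-item data arrives
      }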
  • a manufacturing area 170 is defined for DCAS 150 where each manufacturing area 170 includes one or more VIAs.
  • a manufacturing area 170 optionally comprises VIAs from one plant or optionally comprises VIAs from multiple plants.
  • the manufacturing area 170 defined in figure 1A includes VIAs 110B and 110C but it should be appreciated that any number of VIAs, and any particular VIAs, could be included in a manufacturing area 170 and any number of manufacturing areas 170 may be defined for DCAS 150. Where more than one manufacturing area 170 is defined these manufacturing areas 170 may optionally overlap, i.e.: a single VIA may optionally be part of several manufacturing areas 170.
  • DCAS UI 156 enables display of the results of analysis engine 152 and also interaction with DCAS 150 by a human operator (not shown).
  • DCAS UI 156 optionally comprises a monitor or screen and information provided to a user of DCAS 150 may be visual (e.g., text or other content displayed on the monitor).
  • DCAS UI 156 comprises an audio player to emit a sound.
  • DCAS UI 156 preferably enables accepting user input such as by a touch screen, keyboard and/or mouse.
  • DCAS UI 156 is provided on a multi-purpose device such as a smartphone, tablet or personal computer in communication with DCAS 150.
  • DCAS UI 156 can be accessed remotely optionally from within the plant where it operates and outside of the plant where it operates.
  • Notification engine 158 is in communication with external communication networks 70 and provides push notification of alerts or other outputs from analysis engine 152.
  • Non-limiting types of notification methods include email, SMS, WhatsApp or any mobile notification mechanism.
  • Notification engine 158 can be configured via DCAS UI 156 to define recipients and notification methods for different types of alerts, reports or analyses.
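  • Such a configuration could take the form of a routing table mapping alert types to channels and recipients, as in the sketch below; the alert type names, channels and addresses are assumptions for illustration only.

      # Hypothetical routing table of the kind that could be defined via DCAS UI 156.
      NOTIFICATION_ROUTES = {
          "defect_rate_threshold": {"channels": ["email", "sms"], "recipients": ["qa-lead@example.com"]},
          "predictive_maintenance": {"channels": ["email"], "recipients": ["maintenance@example.com"]},
          "daily_report": {"channels": ["email"], "recipients": ["plant-manager@example.com"]},
      }

      def route_alert(alert_type: str, message: str, send) -> None:
          """Dispatch `message` over every channel configured for `alert_type`.
          `send` is a callable: send(channel, recipient, message)."""
          route = NOTIFICATION_ROUTES.get(alert_type)
          if route is None:
              return
          for channel in route["channels"]:
              for recipient in route["recipients"]:
                  send(channel, recipient, message)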
  • VIAs 110A, B, C, n and DCAS 150 communicate over the external network 70.
  • DCAS 150 may automatically detect when a VIA is connected to the external network 70 and may then register the newly connected VIA and perform data collection and analysis of the VIA performance and of data obtained by the VIA, as described herein.
  • a visual inspection data collection and analysis system includes a plurality of VIAs configured to acquire visual inspection data relating to inspected items, and a central server, such as a DCAS configured to identify a newly connected VIA on a communications network, to register the newly connected VIA and enable data collection and analysis of each registered VIA.
  • a central server such as a DCAS configured to identify a newly connected VIA on a communications network
  • the DCAS 150 may identify each VIA based on an ID, IP address or other unique identifiers connected to each VIA and each VIA may be registered under a unique identifier. Data collection and analysis of each registered VIA may be done according to the registered unique identifier. E.g., data from VIAs registered under an identifier related to inspection line A may be analyzed differently from data from VIAs registered under an identifier related to inspection line B.
  • DCAS 150 may detect when a VIA is connected to the external network 70 based on signals sent over the network (e.g., Ethernet) by DCAS 150 and/or VIAs 110A, B, C and n. Signals may include, for example, packets transmitted by multicast addressing using, for example, User Datagram Protocol (UDP). Based on the signals, which may be transmitted periodically by the DCAS and/or VIA, the DCAS can determine that a VIA is connected to the network and the DCAS may then compare the VIA identifier to already registered VIA identifiers to determine whether the VIA is newly connected or not.
  • UDP User Datagram Protocol
  • DCAS 150 may perform one or more different actions for each registered VIA, as described herein. For example, collecting visual inspection data and the timing of the collection of data may be done based on the registered VIA identifier. Storing the received data, analysis of the received data and issuing reports may be controlled based on the registered VIA identifier. The DCAS may initiate different activities in each VIA based on the registration of each VIA.
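  • A minimal sketch of the UDP-based detection and registration described above is given below. It assumes that each VIA periodically announces itself as a small JSON datagram on a multicast group; the group address, port and message format are assumptions and not part of the source.

      import json
      import socket
      import struct

      MCAST_GROUP, MCAST_PORT = "239.1.1.1", 50000   # illustrative multicast group and port
      registered_vias = {}                            # unique identifier -> last known address

      def listen_for_vias():
          """Listen for VIA announcements and register any newly connected VIA."""
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
          sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
          sock.bind(("", MCAST_PORT))
          # Join the multicast group on all interfaces.
          mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
          sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
          while True:
              data, (address, _port) = sock.recvfrom(4096)
              announce = json.loads(data)             # e.g. {"via_id": "via-0001"}
              via_id = announce["via_id"]
              if via_id not in registered_vias:       # newly connected VIA: register it
                  registered_vias[via_id] = address
                  print(f"registered new VIA {via_id} at {address}")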
  • DCAS 150 is optionally in communication with an external monitoring system 60.
  • Monitoring system 60 is a computing device as described above.
  • Monitoring system 60 is typically a production plant management system such as for gathering and monitoring key performance indicators for manufacturing efficiency.
  • Monitoring system 60 is optionally a production resource management platform.
  • DCAS 150 optionally runs 3rd party applications 159, where 3rd party applications 159 are operative to produce analyses and reports based on the collected data, which may be stored in DCAS 150.
  • 3rd party applications 159 can operate VIAs according to the capabilities of DCAS 150.
  • each VIA 110 comprises a controller 130, camera assembly 111, and mounting assembly 108.
  • Camera assembly 111 comprises camera 102, and light source 106.
  • Camera 102 comprises a CCD or CMOS or other appropriate imaging chip.
  • Camera 102 is a 2D camera or optionally a 3D camera.
  • camera 102 comprises the camera integrated into a mobile device such as a smartphone or tablet where the device is attached to mounting assembly 108.
  • Camera 102 optionally comprises a polarizing lens, tele-centric lens, narrow band, zoom lens, or other lens (not shown) placed over the lens of camera 102 or directly upon its imaging chip.
  • Light source 106 comprises LEDs or other known light source.
  • the intensity (brightness) of light source 106 can be adjusted.
  • the color of light source 106 can be adjusted.
  • light source 106 comprises multiple controllable segments, each of which can be activated or provided with the same or different intensity and/or color.
  • light source 106 may comprise a circular array of LEDs surrounding camera 102 lens, where radial portions of circular light source 106 are controlled individually or alternatively the intensity and/or color of every LED or groupings of LEDs, can be controlled individually.
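  • Per-segment control of such a ring light might look like the sketch below, in which the segment count, the colour format and the set_led driver callable are hypothetical stand-ins for whatever LED hardware interface is actually used.

      NUM_SEGMENTS = 8  # radial portions of the circular light source (illustrative)

      def illuminate(segment_settings, set_led):
          """Apply per-segment intensity/colour settings to the ring light.
          `segment_settings` maps segment index -> (intensity 0-255, (r, g, b));
          `set_led` is a driver callable: set_led(index, intensity, rgb)."""
          for index in range(NUM_SEGMENTS):
              intensity, rgb = segment_settings.get(index, (0, (0, 0, 0)))  # unused segments stay off
              set_led(index, intensity, rgb)

      # e.g. light only the two segments facing a region of particular interest:
      # illuminate({0: (200, (255, 255, 255)), 1: (200, (255, 255, 255))}, set_led=my_driver)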
  • Light source 106 is shown as positioned above camera 102 for simplicity of the figures but this position should not be considered limiting.
  • light source 106 is mounted on the side of or below camera 102.
  • Light source 106 is preferably attached to and surrounds or is otherwise fixed in relation to the lens of camera 102 so as to illuminate the field of view (FOV) 104 of camera 102 or portions thereof.
  • Camera assembly 111 is attached to mounting assembly 108.
  • camera 102 and light source 106 are separately attached to mounting assembly 108 allowing individual adjustment of the spatial position of either.
  • Mounting assembly 108 comprises mounts, segments and fasteners allowing adaptation and adjustment of mounting assembly 108 for optimal positioning of camera 102 and light source 106 for inspection of an item.
  • Camera assembly 111 is positioned using mounting assembly 108 such that items 20 to be inspected are within the field of view 104 of camera 102.
  • Mounting assembly 108 is attached to a mounting surface 40.
  • Surface 40 may remain in a fixed position relative to item 20 or alternatively may move so as to repeatedly bring camera assembly 111 into a position where items 20 to be inspected are within the field of view 104 of camera 102.
  • a non-limiting example of a moving surface 40 is a robot arm.
  • where FOV 104 is referred to herein, it is to be understood that light source 106 is positioned to illuminate FOV 104.
  • Surface 40 optionally comprises an aluminum profile including grooves for attachment of mounting brackets.
  • Items 20 to be inspected may be placed on an inspection line 30 which comprises means for supporting and moving items 20, such as but not limited to a conveyor belt, a cradle or another holding apparatus, moving in direction 22, such that a first item 20 is brought into FOV 104, followed by a second item 20 which is brought into FOV 104, and so forth.
  • items 20 are successively placed in FOV 104 and then removed, such as by a robot or human operator.
  • Controller 130 is a computing device as defined herein. Controller 130 comprises one or more processors (not shown) such as but not limited to a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller. Controller 130 activates light source 106 or any of its components or controllable segments as described above, which may or may not be activated depending on the item being imaged or the inspection lighting environment. Controller 130 preferably alters the intensity or color of light source 106 depending on the item being imaged or the inspection lighting environment. Controller 130 preferably alters the intensity or color of light source for regions of particular interest within the illuminated area.
  • Controller 130 further comprises a memory unit (not shown) which stores executable instructions that, when executed by the processor, facilitate performance of operations of the processor.
  • the memory unit may also store at least part of the image data received from camera 102.
  • Non-limiting examples of memory units include random access memory (RAM), dynamic RAM (DRAM), flash memory, volatile memory, non-volatile memory, cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
  • Controller 130 further comprises a VIA user interface (UI) 132.
  • VIA UI 132 may comprise a monitor or screen and notifications to a user may be visual (e.g., text or other content displayed on the monitor).
  • VIA UI 132 comprises a light that may light up or change color.
  • VIA UI 132 comprises an audio player to emit a sound.
  • VIA UI 132 preferably enables accepting user input such as by a touch screen, keyboard and/or mouse.
  • VIA UI 132 is provided on a multi-purpose device such as a smartphone, tablet or personal computer.
  • DCAS 150 can check the operation status of one or more VIAs 110.
  • DCAS 150 can check the software version running on one or more VIAs 110.
  • DCAS 150 can check the security status of VIAs (e.g., that one or more VIAs 110 are updated with the most recent security updates).
  • an operator can use DCAS 150 to access a real-time view of the inspection images from any VIA 110 for display on DCAS UI 156.
  • an operator can use DCAS 150 to request specific data from any one or more of VIAs 110.
  • DCAS 150 can change inspection or other settings of any one or more of VIA 110.
  • DCAS 150 can perform software upgrades of any one or more of VIA 110.
  • DCAS 150 can initiate inspection to be performed by any one or more of VIA 110.
  • DCAS 150 can change the region of interest to be inspected by any one or more of VIA 110.
  • DCAS 150 can initiate re-inspection of previously inspected items, further optionally with changed inspection parameters.
  • FIG 2 is a flow diagram showing collection of data from automated visual inspection appliances on a production line according to at least some embodiments of the present invention.
  • Use of automated visual inspection system 100 preferably proceeds according to process 200 as shown in figure 2.
  • VIA 110 is set up to enable inspection of items 20.
  • System 100 requires setup for each type of item or stage of item that is to be inspected.
  • at least two or more defect free samples of a manufactured item 20 of the same type are placed in succession within field of view 104 of camera 102.
  • Each defect free sample of item 20 is imaged by camera 102.
  • These images, which may be referred to as setup images, are optionally obtained by using different imaging parameters of camera 102 and lighting parameters of light source 106.
  • the images comprise image data such as pixel values that represent the intensity of reflected light, as well as partial or full images or videos.
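  • The capture of setup images under varying imaging and lighting parameters could be organised as a simple parameter sweep, as in the sketch below; capture_image and set_light stand in for camera 102 and light source 106 drivers and, like the parameter values, are assumptions for illustration.

      from itertools import product

      EXPOSURES_MS = [5, 10, 20]           # illustrative camera exposure settings
      GAINS = [1.0, 2.0]                   # illustrative camera gain settings
      LIGHT_INTENSITIES = [100, 180, 255]  # illustrative light source intensities

      def capture_setup_images(capture_image, set_light):
          """Image one defect-free sample under every combination of imaging and
          lighting parameters and return the resulting setup images."""
          setup_images = []
          for exposure, gain, intensity in product(EXPOSURES_MS, GAINS, LIGHT_INTENSITIES):
              set_light(intensity)
              image = capture_image(exposure_ms=exposure, gain=gain)
              setup_images.append({"exposure_ms": exposure, "gain": gain,
                                   "light_intensity": intensity, "image": image})
          return setup_images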
  • the setup images are analyzed by controller 130 using machine learning/artificial intelligence (AI) and computer vision algorithms to create a complete representation of item 20 used for defect detection, gating, sorting and/or other inspection tasks, on the production line.
  • AI machine learning/artificial intelligence
  • controller 130 can preferably detect and inspect further items of the same type even if these further items were never previously presented, and determine whether these are defect-free.
  • this detection and inspection is performed by VIA 110 independently of DCAS 150.
  • step 202 items 20 are inspected by each VIA 110 for defect detection, gating, or sorting purposes.
  • the following data is collected by each VIA 110 per item 20 as a result of the inspection process. This data is herein referred to as "per-item collected data", and one or more items of per-item collected data are referred to as "collected data":
  • step 204 the collected data is transmitted by each VIA 110 to DCAS 150.
  • the communication between VIA 110 and DCAS 150 may use standard communication infrastructure and protocols as known in the art.
  • Collected data is stored in DB 154. Collected data is transmitted by VIA 110 to DCAS 150 according to one or more of the following:
  • Collected data stored in DB 154 can preferably be searched and queried via DCAS UI 156, with DB 154 of DCAS 150 functioning as an archive.
  • a non-limiting example of such a query is a search for images and other inspection data related to a specific item indexed by an item identifier such as but not limited to the item barcode or serial number.
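  • If, for example, collected data were kept in a relational table, such a query could look like the sketch below; the table name, column names and the use of SQLite are assumptions for illustration, not a description of DB 154.

      import sqlite3

      def find_inspections_by_barcode(db_path: str, barcode: str):
          """Return all stored inspection rows for a given item barcode/serial number."""
          conn = sqlite3.connect(db_path)
          try:
              cursor = conn.execute(
                  "SELECT via_id, inspected_at, is_defective, image_path "
                  "FROM inspections WHERE item_barcode = ? ORDER BY inspected_at",
                  (barcode,),
              )
              return cursor.fetchall()
          finally:
              conn.close()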
  • step 206 the collected data from VIAs 110 is analyzed by analysis engine 152 and/or used for generating reports.
  • the analyses or use of collected data of step 206 optionally take place immediately following step 204. Alternatively, step 206 takes place some time after step 204.
  • Reports and/or analyses are preferably generated using big data methods.
  • the analysis is performed for a combination of different types of items, where different types of items may be any of different products, different production stages, different plants, or different industries.
  • One or more of the following reports are preferably generated including but not limited to:
  • One or more of the following analyses are preferably performed including but not limited to:
  • Cost of defect - i.e. the cost of discarded items or the cost of repair of items determined to be defective
  • step 206 preferably takes place based on one or more of:
  • step 208 the results of the analyses and/or reports of step 206 are stored in DB 154 and also preferably displayed using DCAS UI 156.
  • results are exported to external systems such as but not limited to external monitor 60.
  • results are displayed on a configurable dashboard presented on DCAS UI 156.
  • results of step 206 generate alerts which are displayed on DCAS UI 156 or communicated to an operator of DCAS 150 such as via notification engine 158 sending, for example but not limited to, text or other messages to a mobile device.
  • a non-limiting example of an alert is "% defects detected in a production area exceeding a defined threshold".
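  • A threshold alert of this kind could be evaluated as in the sketch below; the 5% threshold, the record keys and the notify callable are illustrative assumptions.

      DEFECT_RATE_THRESHOLD_PCT = 5.0  # illustrative threshold

      def check_area_defect_rate(records, area, notify):
          """Raise an alert when % defects detected in a production area exceeds the threshold."""
          area_records = [r for r in records if r["manufacturing_area"] == area]
          if not area_records:
              return
          rate = 100.0 * sum(r["is_defective"] for r in area_records) / len(area_records)
          if rate > DEFECT_RATE_THRESHOLD_PCT:
              notify(f"% defects in area {area} is {rate:.1f}%, above {DEFECT_RATE_THRESHOLD_PCT}%")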

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Factory Administration (AREA)

Abstract

A visual inspection data collection and analysis system is provided, comprising: a plurality of visual inspection appliances (VIA) configured to inspect and acquire visual inspection data relating to inspected items; and a data collection and analytics server (DCAS) configured to receive information comprising the visual inspection data from the multiple visual inspection appliances, and to analyze the received information to form a big data analysis. The VIAs are adapted for detecting defects, or gating or counting the inspected items, without the involvement of the DCAS.
PCT/IL2019/051320 2018-11-29 2019-12-01 Centralized analytics of multiple visual inspection appliances WO2020110129A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/297,572 US20210398267A1 (en) 2018-11-29 2019-12-01 Centralized analytics of multiple visual inspection appliances
DE112019005951.3T DE112019005951T5 (de) 2018-11-29 2019-12-01 Zentralisierte Analyse mehrerer visueller Prüfvorrichtungen

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862772758P 2018-11-29 2018-11-29
US62/772,758 2018-11-29
IL263399A IL263399B (en) 2018-11-29 2018-11-29 Centralized analyzes of multiple devices for visual inspection of a production line
IL263399 2018-11-29

Publications (1)

Publication Number Publication Date
WO2020110129A1 true WO2020110129A1 (fr) 2020-06-04

Family

ID=70852351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2019/051320 WO2020110129A1 (fr) 2018-11-29 2019-12-01 Analyse centralisée d'appareils d'inspection visuelle multiples

Country Status (1)

Country Link
WO (1) WO2020110129A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130202200A1 (en) * 2010-10-19 2013-08-08 3M Innovative Properties Company Computer-aided assignment of ratings to digital samples of a manufactured web product
WO2016083897A2 (fr) * 2014-11-24 2016-06-02 Kitov Systems Ltd. Automated inspection
US20180276811A1 (en) * 2017-03-21 2018-09-27 Test Research, Inc. Automatic optical inspection system and operating method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914003A (zh) * 2020-08-14 2020-11-10 知小二(广州)科技有限公司 Big data analysis system based on a cloud platform

Similar Documents

Publication Publication Date Title
US20210398267A1 (en) Centralized analytics of multiple visual inspection appliances
US11983857B2 (en) System and method for visual production line inspection of different production items
US20240089412A1 (en) Machine-vision system and method for remote quality inspection of a product
US11694317B1 (en) Machine vision system and interactive graphical user interfaces related thereto
TWI587110B (zh) 光學薄膜製程即時監測系統及其方法
EP3086286A1 (fr) Procédé et système pour une inspection automatisée utilisant une base de données multimodales
US20190164270A1 (en) System and method for combined automatic and manual inspection
WO2015191906A1 (fr) Surveillance et compte-rendu automatiques de stabilité de formule
US20220005183A1 (en) Multi-camera visual inspection appliance and method of use
CN104931505A (zh) 机器视觉表面检测系统
CN113588653A (zh) 一种检测和追踪铝用阳极炭块质量的系统及方法
WO2020110129A1 (fr) Analyse centralisée d'appareils d'inspection visuelle multiples
KR20190060548A (ko) 변수 구간별 불량 발생 지수를 도출하여 공정 불량 원인을 파악하고 시각화하는 방법
Foglia et al. An inspection system for pharmaceutical glass tubes
TWI531787B (zh) An automatic optical detection method and an automatic optical detection system for carrying out the method
CN109285138B (zh) 用于机器视觉分析的分布式处理系统及方法
CN110889395B (zh) 基于机器学习的机械运动识别方法及系统
CN101191932A (zh) 一种液晶屏生产中辅助进行统计过程控制的方法及装置
CN114429256A (zh) 数据监测方法、装置、电子设备及存储介质
CN113759854B (zh) 基于边缘计算的智能工厂管控系统及方法
CN110751055A (zh) 一种智能制造系统
Viharos et al. Vision based, statistical learning system for fault recognition in industrial assembly environment
CN203875026U (zh) 一种工业加工产品自动检验系统
CN116363342A (zh) 质检方法、装置、设备、电子设备以及存储介质
WO2023218441A1 (fr) Optimisation d'un groupe de référence pour inspection visuelle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19888320

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19888320

Country of ref document: EP

Kind code of ref document: A1