WO2022255518A1 - Apparatus for determining defects of a panel to be inspected using a deep learning neural network model - Google Patents

Apparatus for determining defects of a panel to be inspected using a deep learning neural network model

Info

Publication number
WO2022255518A1
Authority
WO
WIPO (PCT)
Prior art keywords
defect
panel
image
type
deep learning
Prior art date
Application number
PCT/KR2021/006944
Other languages
English (en)
Korean (ko)
Inventor
강래호
장세일
김미진
이지섭
Original Assignee
주식회사 솔루션에이
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 솔루션에이
Publication of WO2022255518A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G06N 20/20 Ensemble learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/006 Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Definitions

  • the present invention relates to a technique for determining defects in a panel to be inspected, and more particularly, to a technique for easily determining defects in a panel to be inspected using a deep learning neural network model.
  • technology for classifying defects from images of products such as inspection target panels and PCBs using machine learning is essential for smart factories. After product defects are analyzed in real time, the cause of a defect can be removed by feeding the analyzed defect type back to the production system, and as a result, mass defects in a continuous process can be drastically reduced. In addition, the process can be optimized by analyzing defect patterns according to working conditions.
  • conventionally, defects in the panel to be inspected are determined using an optical inspector or by visual inspection by a skilled worker.
  • a technical problem to be solved by the present invention is to provide an apparatus for determining defects of a panel to be inspected using a deep learning neural network model that can easily determine such defects.
  • An apparatus for determining a defect of a panel to be inspected using a deep learning neural network model for solving the above problems includes an image processing unit that extracts image information for determining defects from an image of the panel to be inspected; a defect learning value extraction unit that classifies each defect pattern using a deep learning neural network model for the image information and outputs a learning value of each classified defect pattern; a defect determination unit determining a final defect type of the panel to be inspected for each defect pattern by reflecting a weight for each type optimized for the learning value of the defect pattern type; and a user interface unit displaying information on a final defect type of the panel to be inspected on a display screen and providing an interface for receiving a command from an operator.
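  • to make the data flow among these four units concrete, the following is a minimal structural sketch in Python; the class and function names are hypothetical illustrations chosen for this sketch, not part of the disclosed apparatus, and each stage is reduced to a stub that only shows how image information, learning values, per-type weights, and the final defect type are handed from one unit to the next.

```python
# Hypothetical structural sketch of the four-unit pipeline described above.
# Names and signatures are illustrative only, not the patented implementation.
from dataclasses import dataclass
from typing import Dict, List

import numpy as np

DEFECT_TYPES = ["point", "line", "mura"]  # defect patterns named in the text


@dataclass
class DefectDecision:
    final_type: str
    scores: Dict[str, float]


def image_processing_unit(panel_image: np.ndarray) -> List[np.ndarray]:
    """Extract image information (e.g. ROI blocks) from the panel image."""
    # Placeholder: a real unit would split the image into ROI blocks.
    return [panel_image]


def defect_learning_value_extractor(rois: List[np.ndarray]) -> Dict[str, float]:
    """Classify defect patterns with a deep learning model and return learning values."""
    # Placeholder: a trained network would produce one learning value per defect type.
    return {t: 0.0 for t in DEFECT_TYPES}


def defect_determination_unit(learning_values: Dict[str, float],
                              weights: Dict[str, float]) -> DefectDecision:
    """Apply per-type weights and pick the defect type with the largest weighted value."""
    weighted = {t: learning_values[t] * weights[t] for t in DEFECT_TYPES}
    final_type = max(weighted, key=weighted.get)
    return DefectDecision(final_type=final_type, scores=weighted)


def user_interface_unit(decision: DefectDecision) -> None:
    """Display the final defect type (stand-in for the display screen / operator UI)."""
    print(f"final defect type: {decision.final_type}  scores: {decision.scores}")
```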
  • the image processing unit divides the image of the panel to be inspected captured by a camera into blocks in units of detailed regions of interest in the longitudinal direction, subdivides the blocks into detail regions smaller than the blocks, each consisting of a fixed rectangular region, and extracts image information from these regions.
  • the defect learning value extractor extracts a plurality of defect patterns included in each defect image based on a deep learning neural network model for each defect image, and then extracts a learning value corresponding to a defect type for each extracted defect pattern.
  • in the defect learning value extractor, the plurality of defect patterns include a point defect pattern, a line defect pattern, and a mura defect pattern of the panel to be inspected, and for the image information a defect learning value classified as one of the point defect pattern, the line defect pattern, and the mura defect pattern is extracted.
  • the defect determination unit may receive a learning value and an optimized weight for each defect type and determine a final defect type using a weighted combination decision method.
  • the defect determination unit optimizes the weight for each defect image type based on the defect learning values for the final defect types derived with identical initial weights, multiplies the optimized weight for each defect type by the learning value for each defect type and sums the results to output a defect learning value for each defect type, and determines the defect type having the maximum defect learning value among the defect learning values for each defect type as the final defect type.
  • the defect determination unit is characterized in that it is implemented by applying an ensemble model to the deep learning models.
  • the defect determination unit uses information obtained by applying a data augmentation technique to the training images used for training each of the first deep learning model, the second deep learning model, and the third deep learning model.
  • in the data augmentation technique, training images are acquired through at least two of rotation, enlargement, horizontal flipping, vertical flipping, horizontal translation, vertical translation, and combinations thereof.
  • according to the present invention, each defect pattern is classified using a deep learning neural network model applied to the image information, a learning value of each classified defect pattern is output, and the final defect type of the panel to be inspected is determined for each defect pattern by reflecting a weight for each type optimized for the learning value of the defect pattern type. This reduces the defect determination errors of a conventional image pattern classifier and reduces operator fatigue by lowering the rate of visual inspection required of the operator.
  • in addition, the deep learning models judge and classify more accurately than a human, so gray-zone defects that would otherwise be judged by human intuition and experience can be reduced to a false-defect rate of less than 5%.
  • FIG. 1 is a block diagram illustrating an apparatus for determining a defect of a panel to be inspected using a deep learning neural network model according to the present invention.
  • FIG. 2 is a conceptual diagram for explaining the entire process of the defect determination device of the panel to be inspected using the deep learning neural network model shown in FIG. 1 .
  • FIGS. 3A to 3D are reference views illustrating a plurality of defect patterns learned by the defect learning value extraction unit.
  • FIG. 4 is an example of a mura pattern image for explaining the determination of a mura defect pattern according to an automatic change of a set value in a defect determination unit.
  • FIG. 5 is a conceptual diagram of a weighted combination determination method of a defect determination unit.
  • FIG. 6 is a reference diagram illustrating defect types for each channel derived with optimized weights in a defect determination unit.
  • FIGS. 7A to 7C are reference diagrams illustrating information displayed by a user interface unit.
  • FIG. 1 is a block diagram illustrating a defect determination device (hereinafter, referred to as the defect determination device 100) for a panel to be inspected using a deep learning neural network model according to the present invention, and FIG. 2 is a conceptual diagram explaining the entire process of the defect determination device for the panel to be inspected using the deep learning neural network model.
  • the defect determination apparatus 100 includes an image processing unit 110 , a defect learning value extraction unit 120 , a defect determination unit 130 , and a user interface unit 140 .
  • the image processing unit 110 extracts image information for defect determination from an image of a panel to be inspected.
  • the image processing unit 110 divides the image of the panel to be inspected captured by a camera into blocks in units of detailed regions of interest in the longitudinal direction, subdivides the blocks into detail regions smaller than the blocks, each consisting of a fixed rectangular region, and extracts image information from these regions.
  • the image processing unit 110 compresses multi-channel image information, obtained from at least one image acquisition device at various angles and/or under multiple illuminations, into low resolution, derives a region of interest, and outputs image data for the derived region of interest.
  • the image processing unit 110 compresses and converts image or sensor-map data obtained from a multi-channel image acquisition device into a low-capacity compressed format at low resolution, and removes, in a region-of-interest cropping block, the image portions of unnecessary areas other than the object to be detected.
  • a multiplexer for multiplexing image data of different input channels output from the image processing unit 110 and transmitting the multiplexed image data to the defect learning value extraction unit 120 may be additionally provided.
  • from the region-of-interest image data of the image processing unit 110, a deep learning prediction model is built for multiple defect types having multiple defect elements and defect patterns included in the image data obtained for each channel by continuously performing acquisition a set number of times (Sequence Length).
  • the image processing unit 110 divides an image of a display panel such as a touch screen captured by a camera into blocks in units of detailed regions of interest.
  • the image processing unit 110 may divide the image of the transparent flexible touch screen panel into blocks in units of detailed regions of interest in the longitudinal direction.
  • the image processing unit 110 may divide blocks divided into detailed region-of-interest units into detailed regions that are smaller than the block, and more specifically, may divide blocks into detailed regions consisting of a certain rectangular region.
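  • as a rough illustration of this block division, the sketch below splits a panel image into blocks along the longitudinal direction and then into fixed-size rectangular detail regions; the block height and region size are arbitrary placeholders chosen for this sketch, since the patent does not give concrete dimensions.

```python
import numpy as np


def split_into_detail_regions(panel_image: np.ndarray,
                              block_height: int = 512,
                              region_size: tuple = (128, 128)):
    """Divide a panel image into longitudinal blocks, then into rectangular detail regions.

    The sizes here are illustrative placeholders, not values from the patent.
    """
    regions = []
    h, w = panel_image.shape[:2]
    rh, rw = region_size
    for top in range(0, h, block_height):            # blocks along the longitudinal direction
        block = panel_image[top:top + block_height]
        bh = block.shape[0]
        for y in range(0, bh - rh + 1, rh):          # rectangular detail regions inside a block
            for x in range(0, w - rw + 1, rw):
                regions.append(block[y:y + rh, x:x + rw])
    return regions


# Example: a synthetic 2048 x 1024 grayscale panel image
rois = split_into_detail_regions(np.zeros((2048, 1024), dtype=np.uint8))
print(len(rois), "detail regions extracted")
```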
  • the image processing unit 110 obtains image information on whether or not a panel to be inspected is defective by using a lighting inspection method.
  • the image processing unit 110 serves as a device that inspects defects of the panel to be inspected for each pattern by applying various patterns.
  • a Vignetting phenomenon and a Moire phenomenon may occur in an image due to lens distortion.
  • the image processing unit 110 serves the purpose of improving reliability by inspecting, through an electrical flat panel display inspection, whether a product is defective during the manufacturing process, or of reducing the defect rate through partial repair by detecting defective products in the middle of the process. This is a test in which the operation of an object is observed after an electrical signal is applied, and it is performed using a probe unit that applies the electrical signal by contacting an electrode of the object to be inspected.
  • AP: Auto Probe
  • PG: Plasma Generator
  • BLU: Back Light Unit
  • if the image processing unit 110 applies the inspection recipe strongly to reduce missed detections, many false defects are reported even for normal products; if the inspection recipe is applied weakly to reduce this, actual defects may go undetected and defective products may be shipped as normal products. This is a trade-off caused by the rule-based algorithm of the automatic image pattern classifier: because inspection is conducted based on standardized rules written by humans, it works well within the set rules, but inspection of new types of products or defects is difficult.
  • the image processing unit 110 may perform an optical inspection based on data stored in a database.
  • the inspected defect image is stored in the database, and the defect data is transmitted for defect determination by the defect determination device and made available to support testing in other components.
  • the image processing unit 110 classifies the acquired image information into one of a plurality of defect patterns.
  • the plurality of defect patterns include a point defect pattern, a line defect pattern, and a mura defect pattern of the panel to be inspected.
  • the defect learning value extractor 120 classifies each defect pattern using a deep learning neural network model for the image information, and outputs a learning value of each classified defect pattern.
  • the defect learning value extractor 120 extracts a plurality of defect patterns included in each defect image based on a deep learning neural network model for each defect image, and then extracts a learning value of the defect type corresponding to each extracted defect pattern.
  • in the defect learning value extractor 120, the plurality of defect patterns include a point defect pattern, a line defect pattern, and a mura defect pattern of the inspection target panel, and for the image information a defect learning value classified as one of the point defect pattern, the line defect pattern, and the mura defect pattern is extracted.
  • the defect learning value extractor 120 handles, as the plurality of defect images having defect elements included in the output image data of each channel, defect images 1 through K for an arbitrary channel k, defect images 1 through M for an arbitrary channel m, and defect images 1 through H for an arbitrary channel h.
  • the defect learning value extractor 120 uses a deep learning prediction model built with the defect images of each channel k, m, and h as inputs to extract the plurality of defect patterns included in each defect image. Since each defect image of each channel may include a plurality of defect patterns, the defect learning value extractor 120 detects a defect type for each extracted defect pattern. For example, when there is no defect at all, the defect images of channels k, m, and h become 0.
  • the defect learning value extractor 120 learns based on a deep learning prediction model that takes each defect image having defect elements of each channel as an input, extracts the plurality of defect patterns included in each defect image according to the learning result, and derives a defect type for each extracted defect pattern. The defect types are, for example, scratch, dent, hole, and NA, and a learning value for each defect type is output.
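  • a minimal sketch of such a per-channel prediction model is given below, assuming PyTorch; the defect types (scratch, dent, hole, NA) follow the text, but the small CNN architecture, input size, and softmax output are placeholders rather than the model actually used in the patent.

```python
import torch
import torch.nn as nn

DEFECT_TYPES = ["scratch", "dent", "hole", "NA"]  # defect types named in the text


class DefectClassifier(nn.Module):
    """Toy CNN that maps one ROI image of one channel to a learning value per defect type."""

    def __init__(self, num_types: int = len(DEFECT_TYPES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_types)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)
        return torch.softmax(self.head(z), dim=1)   # learning value for each defect type


# One grayscale 128 x 128 ROI from one channel; output has one score per defect type.
model = DefectClassifier()
learning_values = model(torch.zeros(1, 1, 128, 128))
print(dict(zip(DEFECT_TYPES, learning_values[0].tolist())))
```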
  • the defect learning value extraction unit 120 transfers the learning values for a plurality of defect types for each channel to the defect determination unit 130 .
  • FIGS. 3A to 3D are reference views illustrating a plurality of defect patterns learned by the defect learning value extractor 120.
  • the defect learning value extractor 120 classifies the image information into one of the point defect pattern, the line defect pattern, and the mura defect pattern.
  • the defect determination unit 130 determines the final defect type of the panel to be inspected for each defect pattern by reflecting the weight for each type optimized for the learning value of the defect pattern type.
  • the defect determination unit 130 determines the final defect type by a weighted combination decision method by receiving the learning value and the optimized weight for each defect type.
  • the defect determination unit 130 optimizes the weight for each defect image type based on the defect learning values for the final defect types derived with identical initial weights, multiplies the optimized weight for each defect type by the learning value for each defect type, and sums the results to output a defect learning value for each defect type; the defect type having the maximum defect learning value among the defect learning values for each defect type is determined as the final defect type.
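  • this weighted combination can be written compactly as a weighted sum followed by an argmax; the sketch below uses one weight per channel for simplicity (the text also describes weights per defect type), and the numeric values are invented solely to show the arithmetic.

```python
import numpy as np

DEFECT_TYPES = ["dent", "scratch", "hole", "NA"]

# Hypothetical learning values per channel (rows) and defect type (columns),
# and one optimized weight per channel; the numbers are made up for illustration.
learning_values = np.array([[0.10, 0.70, 0.15, 0.05],   # channel 1
                            [0.20, 0.55, 0.15, 0.10],   # channel 2
                            [0.05, 0.60, 0.25, 0.10]])  # channel 3
channel_weights = np.array([0.5, 0.3, 0.2])

# Multiply each channel's learning values by its weight and sum over channels.
combined = channel_weights @ learning_values

final_type = DEFECT_TYPES[int(np.argmax(combined))]
print(dict(zip(DEFECT_TYPES, combined.round(3))), "->", final_type)
```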
  • the defect determining unit 130 may be implemented by applying an ensemble model to the deep learning models.
  • the defect determination unit 130 uses information obtained by applying a data augmentation technique to the training images used for training each of the first deep learning model, the second deep learning model, and the third deep learning model.
  • in the data augmentation technique, the training image is augmented through at least two of rotation, enlargement, horizontal flipping, vertical flipping, horizontal translation, vertical translation, and combinations thereof.
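  • as one example of combining at least two of these operations, the sketch below builds an augmentation pipeline with torchvision; the probabilities and ranges are arbitrary placeholders that would be tuned for real panel images.

```python
from torchvision import transforms

# Rotation, enlargement, flips, and translations combined into one training-time
# augmentation pipeline; all ranges and probabilities are illustrative placeholders.
augment = transforms.Compose([
    transforms.RandomRotation(degrees=10),                       # rotation
    transforms.RandomHorizontalFlip(p=0.5),                      # horizontal flip
    transforms.RandomVerticalFlip(p=0.5),                        # vertical flip
    transforms.RandomAffine(degrees=0,
                            translate=(0.1, 0.1),                # horizontal/vertical movement
                            scale=(1.0, 1.2)),                   # enlargement
])

# augmented = augment(training_image)  # training_image: a PIL image or tensor ROI
```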
  • FIG. 4 is an example of a mura pattern image for explaining the determination of a mura defect pattern according to an automatic change of a setting value in the defect determination unit 130 .
  • FIG. 4(a) shows the original image.
  • FIGS. 4(b), 4(c), and 4(d) respectively illustrate the increase in sharpness of the mura defect pattern as the set value is changed.
  • FIG. 4(d) is an example in which the defect determination unit 130 identifies the clearest mura defect pattern according to the automatic change of the set value and determines the defect of the original image.
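  • the patent does not specify which setting is varied to obtain FIGS. 4(b) to 4(d); as one plausible illustration only, the sketch below sweeps an unsharp-masking strength over a mura image and keeps the version in which the pattern appears sharpest, scored here by Laplacian variance. This is an assumed stand-in, not the method of FIG. 4.

```python
import cv2
import numpy as np


def sharpest_mura_view(image: np.ndarray, amounts=(0.5, 1.0, 2.0, 4.0)) -> np.ndarray:
    """Sweep an unsharp-masking strength and return the enhanced image with the
    highest Laplacian variance, i.e. the view in which the mura pattern is clearest."""
    best, best_score = image, -1.0
    blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=5)
    for amount in amounts:                                # automatic change of the "set value"
        enhanced = cv2.addWeighted(image, 1.0 + amount, blurred, -amount, 0)
        score = cv2.Laplacian(enhanced, cv2.CV_64F).var()
        if score > best_score:
            best, best_score = enhanced, score
    return best
```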
  • the defect determination unit 130 continuously stores the images used for determination while automatically determining defect patterns, and can optimize the deep learning models by programmatically and automatically applying the stored images to the models and repeatedly performing deep learning training.
  • the defect determination unit 130 receives the inspector's final decision result information for the determined defect pattern, and can optimize the deep learning models by programmatically and automatically applying the input final decision result information to the models and repeatedly performing deep learning training.
  • the defect determination unit 130 determines the final defect type of the display panel by reflecting a plurality of weights for each channel set to a plurality of learning values for each channel.
  • the defect determination unit 130 determines the final defect type in a weighted combination decision method based on the set weight and the learning value for the plurality of defect types of each channel.
  • in the weighted combination method, the defect determination unit 130 initially assigns the same weight to the defect learning value of each channel, then calculates the probability that the final defect type of each channel is detected, and optimizes each weight for the plurality of defect types of each channel using the calculated probabilities.
  • various algorithms, such as an automatic tuning method (Grid Search), may be used to optimize these weights.
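  • a minimal grid-search sketch for such channel weights is shown below; it tries every weight combination on a small validation set and keeps the one with the highest accuracy. The array shapes and the accuracy criterion are assumptions for illustration, not details from the patent.

```python
import itertools

import numpy as np


def grid_search_channel_weights(val_values: np.ndarray,
                                val_labels: np.ndarray,
                                grid=np.linspace(0.0, 1.0, 11)):
    """Brute-force search for per-channel weights that maximize validation accuracy.

    val_values: array of shape (samples, channels, defect_types) with learning values.
    val_labels: array of shape (samples,) with the index of the true defect type.
    """
    n_channels = val_values.shape[1]
    best_weights, best_acc = None, -1.0
    for combo in itertools.product(grid, repeat=n_channels):
        weights = np.array(combo)
        if weights.sum() == 0:
            continue
        weights = weights / weights.sum()                 # normalize so weights sum to 1
        combined = np.einsum("c,scd->sd", weights, val_values)
        acc = float(np.mean(np.argmax(combined, axis=1) == val_labels))
        if acc > best_acc:
            best_weights, best_acc = weights, acc
    return best_weights, best_acc
```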
  • FIG. 5 is a conceptual diagram of the weighted combination decision method of the defect determination unit 130. Referring to FIG. 5, the learning values are multiplied by preset weights (w1, w2, w3, w4) and then combined, and among the combined defect learning values for the defect types, a defect type exceeding a predetermined threshold is determined as the final defect type of the corresponding inspection target panel.
  • FIG. 6 is a reference diagram illustrating defect types for each channel derived with optimized weights in the defect determination unit 130.
  • referring to FIG. 6, the accuracy of the learning values for the defect types is higher when the learning values are multiplied by the weights obtained by the grid search method and then averaged than when the simple average of the existing method is used.
  • the weights are optimized based on the ratio of the learning values for the defect types (dent, scratch, hole, NA) derived for each channel.
  • the defect determination unit 130 multiplies each learning value for the plurality of defect types of each channel by the corresponding weight for each defect type of each channel, and then sums the results to output a defect learning value for each defect type.
  • the user interface unit 140 displays information on the final defect type of the panel to be inspected on a display screen and provides an interface for receiving an operator's command.
  • the user interface unit 140 displays decision result information of the defect determination unit 130 on a display screen and provides a user interface for receiving an operator's command.
  • FIGS. 7A to 7C are reference diagrams illustrating information displayed by the user interface unit 140.
  • the user interface unit 140 provides a UI function so that experts can re-determine based on the results determined by the respective decision subjects (AOI, S-ADJ, and S-DRS).
  • the user interface unit 140 provides a function that can be used for training the decision subjects (AOI, S-ADJ, S-DRS). For example, the AOI decision result, the ADJ decision result, and the worker decision result are compared, and through a statistical graph of the decision results the weaknesses of each decision subject can be identified, enabling AOI/ADJ problem identification as well as decisions on worker retraining and the retraining itself.
  • the present invention can be applied to various playback devices by being implemented as a software program and recorded on a predetermined computer-readable recording medium.
  • Various playback devices may be PCs, laptops, portable terminals, and the like.
  • the recording medium may be a hard disk, flash memory, RAM, ROM, or the like built into each playback device, or an external medium such as an optical disk (e.g., CD-R or CD-RW), compact flash card, smart media, memory stick, or multimedia card.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Quality & Reliability (AREA)
  • Medical Informatics (AREA)
  • Image Analysis (AREA)

Abstract

According to the present invention, an apparatus for determining a defect in a panel to be inspected using a deep learning neural network model comprises: an image processing unit configured to extract image information for determining defects from an image of the panel to be inspected; a defect learning value extraction unit configured to classify each defect pattern using a deep learning neural network model on the image information and to output a learning value of each classified defect pattern; a defect determination unit configured to apply a weight for each type, optimized for the learning value of the defect pattern type, and to determine a final defect type of the panel to be inspected for each defect pattern; and a user interface unit that displays information on the final defect type of the panel to be inspected on a display screen and provides an interface for receiving commands input by an operator.
PCT/KR2021/006944 2021-06-03 2021-06-03 Apparatus for determining defects of a panel to be inspected using a deep learning neural network model WO2022255518A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0071941 2021-06-03
KR1020210071941A KR20220164099A (ko) 2021-06-03 2021-06-03 Apparatus for determining defects of a panel to be inspected using a deep learning neural network model

Publications (1)

Publication Number Publication Date
WO2022255518A1 true WO2022255518A1 (fr) 2022-12-08

Family

ID=84323694

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/006944 WO2022255518A1 (fr) 2021-06-03 2021-06-03 Dispositif de détermination d'un défaut dans un panneau devant être inspecté à l'aide d'un modèle de réseau neuronal d'apprentissage profond

Country Status (2)

Country Link
KR (1) KR20220164099A (fr)
WO (1) WO2022255518A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20240103084A 2022-12-26 2024-07-04 (주)제이에스 시스템 PCB defect detection system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190085159A * 2016-12-07 2019-07-17 케이엘에이-텐코 코포레이션 Data augmentation for convolutional neural network-based defect inspection
KR20190017121A * 2017-08-10 2019-02-20 울랄라랩 주식회사 Algorithm and method for detecting error data of a machine based on machine learning techniques
US20200090969A1 * 2018-09-19 2020-03-19 Kla-Tencor Corporation System and Method for Characterization of Buried Defects
KR20200071876A * 2018-12-06 2020-06-22 주식회사 씨에스앤씨 System for predicting product defects during a process and method for generating a learning model for defect prediction
KR20200092143A * 2019-01-24 2020-08-03 가천대학교 산학협력단 System and method for diagnosing display panel defects using a deep learning neural network

Also Published As

Publication number Publication date
KR20220164099A (ko) 2022-12-13

Similar Documents

Publication Publication Date Title
WO2019107614A1 Method and system for machine vision-based quality inspection using deep learning in a manufacturing process
WO2020141882A1 System and method for explainable artificial intelligence modeling and simulation
KR20200092143A System and method for diagnosing display panel defects using a deep learning neural network
CN111896549B Building crack monitoring system and method based on machine learning
KR102168724B1 Method and apparatus for determining abnormality using image inspection
CN111312023B Device and method for automatically drawing circuit diagrams for middle school physics circuit experiments
WO2022255518A1 Apparatus for determining defects of a panel to be inspected using a deep learning neural network model
CN111401437A Deep learning-based method for analyzing hidden-danger early-warning levels of power transmission channels
CN117974398A Construction safety supervision system based on multi-source data
WO2023085717A1 Clustering-based labeling device, anomaly detection device, and methods therefor
KR102174424B1 Server-based component inspection method and system and apparatus therefor
WO2022158628A1 System for determining defects in a display panel based on a machine learning model
CN112347889B Method and device for recognizing substation operation behaviors
CN115481941B Intelligent security management method and system combining multiple functional zones
WO2023113274A1 AI-based product surface inspection device and method
WO2022250190A1 System for determining defects of an image inspection object using a deep learning model
WO2023158068A1 Learning system and method for improving object detection rate
CN114782431B Printed circuit board defect detection model training method and defect detection method
CN117314829A Industrial part quality inspection method and system based on computer vision
KR100591853B1 LCD module inspection method and apparatus
JP2021148678A Defect detection device and defect detection method
WO2024136568A1 Object exterior inspection device and object exterior inspection method
CN112115870A YOLOv3-based method for recognizing cheat sheets in exams
WO2023282612A1 AI device for automatically reading test results of multiple diagnostic kits and method therefor
WO2023101374A1 Artificial intelligence-based non-destructive assembly testing method, device, and system for an object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21944273

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.04.2024)