WO2023101375A1 - Method, device, and system for selective artificial-intelligence-engine-based non-destructive object inspection - Google Patents


Info

Publication number
WO2023101375A1
WO2023101375A1 (PCT/KR2022/019128)
Authority
WO
WIPO (PCT)
Prior art keywords
learning
electronic device
learning model
defect inspection
artificial intelligence
Application number
PCT/KR2022/019128
Other languages
English (en)
Korean (ko)
Inventor
임태규
설재민
김승환
노은식
민병석
김형철
Original Assignee
(주)자비스
Application filed by (주)자비스
Publication of WO2023101375A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 23/00 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00
    • G01N 23/02 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00 by transmitting the radiation through the material
    • G01N 23/04 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00 by transmitting the radiation through the material and forming images of the material
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 23/00 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00
    • G01N 23/02 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00 by transmitting the radiation through the material
    • G01N 23/06 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N 3/00 – G01N 17/00, G01N 21/00 or G01N 22/00 by transmitting the radiation through the material and measuring the absorption
    • G01N 23/18 Investigating the presence of flaws, defects or foreign matter
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 2223/00 Investigating materials by wave or particle radiation
    • G01N 2223/40 Imaging
    • G01N 2223/401 Imaging image processing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 2223/00 Investigating materials by wave or particle radiation
    • G01N 2223/60 Specific applications or type of materials
    • G01N 2223/646 Specific applications or type of materials flaws, defects

Definitions

  • The present invention relates to non-destructive inspection of an object and, more particularly, to a method, apparatus, and system for non-destructive defect inspection of an object based on a selective artificial intelligence engine.
  • Defective products can degrade supply-chain services and cause losses in automated facilities; it is therefore very important to inspect products properly for defects.
  • Non-destructive testing, that is, inspection that does not destroy the object, using radiation and in particular X-rays, is used for such quality inspection.
  • Conventional radiographic non-destructive inspection applies a single technique to an X-ray image to determine whether an object is defective.
  • However, this conventional inspection method has limited defect-detection performance: not all defects can be detected from the X-ray image, that is, some defects go undetected.
  • In addition, because imaging characteristics differ with the composition of the object, inspection results, that is, accuracy, may vary even between objects of the same kind, reducing the reliability of the inspection.
  • An object of the present invention is to provide a method, apparatus, and system that select an optimal artificial-intelligence learning model for each inspection target, thereby increasing inspection reliability through improved non-destructive inspection accuracy while also increasing the efficiency of the inspection system.
  • An electronic device for performing non-destructive testing on an object based on a selective artificial intelligence engine, for solving the above problems, includes: a memory configured to store a plurality of learning variables for defect inspection and a plurality of learning models corresponding to the individual learning variables; and a processor configured to inspect the object for defects based on at least one learning model selected from among the plurality of stored learning models.
  • A non-destructive inspection method for an object based on an artificial intelligence engine in an electronic device includes: storing a plurality of learning variables for defect inspection and a plurality of learning models corresponding to the individual learning variables; receiving image data of the object; determining a category of the input object through the stored learning models and selecting at least one learning model from among the plurality of stored learning models based on the determined category; performing a defect inspection of the object based on the selected at least one learning model; and providing the defect-inspection result.
  • A selective artificial-intelligence-engine-based non-destructive inspection system for an object includes: an image acquisition device that acquires image data of the object by radiating radiation; and an electronic device, wherein the electronic device includes a memory configured to store a plurality of learning variables for defect inspection and a plurality of learning models corresponding to the individual learning variables, and a processor configured to perform a defect inspection of the object based on at least one learning model selected from among the plurality of stored learning models.
  • According to the present invention, it is possible to increase the efficiency of the inspection system while increasing the reliability of the inspection through improved accuracy of the non-destructive inspection of the object.
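The claimed flow — store models, receive image data, determine the object's category, select a stored learning model by category, inspect, and provide the result — can be sketched roughly as follows. This is a hypothetical illustration, not the patent's implementation; every function and variable name here (`inspect`, `classifier`, `models`, the toy model lambdas) is invented for the sketch.

```python
# Hypothetical sketch of the claimed inspection flow; the patent does not
# prescribe any particular data structures or APIs.

def inspect(image_data, models, classifier):
    """Select a learning model by the object's category and run inspection."""
    category = classifier(image_data)                 # determine category of the object
    model = models.get(category, models["default"])   # select a stored learning model
    defects = model(image_data)                       # perform the defect inspection
    return {"category": category, "defects": defects}

# Toy stand-ins for stored learning models and the category classifier.
models = {
    "pcb": lambda img: ["solder-void"] if "void" in img else [],
    "default": lambda img: [],
}
result = inspect("pcb-image-with-void", models, lambda img: "pcb")
```

A real system would replace the lambdas with trained networks and the string matching with actual image analysis; the point is only the category-driven model selection.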
  • FIG. 1 is a block diagram illustrating an artificial intelligence based non-destructive testing system for an object according to an embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a method for non-destructive testing of an object according to an embodiment of the present invention.
  • FIG. 3 is a configuration block diagram of an electronic device according to an embodiment of the present invention.
  • FIG. 4 is a configuration block diagram of a learning unit according to an embodiment of the present invention.
  • FIG. 5 is a configuration block diagram of a processing unit according to an embodiment of the present invention.
  • FIG. 6 is a configuration block diagram of an electronic device according to another embodiment of the present invention.
  • FIG. 7 is a configuration block diagram of a processing unit of FIG. 6 .
  • FIGS. 8 and 9 are diagrams for explaining results of non-destructive testing of an object according to the present invention.
  • FIG. 10 is a flowchart illustrating a non-destructive inspection method of an object according to another embodiment of the present invention.
  • Spatially relative terms such as "below", "beneath", "lower", "above", and "upper" may be used to easily describe the relationship of one component to other components. Spatially relative terms should be understood as encompassing different orientations of elements in use or operation in addition to the orientation shown in the drawings. For example, if a component shown in a drawing is flipped over, a component described as "below" or "beneath" another component would then be placed "above" the other component. Thus, the exemplary term "below" can encompass both the below and above directions. Components may also be oriented in other directions, and spatially relative terms may be interpreted according to the orientation.
  • 'image or image data' refers to still image or video data obtained through a tube or detector using radiation.
  • the image may be an X-ray image of an object through an X-ray tube or an X-ray detector.
  • The X-ray image may include, for example, a 2D (two-dimensional) image, a CT (Computed Tomography) image reconstructed from a continuous aggregation of 2D images, and a slice image of reconstructed CT volume data.
  • 'Defect' indicates a part that is not normal, that is, a part that cannot be defined as normal for an object, during a non-destructive inspection of an object based on artificial intelligence according to the present invention. It may be expressed by various names such as fault, flaw, or error; the present invention is not limited to such expressions, and the term may include the same or similar meaning as a defect in the conventional sense.
  • FIG. 1 is a block diagram illustrating an artificial intelligence based non-destructive testing system for an object according to an embodiment of the present invention.
  • A system for performing an artificial-intelligence-based non-destructive inspection of an object may include an electronic device 100 and an image acquisition device 150.
  • The configuration of the electronic device 100 and the image acquisition device 150 shown in FIG. 1 is an embodiment and is not limited thereto; one or more components may be added in relation to the operation according to the present invention, or vice versa.
  • The electronic device 100 may include a memory and a processor; the memory may correspond to or include the database 120 shown in FIG. 1, and the processor may include at least one of the controller 110 and the AI engine 130.
  • The AI engine 130 may include a deep learning network, but is not limited thereto.
  • The electronic device 100 may be connected to the image capture device 150 through a network to receive image data of the object.
  • The image acquisition device 150 may include a detector 160, an X-ray tube 170, and a light source (not shown), and the detector 160 may be at least one of a 2D detector and a 3D detector.
  • The detector 160 and the X-ray tube 170 are X-ray image capture devices for the object and may have a conventionally known configuration.
  • The image capture device 150 may further include a device capable of capturing motion of a moving object and a CT detector (not shown).
  • The light source may include, but is not limited to, a transmissive light source such as a terahertz source.
  • The controller 110 controls the operations performed by the electronic device 100.
  • The database 120 stores the image of the object received from the image capture device 150 and information about the object, as well as data received and processed by the electronic device 100, such as the training dataset used for defect inspection and the learning model corresponding to that dataset.
  • The control unit 110 classifies (or categorizes) the object, taking the image data of the object as input, through a learning model stored in the database 120; specifies a defect-inspection operation mode for the image data of the object to be inspected; selects at least one learning model corresponding to the specified operation mode; and performs defect inspection on the image data of the object based on the selected learning model. To this end, the control unit 110 may include a hardware unit capable of executing various machine-learning model algorithms and related applications.
  • The control unit 110 may include at least one of a central processing unit, a microprocessor, and a graphics processing unit.
  • The controller 110 may further include a separate memory (not shown) for storing machine-learning model algorithms or applications.
  • The electronic device 100 may obtain higher-quality or otherwise improved object image data by inputting the image data of the object for defect inspection to the AI engine 130.
  • The image acquired through the AI engine 130 generally represents image data that is higher in quality than, or otherwise improved over, the object image data input from the image capture device 150, or image data that is wholly or partly improved or newly generated from the viewpoint of defect inspection in the artificial-intelligence-based non-destructive testing of the object.
  • The electronic device 100 may further define a new learning model by comparing the inspection result of the image data of the object before learning with the inspection result of the image data of the object acquired through the AI engine 130.
  • Based on the learning model added in this way, the electronic device 100 may determine whether to use the AI engine 130 for the input object image, that is, whether to use the image data obtained through the AI engine 130 rather than the input image data for the defect inspection, and perform the corresponding operation.
  • An X-ray image of an object input to the electronic device 100 can be used for inspection in various fields, for example, semiconductor defect detection, PCB substrate defect detection, and foreign-material detection in the food and pharmaceutical fields.
  • FIG. 2 is a flowchart illustrating a method for non-destructive testing of an object according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of the electronic device 100 according to an embodiment of the present invention.
  • FIG. 4 is a configuration block diagram of the learning unit 320 according to an embodiment of the present invention.
  • FIG. 5 is a configuration block diagram of the processing unit 330 according to an embodiment of the present invention.
  • FIG. 6 is a block diagram of the electronic device 100 according to another embodiment of the present invention.
  • FIG. 7 is a block diagram of the processing unit 610 of FIG. 6.
  • FIGS. 8 and 9 are diagrams for explaining results of non-destructive testing of an object according to the present invention.
  • FIG. 10 is a flowchart illustrating a non-destructive inspection method of an object according to another embodiment of the present invention.
  • The operations of FIGS. 2 and 10 may be performed by the electronic device 100 of FIG. 1.
  • The operations of FIGS. 2 and 10 will be described with reference to the configuration of the electronic device 100 shown in FIGS. 3 to 7.
  • The electronic device 100 may store a plurality of learning variables and the corresponding learning models.
  • Here, 'learning variables' denote the training datasets (first and/or second training datasets) described later, and 'corresponding learning models' denote the learning models generated using those training datasets.
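The stored relation between learning variables (training datasets) and their corresponding learning models can be pictured as a simple keyed store. A minimal, hypothetical sketch (the `learning_store` structure and all identifiers are invented for illustration; the patent only requires that each training dataset map to the model generated from it):

```python
# Each learning variable (training dataset) is stored alongside the learning
# model generated from it, so the model can later be looked up per dataset.
learning_store = {}

def store_model(dataset_id, training_data, trained_model):
    """Persist a training dataset together with its corresponding model."""
    learning_store[dataset_id] = {"data": training_data, "model": trained_model}

# Toy entries standing in for real datasets and trained models.
store_model("Prod #1", ["img_a", "img_b"], "model_prod1")
store_model("Prod #2", ["img_c"], "model_prod2")
```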
  • The input unit 310 receives image data of the object acquired through radiation (e.g., X-rays) from the image capture device 150.
  • The order of operations 11 and 12 may be defined differently from that shown in FIG. 2; this applies not only to operations 11 and 12 but also to the sequence of the other operations shown in FIG. 2 (and in FIG. 10, described later).
  • The image data of the object input to the input unit 310 may be raw data received from the image capture device 150, or data at least partially processed through the AI engine 130 described above so as to be suitable for defect inspection.
  • In the latter case, the improved image data obtained through the AI engine 130 may be used as base data for the defect inspection.
  • The learning unit 320 takes a training dataset as input, generates a learning model corresponding to the input training dataset through the preprocessing module 410, the feature extraction module 420, and the inspection processing module 430, and may temporarily store the model in the memory.
  • The processing unit 330 determines the category of the object through a learning model based on the input image data of the object, specifies a defect-inspection operation mode for the object based on the determined category, and selects a learning model corresponding to the specified operation mode.
  • The processing unit 330 then inspects the object image data for defects based on the selected learning model and generates defect-inspection result data.
  • The output unit 340 provides the defect-inspection result data generated by the processing unit 330 for the object.
  • Here, 'providing' may cover various operations related to outputting the results for the object, such as direct or indirect output through a display, transmission to a target terminal, and/or output control.
  • The artificial-intelligence-based non-destructive inspection method for an object according to the present invention is described in more detail as follows.
  • The electronic device 100 learns a model for defect inspection based on image data of the inspection target and receives image data of the object; these correspond to operations 11 and 12 of FIG. 2, respectively.
  • The electronic device 100 then determines a defect-inspection operation mode for the object.
  • The electronic device 100 selects a learning model corresponding to the determined operation mode through an operation-mode selection module and processes the image data of the object, that is, performs the defect inspection.
  • Details of determining the defect-inspection operation mode (first mode, second mode) and of processing the image data of the object according to the determined mode are described later with reference to FIGS. 4 and 5. To aid understanding of the technical idea of the present invention, only the first and second operation modes are defined and described herein as defect-inspection operation modes, but the number and definition of the operation modes may be implemented differently according to settings, requests, and the like.
  • The electronic device 100 may include an input unit 310, a defect inspection unit, and an output unit 340.
  • The defect inspection unit may include a learning unit 320 and a processing unit 330.
  • The learning unit 320 constituting the defect inspection unit may include: a preprocessing module 410 that receives, from memory, training data (or datasets) for learning the non-destructive defect inspection of an object and preprocesses it; a feature extraction module 420 that extracts features from the preprocessed data; and an inspection processing module 430 that generates a learning model to be used for defect inspection based on the extracted features.
  • The learning unit 320 may use one AI engine for learning the defect inspection, but is not limited thereto.
  • The learning unit 320 may instead be implemented as a plurality of learning modules, each using a different AI engine for learning the defect inspection.
  • Each individual learning module may include at least one of the preprocessing module, the feature extraction module, and the inspection processing module of the learning unit 320 described above.
  • The training datasets, that is, Prod #1, Prod #2, ..., Prod #N (where N is a positive integer), may be defined as first training datasets for predefined individual inspection targets.
  • The learning models Model #1, Model #2, ..., Model #N may be defined as the learning models generated corresponding to the first training datasets, that is, first learning models.
  • The learning unit 320 may generate a plurality of corresponding individual first learning models using a plurality of learning variables, that is, the individual first training datasets.
  • A first learning model generated in this way may be regarded as a learning model specialized for an individual inspection target. That is, the learning unit 320 may generate N first learning models corresponding to N first training datasets.
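The one-model-per-dataset relationship above (N first training datasets yielding N specialized first learning models) can be sketched as follows; `train` is a stub standing in for actual model training, and all names are illustrative:

```python
def train(dataset):
    # Stand-in for training a model specialized to one product's dataset.
    return f"model_for_{dataset}"

# N first training datasets -> N specialized first learning models.
first_datasets = ["Prod #1", "Prod #2", "Prod #3"]
first_models = {d: train(d) for d in first_datasets}
```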
  • A second training dataset may be defined that includes all first training datasets or a combination of at least two individual first training datasets.
  • This second training dataset may be named a full training dataset, a combined training dataset, or a united training dataset.
  • The learning unit 320 may generate a second learning model using the second training dataset.
  • The second learning model differs from the aforementioned first learning models, and the number of second learning models may be determined by the number of defined second training datasets.
  • The individual training datasets used for the second training dataset may be the same as or different from the first training datasets described above.
  • The second training dataset may also include one or more training datasets not classified as first training datasets; from this point of view, a first training dataset may be viewed as a classified training dataset serving as the basis for generating a first learning model.
  • The second training dataset may be defined as a plurality of second training datasets according to the configuration method, and a corresponding plurality of learning models may likewise be provided.
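The two constructions described above — a "full" second dataset covering everything (optionally including unclassified data) and a "combined" second dataset built from a chosen subset of first datasets — can be sketched as list unions. This is a hedged illustration with invented sample names, not the patent's prescribed construction:

```python
# First training datasets, keyed by inspection target.
first_datasets = {
    "Prod #1": ["a1", "a2"],
    "Prod #2": ["b1"],
}
unclassified = ["u1", "u2"]  # data not classified into any first dataset

# Full second dataset: every first dataset plus unclassified samples.
second_full = [x for samples in first_datasets.values() for x in samples] + unclassified

# Combined second dataset: a chosen combination of first datasets.
def combine(names):
    return [x for n in names for x in first_datasets[n]]

second_combined = combine(["Prod #1", "Prod #2"])
```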
  • The generated first and/or second learning models may have different values of parameters, weights, and the like according to the objects to be inspected.
  • Whether the object is specified may mean, for example, that a category or classification has been determined for the object.
  • Whether the object is specified may be determined not by the electronic device 100 but by an external input or request, for example, from the requester of the defect inspection or a terminal. Accordingly, whether the object is specified may be determined by whether a learning model usable for the defect inspection can be specified for the object to be inspected.
  • In this case, the electronic device 100 may specify and select a learning model based on the received external input.
  • When the object is specified, the electronic device 100 selects and uses the specific first learning model corresponding to the specified object; otherwise, it inspects the object for defects using a second learning model. When there are a plurality of second learning models, the electronic device 100 determines which second learning model(s) to use, or whether to use all of them, and performs the defect inspection accordingly.
  • In this case, the electronic device 100 may control selection of a second learning model, rather than another first learning model, from among the plurality of stored learning models.
  • Alternatively, the electronic device 100 may perform the defect inspection on the object based on at least one of a first learning model and a second learning model set as defaults.
  • The electronic device 100 may determine, based on the defect-inspection result obtained with a specific default learning model, whether to define an additional learning model or to re-examine the object, and may perform the corresponding operation. For example, the electronic device 100 performs a defect inspection on the object based on a specific first learning model set as default, and if the result is equal to or less than a predefined reference value, it may re-run the defect inspection based on the second learning model. According to an embodiment, the electronic device 100 may select at least two specific first learning models as defaults, perform the defect inspection on the object with them, and, based on the results, perform the defect inspection based on the second learning model.
  • The selected specific first learning model may be determined based on information about the object subject to defect inspection, the history of previous defect inspections, the settings of the electronic device 100, information input by the inspection requester or a terminal, and the like. The same or similar information may also be used to specify the object.
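The selection-and-fallback behavior just described — use the object-specific first model when the object is specified, otherwise a second model, and re-run with the second model when a default first model's result falls at or below a reference value — might be sketched as below. The score values, threshold, and all names are invented for illustration:

```python
def select_and_inspect(obj, first_models, second_model, reference=0.5):
    """Pick a first model if the object is specified, else a second model;
    fall back to the second model when the first model's score is too low."""
    model = first_models.get(obj["category"]) if obj.get("category") else None
    if model is None:
        model = second_model  # object not specified: use a second model
    score, defects = model(obj["image"])
    if model is not second_model and score <= reference:
        # Result at or below the reference value: re-run with the second model.
        score, defects = second_model(obj["image"])
    return defects

# Toy models returning (confidence score, detected defects).
first_models = {"pcb": lambda img: (0.3, ["weak-detection"])}  # low-confidence
second_model = lambda img: (0.9, ["void", "crack"])            # combined model
defects = select_and_inspect({"category": "pcb", "image": "x"}, first_models, second_model)
```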
  • The operation of the learning unit 320 shown in FIG. 4 may be performed i) when image data of an object to be inspected is input through the input unit 310 shown in FIG. 3, ii) at a regular or irregular time before i), or iii) regularly or irregularly regardless of i).
  • The processing unit 330 may include a model selection module 510, a preprocessing module 520, a feature extraction module 530, and an inspection processing module 540.
  • The preprocessing module 520, the feature extraction module 530, and the inspection processing module 540 may be components separate from the corresponding components shown in FIG. 4; that is, a preprocessing module, a feature extraction module, and an inspection processing module may be implemented individually in the learning unit 320 of FIG. 4 and the processing unit 330 of FIG. 5.
  • Alternatively, as shown in FIG. 7, the preprocessing module may be implemented in a form shared by the learning unit 320 of FIG. 4 and the processing unit 330 of FIG. 5.
  • The model selection module 510 determines a defect-inspection operation mode for the object and selects at least one of the learning models generated in FIG. 4 based on the training datasets, corresponding to the determined mode. According to an embodiment, the model selection module 510 may simply select at least one of those learning models according to a defect-inspection operation mode already determined by the components in the control unit 110 of FIG. 1 or the learning unit 320 of FIG. 4.
  • The processing unit 330 preprocesses, in the preprocessing module 520, the image data of the object input through the input unit 310; inspects the preprocessed image data for defects through the feature extraction module 530 and the inspection processing module 540, based on the learning model selected by the model selection module 510 as corresponding to the defect-inspection operation mode; generates defect-inspection result data based on the inspection results; and transmits it to the output unit 340. Thereafter, the output unit 340 provides the defect-inspection result data of the object.
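The processing-unit pipeline above (preprocess, then feature extraction, then inspection processing, then result data) can be sketched as a simple composition of stages. The module internals here are placeholder stubs, not the patent's actual image-processing logic:

```python
def preprocess(img):
    # Preprocessing module stand-in (e.g., normalization) - stubbed.
    return img.lower()

def extract_features(img):
    # Feature extraction module stand-in - stubbed.
    return set(img.split("-"))

def inspect_features(feats):
    # Inspection processing module stand-in - stubbed.
    known_defects = {"void", "crack"}
    return sorted(feats & known_defects)

def run_pipeline(raw_image):
    """Produce defect-inspection result data from raw object image data."""
    return inspect_features(extract_features(preprocess(raw_image)))

result = run_pipeline("PCB-VOID-edge")
```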
  • A first training dataset may be generated based on previously classified data for inspection of an object, and a second training dataset may be generated based on, or so as to include, unclassified unknown data.
  • Referring to FIGS. 6 and 7, the defect inspection unit 610 of the electronic device 100 may have a configuration different from that of the defect inspection unit shown in FIG. 3. That is, in FIG. 3, the defect inspection unit has the learning unit 320 and the processing unit 330 as separate components, each individually including a preprocessing module, a feature extraction module, and an inspection processing module, whereas in FIGS. 6 and 7 the preprocessing module 720, the feature extraction module 740, and the inspection processing module 750 are shared within the defect inspection unit 610 to perform the functions of both the learning unit and the processing unit of FIG. 3. The defect inspection unit 610 of FIGS. 6 and 7 likewise includes a model selection module 710, as in FIG. 5.
  • The defect-detection result data may be provided, for example, in a manner that displays the detected defective parts on the image data of the object, as shown in FIG. 8 or FIGS. 9(c) to 9(d).
  • However, the present invention is not limited to this provision method. For example, text, audio, or images (graphs) related to the type, degree, number, ratio, and the like of the defects detected in the above manner may additionally be provided, together or individually.
  • FIGS. 8(a) to 8(b) show defect-inspection results from image data of an object using the second defect-inspection operation mode described above, that is, the second learning model described in FIGS. 4 to 5.
  • FIGS. 8(c) to 8(d) show defect-inspection results from the image data of the object using the first defect-inspection operation mode described above, that is, the first learning model specialized for the object described in FIGS. 4 to 5.
  • FIGS. 9(a) to 9(b) show the results of inspecting defects in image data of an object using only one pre-generated learning model.
  • FIGS. 9(c) to 9(d) show the results of inspecting defects in image data of an object by selecting a learning model from among a plurality of learning models according to the present invention described above.
  • The rectangular parts 811 to 844 in the image data shown in FIGS. 8(a) to 8(d), and 911 to 941 in the image data shown in FIGS. 9(a) to 9(d), denote detected defect portions.
  • Depending on the context, the processor may represent all, or at least one, of the input unit, learning unit, processing unit, and output unit shown in FIGS. 3 to 7.
  • The steps of a method or algorithm described in connection with an embodiment of the present invention may be implemented directly in hardware, in a software module executed by hardware, or by a combination thereof.
  • A software module may reside in RAM (random access memory), ROM (read-only memory), EPROM (erasable programmable ROM), EEPROM (electrically erasable programmable ROM), flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium well known in the art to which the present invention pertains.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are a method, device, and system for selective non-destructive inspection of an object based on an artificial intelligence engine. An electronic device for performing the selective artificial-intelligence-engine-based non-destructive inspection of an object comprises: a memory that stores a plurality of learning variables for defect inspection and a plurality of learning models corresponding to the respective learning variables; and a processor that inspects the object for defects, the processor comprising a central unit that performs defect inspection of the object on the basis of at least one learning model selected from among the stored plurality of learning models.
PCT/KR2022/019128 2021-11-30 2022-11-30 Method, device, and system for selective non-destructive inspection of an object based on an artificial intelligence engine WO2023101375A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0169106 2021-11-30
KR1020210169106A KR102602559B1 (ko) 2021-11-30 2021-11-30 Method, device, and system for selective artificial-intelligence-engine-based non-destructive inspection of an object

Publications (1)

Publication Number Publication Date
WO2023101375A1 true WO2023101375A1 (fr) 2023-06-08

Family

ID=86612612

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/019128 WO2023101375A1 (fr) 2021-11-30 2022-11-30 Procédé, dispositif et système pour une inspection facultative non destructive d'objet basée sur un moteur d'intelligence artificielle

Country Status (2)

Country Link
KR (2) KR102602559B1 (fr)
WO (1) WO2023101375A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018147240A (ja) * 2017-03-06 2018-09-20 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image processing method, and image processing program
KR101940029B1 (ko) * 2018-07-11 2019-01-18 MakinaRocks Co., Ltd. Anomaly detection
JP2021039022A (ja) * 2019-09-04 2021-03-11 Shin-Etsu Chemical Co., Ltd. Defect classification method and defect classification system, and photomask blank screening method and manufacturing method
KR20210042267A (ko) * 2018-05-27 2021-04-19 Elucid Bioimaging Inc. Method and system for utilizing quantitative imaging
KR102316286B1 (ko) * 2021-02-18 2021-10-22 임계현 Method for analyzing hair condition using artificial intelligence, and computing device for performing same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102249836B1 2019-08-26 2021-05-10 레이디소프트 주식회사 Method for providing a transmission-image-based non-destructive inspection function, and computer-readable storage medium


Also Published As

Publication number Publication date
KR20230160754A (ko) 2023-11-24
KR102602559B9 (ko) 2024-03-13
KR102602559B1 (ko) 2023-11-16
KR20230081911A (ko) 2023-06-08

Similar Documents

Publication Publication Date Title
CN110390351A (zh) Deep-learning-based three-dimensional automatic localization system for epileptogenic foci
CN111815564B (zh) Method and device for detecting yarn bobbins, and yarn bobbin sorting system
WO2016204402A1 (fr) Component defect inspection method and apparatus therefor
WO2021137454A1 (fr) Artificial-intelligence-based method and system for analyzing user medical information
CN112534243A (zh) Inspection device and method
CN114170478A (zh) Defect detection and localization method and system based on cross-image local feature alignment
WO2022197044A1 (fr) Method for diagnosing bladder lesions using a neural network, and system therefor
CN112766251B (zh) Infrared detection method and system for substation equipment, storage medium, and computer device
WO2023101375A1 (fr) Method, device, and system for selective non-destructive inspection of an object based on an artificial intelligence engine
Ettalibi et al. AI and Computer Vision-based Real-time Quality Control: A Review of Industrial Applications
US20080094469A1 (en) Circuitry testing method and circuitry testing device
JP2023145412A (ja) Defect detection method and system
WO2023101374A1 (fr) Artificial-intelligence-based ensemble non-destructive testing method, device, and system for an object
JP2021042955A (ja) Food inspection device, food inspection method, and method for training the food reconstruction neural network of a food inspection device
CN115222649A (zh) 用于对热图的图案进行检测和分类的系统、设备和方法
KR20220111214A (ko) Artificial-intelligence-based product defect inspection method, device, and computer program
WO2022108250A1 (fr) Deep-learning-based method, apparatus, and program for generating a high-quality radiographic image
KR102585028B1 (ko) Three-dimensional-processing-based non-destructive object inspection method, device, and system
Acri et al. A novel phantom and a dedicated developed software for image quality controls in x-ray intraoral devices
KR20000050719A (ko) Automatic inspection system for connection cables
CN110288573A (zh) Automatic disease detection method for mammalian livestock
JP3097153B2 (ja) Grading device for fruits, vegetables, and the like
WO2022108249A1 (fr) Method, apparatus, and program for generating training data, and foreign-substance detection method using same
WO2023282611A1 (fr) Device for training an AI model for reading diagnostic kit test results, and operation method thereof
US20220172453A1 (en) Information processing system for determining inspection settings for object based on identification information thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22901733

Country of ref document: EP

Kind code of ref document: A1