WO2023112259A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2023112259A1
WO2023112259A1 (PCT/JP2021/046526)
Authority
WO
WIPO (PCT)
Prior art keywords
area
camera
image
predetermined
cpu
Prior art date
Application number
PCT/JP2021/046526
Other languages
English (en)
Japanese (ja)
Inventor
康樹 野上
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji filed Critical 株式会社Fuji
Priority to PCT/JP2021/046526 priority Critical patent/WO2023112259A1/fr
Priority to JP2023567437A priority patent/JPWO2023112259A1/ja
Publication of WO2023112259A1 publication Critical patent/WO2023112259A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages

Definitions

  • This specification relates to technology for processing images from cameras used in the mounting process of mounting components on boards.
  • Patent Document 1 discloses a control device that processes an image obtained by a camera used in a component mounter.
  • The control device identifies, within the entire area of the image, a region in which normal imaging cannot be performed as a non-imaging area, and cancels the holding of parts by those nozzles, among the plurality of nozzles, that are scheduled to appear in the identified non-imaging area. Even if a problem occurs in the camera, the other nozzles are used to continue the mounting process of mounting components on the board.
  • In the above patent document, a rectangular area that circumscribes, in the vertical and horizontal directions, the dropped-part area generated by a part dropping onto the camera is defined as the non-imaging area. For this reason, the area of the non-imaging area is larger than the area of the dropped-part area, and the camera malfunction cannot be evaluated correctly. Further, the above patent document describes a technique for continuing the mounting process even if a problem occurs in the camera; it is not a technique for evaluating the camera problem and notifying the user of it. Techniques are provided herein for evaluating camera defects and for notifying users of such defects.
  • An image processing apparatus disclosed in this specification comprises: a processing execution unit that executes an inspection process for inspecting the state of a camera used in a mounting process for mounting a component on a board, the inspection process including a process for causing the camera to image a predetermined reference plate different from the component and the board; and a notification unit that executes a predetermined notification operation when the image captured by the camera in the inspection process includes a non-reference area having at least a predetermined area, the non-reference area being a blob area in which a plurality of connected pixels each have a pixel value whose difference from a pixel value indicating the reference plate in the image is at least a predetermined value.
  • Camera malfunctions include, for example, dust adhering to the camera lens and failure of a camera part. Such a defect may be imaged in a color different from that of the object being imaged, or the luminance (as a pixel value) may differ between the defect portion and the imaged object. In addition, if the area of the region corresponding to the defect in the image is relatively large, imaging of the target may not be performed normally and the mounting process may fail.
  • The non-reference region is a region having a color different from that of the reference plate and an area at least equal to the predetermined area. If a non-reference area exists in the image, the camera can be evaluated as malfunctioning. With the above configuration, the notification operation is performed whenever that evaluation is reached, so the user can be notified of camera malfunctions.
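The detection described above — finding connected groups of pixels that differ from the reference plate and comparing their area against a threshold — can be sketched as follows. This is an illustrative reading only, not the patented implementation; the grayscale values, the 4-connectivity choice, and the function names are assumptions.

```python
from collections import deque

def find_blob_areas(image, reference_value=240, diff_threshold=20):
    """Return the sizes of blob areas: connected groups of pixels whose
    value differs from the assumed reference-plate value by at least
    diff_threshold. The image is a 2-D list of grayscale values."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or abs(image[r][c] - reference_value) < diff_threshold:
                continue
            # Breadth-first search over 4-connected "non-reference" pixels.
            size, queue = 0, deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                size += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols and not seen[ny][nx]
                            and abs(image[ny][nx] - reference_value) >= diff_threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            blobs.append(size)
    return blobs

def camera_needs_cleaning(image, min_area=3):
    """Evaluate the camera as malfunctioning if any blob reaches min_area pixels."""
    return any(size >= min_area for size in find_blob_areas(image))
```

For example, a mostly white image containing a 4-pixel dark patch would yield one blob of size 4 and trigger the notification, while a uniformly white image would not.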
  • FIG. 1 shows a block diagram of a mounting device.
  • FIG. 3 shows a flowchart of the first camera inspection process according to the first embodiment.
  • FIG. 4 shows a flowchart of the second camera inspection process according to the first embodiment.
  • FIG. 5 shows a flowchart of the first camera inspection process according to the second embodiment.
  • the notification operation may not be executed when the non-reference area does not exist in the image.
  • the reference plate may be a white plate.
  • the predetermined area may be equal to or less than the area of the smallest part.
  • Images of dust adhering to camera lenses and images showing malfunctions of camera parts are usually smaller than images showing parts. According to the above configuration, it is possible to appropriately determine a blob area that is equal to or smaller than the area of the smallest component as a non-reference area.
  • the camera can capture an image of a nozzle that picks up the component and conveys it to the board, and in the inspection process, the reference plate may be picked up by the nozzle.
  • By executing the notification operation, the user can be prompted to deal with dust or a part failure at the peripheral edge of the lens.
  • the camera may capture an image of the substrate supported on a predetermined support, and the reference plate may be supported by the support during the inspection process.
  • the mounting apparatus 10 is an apparatus capable of executing a mounting process for mounting components on a board.
  • a component is an electronic component such as a resistor.
  • the mounting apparatus 10 includes a moving device 12 , a nozzle 14 , a display 16 , a bottom parts camera 20 , a side parts camera 22 , a mark camera 24 , a control section 30 , a support base 50 and a parts feeder 60 .
  • the nozzle 14 , side parts camera 22 and mark camera 24 are attached to the moving device 12 .
  • the control unit 30 has a CPU 32 and a memory 34 .
  • the CPU 32 can execute various processes according to programs 40 stored in the memory 34 .
  • The CPU 32 can, for example, analyze the images captured by each of the cameras 20, 22, and 24.
  • the support base 50 is a base that supports a substrate that is transported by a transport device (for example, a conveyor, not shown).
  • the component feeder 60 is a device that supplies components into the mounting apparatus 10 .
  • the nozzle 14 is a device that picks up the parts supplied by the parts feeder 60 .
  • the display 16 is a device capable of displaying various information.
  • The moving device 12 is a device that can move horizontally (in the left-right direction of the page and the direction perpendicular to the page).
  • the moving device 12 moves above the component feeder 60, for example.
  • the nozzle 14 of the moving device 12 picks up the component supplied by the component feeder 60 .
  • the moving device 12 moves from the upper side of the component feeder 60 to the upper side of the support base 50 after the nozzle 14 picks up the component.
  • The nozzle 14 mounts the component it has picked up onto the board supported by the support base 50. The mounting process is thereby implemented.
  • the bottom parts camera 20 is arranged between the parts feeder 60 and the support base 50 .
  • The bottom parts camera 20 can image, from below, the nozzle 14 moving between the parts feeder 60 and the support base 50. By analyzing the image captured by the bottom parts camera 20, the position, relative to the nozzle 14, of the part picked up by the nozzle 14 can be obtained. Based on this position, whether or not the component is correctly held by the nozzle 14 is inspected.
  • the side parts camera 22 is arranged adjacent to the nozzle 14 .
  • The side parts camera 22 can image, from the side, the part picked up by the nozzle 14.
  • By analyzing the image captured by the side parts camera 22, the orientation of the part picked up by the nozzle 14 can be obtained. Based on this orientation, whether or not the component is correctly held by the nozzle 14 is inspected.
  • the mark camera 24 is arranged adjacent to the nozzle 14 .
  • the mark camera 24 can image the substrate supported by the support table 50 from above.
  • By analyzing the image captured by the mark camera 24, the appearance of the board on which the components are mounted is inspected.
  • the board identification number can be obtained from the mark (for example, bar code) printed on the board.
  • the images captured by each camera 20-24 are monochrome. Note that in a modified example, the images captured by each of the cameras 20-24 may be in color.
  • Dirt such as dust may adhere to the lens of the camera such as the bottom part camera 20 .
  • Dust adhering to the lens can usually be removed by a blower (not shown) of the mounting apparatus 10.
  • However, some dust adhering to the lens cannot be completely removed by the blower. Dust that the blower cannot remove is removed manually by the user of the mounting apparatus 10, for example.
  • To prompt the user to remove dust adhering to the lens, the CPU 32 executes the first camera inspection process (see FIG. 3) and the second camera inspection process (see FIG. 4) in accordance with the program 40.
  • the first camera inspection process is a process for inspecting whether or not dust adheres to the bottom parts camera 20 .
  • the first camera inspection process is performed before starting the mounting process.
  • the first camera inspection process is triggered by an instruction from the user.
  • the first camera inspection process may be performed during the mounting process.
  • the first camera inspection process may be automatically executed without an instruction from the user.
  • the CPU 32 causes the nozzle 14 to pick up the reference plate 100 (see FIG. 1).
  • the reference plate 100 is a white plate.
  • The reference plate 100 is, for example, prepared in advance at a predetermined position (for example, next to the component feeder 60) inside the mounting apparatus 10.
  • The CPU 32 moves the moving device 12 above the predetermined position and causes the nozzle 14 to pick up the reference plate 100 arranged there.
  • The CPU 32 moves the moving device 12 above the bottom parts camera 20 and causes the bottom parts camera 20 to image the reference plate 100 held by the nozzle 14. The CPU 32 thereby acquires an image showing the reference plate 100 (hereinafter referred to as the "first processed image") from the bottom parts camera 20.
  • a blob region composed of pixels having pixel values different from those representing the reference plate 100 indicates dust adhering to the lens of the bottom part camera 20 .
  • When no dust adheres to the lens, no blob area is included in the first processed image.
  • A blob area is an area in which a plurality of connected pixels have pixel values indicating a color (or brightness) different from the pixel values indicating the white of the reference plate 100 (for example, "225 to 255").
  • The CPU 32 determines, for each of the plurality of pixels forming the first processed image, whether or not its pixel value is smaller than a predetermined threshold value (e.g., "220").
  • a pixel value smaller than a predetermined threshold means that the color of the pixel indicated by the pixel value is different from the white color of the reference plate 100 .
  • The CPU 32 uses the result of the determination in S14 (see the graph at the bottom of the page of FIG. 3) to determine whether or not the first processed image includes one or more blob areas.
  • Whether or not two pixels having pixel values indicating a different color are connected is determined, for example, by using the numbers that identify the pixels (for example, the numbers indicated on the horizontal axis of the graph at the bottom of the page). If the CPU 32 determines that one or more blob areas are included in the first processed image (YES in S16), the process proceeds to S18 and the subsequent steps. On the other hand, when the CPU 32 determines that the first processed image includes no blob area (NO in S16), the CPU 32 skips the processes after S18 and ends the process of FIG. 3.
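The connectivity test over pixel identification numbers described above can be illustrated with a short sketch: pixels whose numbers are consecutive (as on the graph's horizontal axis) are treated as connected. Treating the image as a single numbered sequence is a simplifying assumption for illustration, as is the function name.

```python
def group_into_blobs(dark_pixel_numbers):
    """Group the identification numbers of below-threshold pixels into blobs;
    consecutive numbers are treated as connected pixels."""
    blobs = []
    for n in sorted(dark_pixel_numbers):
        if blobs and n == blobs[-1][-1] + 1:
            blobs[-1].append(n)  # adjacent to the previous pixel: same blob
        else:
            blobs.append([n])    # gap in the numbering: a new blob starts
    return blobs
```

For instance, below-threshold pixels numbered 3, 4, 5, 10, and 11 would form two blobs, one of three pixels and one of two.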
  • the CPU 32 determines whether or not the area of at least one blob area among the one or more blob areas is larger than a predetermined value.
  • the predetermined value is set to an area equal to or smaller than the smallest component among various types of components supplied by the component feeder 60 .
  • An image of dust on a camera lens is usually smaller than an image showing a part. With such a configuration, a blob region whose area is equal to or smaller than that of the smallest component can appropriately be determined to be a region indicating dust adhering to the lens.
  • the predetermined value may be set independently of the sizes of various types of parts supplied by the parts feeder 60 .
  • When the CPU 32 determines that the area of at least one blob area is larger than the predetermined value (YES in S18), the process proceeds to S20.
  • the CPU 32 causes the display 16 to display an instruction to remove dust adhering to the lens of the bottom parts camera 20 .
  • When the CPU 32 determines that the areas of all of the one or more blob areas are equal to or less than the predetermined value (NO in S18), the CPU 32 proceeds to S22.
  • a blob area having an area smaller than a predetermined value is highly likely to be, for example, noise from the bottom part camera 20 rather than dust adhering to the lens.
  • In S22, the CPU 32 corrects the plurality of pixel values of the pixels forming the blob region in order to remove the noise from the bottom parts camera 20. For example, the CPU 32 corrects the plurality of pixel values using their average value.
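Read literally, the correction in S22 replaces the pixel values of a small, noise-like blob with their average. A minimal sketch of that literal reading follows; the flat pixel-list representation and the function name are assumptions.

```python
def correct_noise_blob(pixels, blob_indices):
    """Replace each pixel of a noise-like blob with the average of the blob's
    current pixel values (rounded), as one literal reading of S22."""
    values = [pixels[i] for i in blob_indices]
    average = round(sum(values) / len(values))
    for i in blob_indices:
        pixels[i] = average  # smooth the blob to a single uniform value
    return pixels
```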
  • the second camera inspection process is a process for inspecting whether or not dust adheres to the mark camera 24 .
  • the trigger for the second camera inspection process is similar to the trigger for the first camera inspection process.
  • The second camera inspection process is the same as the first camera inspection process except that the processes of S100 and S102 are executed instead of S10 and S12 of FIG. 3.
  • The CPU 32 causes the transport device to transport the reference plate 200, which is a white plate like the reference plate 100. The reference plate 200 is thereby supported by the support base 50.
  • the reference plate 200 is, for example, put into the transport device by a user.
  • the CPU 32 moves the moving device 12 to the upper side of the support base 50, and causes the mark camera 24 to image the reference plate 200 supported by the support base 50. As a result, the CPU 32 acquires an image representing the reference plate 200 from the mark camera 24 (hereinafter referred to as a “second processed image”).
  • the CPU 32 executes the processes of S14 to S22.
  • The second processed image is used in S14 to S22 of the second camera inspection process. For example, when the CPU 32 determines that the area of at least one blob area included in the second processed image is larger than the predetermined value (YES in S18), the CPU 32 causes the display 16 to display an instruction to remove dust adhering to the lens of the mark camera 24 (S20). Accordingly, the presence of dust on the lens of the mark camera 24 can be evaluated and reported to the user.
  • the control unit 30 of the mounting apparatus 10 is an example of the "image processing apparatus".
  • In the first camera inspection process of FIG. 3, the bottom parts camera 20, the reference plate 100, and the predetermined value in S18 are examples of the "camera", the "reference plate", and the "predetermined area", respectively.
  • In the second camera inspection process of FIG. 4, the mark camera 24, the reference plate 200, and the predetermined value in S18 are examples of the "camera", the "reference plate", and the "predetermined area", respectively.
  • When one or more blob areas are included in the first processed image (YES in S16), the process proceeds to S200.
  • the CPU 32 selects one blob area (hereinafter referred to as "target blob area”) from one or more blob areas.
  • the CPU 32 determines whether or not the target blob area is included in the central area including the center of the lens of the bottom parts camera 20 in the first processed image.
  • The position of the target blob area is determined, for example, by using the numbers that identify the pixels forming the target blob area (for example, the numbers indicated on the horizontal axis of the graph at the bottom of the page of FIG. 3).
  • When the CPU 32 determines that the target blob area is included in the central area (YES in S202), the process proceeds to S204. Whether each pixel in the first processed image belongs to the central region or the peripheral region is predetermined and stored in the memory 34.
  • When a blob area straddles both regions, the region containing the larger portion of the blob area (or the larger number of its pixels) may be determined to be the region that includes that blob area.
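The majority rule just described can be sketched as follows. The coordinate representation, the tie handling (here a tie goes to the peripheral region), and the names are assumptions for illustration.

```python
def classify_blob_region(blob_pixels, central_pixels):
    """Assign a blob to the region (central or peripheral) that contains
    more of its pixels."""
    in_center = sum(1 for p in blob_pixels if p in central_pixels)
    in_periphery = len(blob_pixels) - in_center
    return "central" if in_center > in_periphery else "peripheral"
```

For example, a blob with two of its three pixels inside the central region would be classified as central.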
  • In S204, the CPU 32 determines whether or not the area of the target blob area is larger than a first predetermined value. When the CPU 32 determines that the area of the target blob area is larger than the first predetermined value (YES in S204), the process proceeds to S210.
  • In S210, the CPU 32 decides to display an instruction to remove dust adhering to the lens of the bottom parts camera 20. Note that the instruction is not actually displayed until the determinations of S202 and S204 have been performed for all of the one or more blob areas.
  • S212 is the same as S22 in FIG. After completing S210 or S212, the CPU 32 proceeds to S230.
  • When the CPU 32 determines that the target blob area is included in the peripheral area rather than the central area (NO in S202), the process proceeds to S214.
  • the CPU 32 determines whether or not the area of the target blob area is larger than the second predetermined value.
  • the second predetermined value is a value smaller than the first predetermined value of S204.
  • When the CPU 32 determines that the area of the target blob area is equal to or less than the second predetermined value (NO in S214), the CPU 32 proceeds to S222.
  • S222 is similar to S212.
  • When S220 or S222 ends, the CPU 32 proceeds to S230.
  • the CPU 32 determines whether or not there is an unselected blob area among the one or more blob areas. When the CPU 32 determines that there is an unselected blob area among one or more blob areas (YES in S230), the CPU 32 returns to S200 and selects another blob area. On the other hand, when the CPU 32 determines that there is no unselected blob area among the one or more blob areas (NO in S230), the process proceeds to S232.
  • the CPU 32 determines whether or not it was decided at S210 or S220 to display an instruction to remove dust adhering to the lens.
  • When the CPU 32 determines that display of the instruction has been decided (YES in S232), the process proceeds to S240.
  • When the CPU 32 determines that display of the instruction has not been decided (NO in S232), the CPU 32 skips S240 and ends the processing of FIG. 5.
  • S240 is the same as S20 in FIG. 3. When S240 ends, the CPU 32 ends the processing of FIG. 5.
  • In FIG. 5, the second predetermined value in S214 is smaller than the first predetermined value in S204. Consequently, the user is instructed to remove dust (S220) even when a blob area included in the peripheral area is smaller than a blob area in the central area would need to be to trigger the instruction. That is, even if dust adhering to the periphery of the lens is smaller than dust adhering to the center, the user can be prompted to remove it. For example, dust adhering to the center of the lens is easily removed by the blower of the mounting apparatus 10, whereas dust adhering to the periphery may not be completely removed by the blower; the user can thus be urged to remove dust that the blower cannot remove.
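The two-threshold decision of S204 and S214 can be condensed into a small sketch. The concrete values and the function name are assumptions; the patent only requires that the second value be smaller than the first.

```python
def cleaning_instruction_needed(blob_area, region, first_value=10, second_value=4):
    """Compare a blob's area against the threshold for its region:
    central blobs use the larger first value (S204), peripheral blobs use
    the smaller second value (S214), so peripheral dust is flagged sooner."""
    threshold = first_value if region == "central" else second_value
    return blob_area > threshold
```

With the assumed values, a 6-pixel blob triggers the instruction in the peripheral region but not in the central region.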
  • the first predetermined value of S204 and the second predetermined value of S214 in FIG. 5 are examples of the "first area" and the "second area”, respectively.
  • the “image processing device” is not limited to the control unit 30 of the mounting device 10 , and may be an external device (for example, a server) provided separately from the mounting device 10 .
  • the "camera state” is not limited to dust adhering to the camera lens, and may be, for example, a malfunction of camera parts.
  • the "notification operation” is an operation for notifying the user of a failure of a part of the camera. For example, the user can take action to replace the failed component.
  • the "notification operation” is not limited to the display of the instruction in S20 of FIG.
  • the first camera inspection process in FIG. 3 may be employed to inspect the state of the side parts camera 22 (adhesion of dust, failure of parts, etc.).
  • the side part camera 22 is an example of a "camera”.
  • the color of the "reference plate” is not limited to white, and may be, for example, a color other than white (eg, gray).
  • either one of the first camera inspection process in FIG. 3 and the second camera inspection process in FIG. 4 may not be executed.
  • the determination of the pixel value in S14 of FIG. 3 is not limited to the configuration of the above embodiment.
  • The CPU 32 may calculate, for each of the plurality of pixels forming the first processed image, a difference value between its pixel value and a pixel value representing the reference plate 100.
  • the CPU 32 may determine whether or not the calculated difference value is greater than or equal to a specific value that is predetermined and stored in the memory 34 .
  • Pixel values indicating the reference plate 100 are stored in advance in the memory 34, for example.
  • the CPU 32 may determine an area in which a plurality of pixels having pixel values with a difference value equal to or greater than a specific value are connected to each other as a blob area.
  • an area in which a plurality of pixels having pixel values whose difference values are equal to or greater than a specific value are connected to each other is an example of the "non-reference area".
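This modified determination — comparing each pixel's difference from a stored reference value against a specific value — can be sketched briefly. The flat pixel sequence and the names are assumptions for illustration.

```python
def non_reference_pixel_numbers(image_values, reference_value, specific_value):
    """Return the identification numbers of pixels whose absolute difference
    from the stored reference-plate value is at least the specific value;
    connected runs of such numbers form the non-reference (blob) area."""
    return [i for i, v in enumerate(image_values)
            if abs(v - reference_value) >= specific_value]
```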

Landscapes

  • Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed is an image processing device comprising: a process execution unit that executes an inspection process for inspecting the state of a camera used in a mounting step of mounting a component on a board, the inspection process including a process for causing the camera to image a predetermined reference plate different from the component and the board; and a notification unit that executes a predetermined notification operation if a non-reference region having an area equal to or greater than a predetermined area is present in an image captured by the camera during the inspection process, the non-reference region being a blob region in which a plurality of connected pixels have pixel values whose difference from a pixel value indicating the reference plate in the image is equal to or greater than a predetermined value.
PCT/JP2021/046526 2021-12-16 2021-12-16 Image processing device WO2023112259A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/046526 WO2023112259A1 (fr) 2021-12-16 2021-12-16 Image processing device
JP2023567437A JPWO2023112259A1 (fr) 2021-12-16 2021-12-16

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/046526 WO2023112259A1 (fr) 2021-12-16 2021-12-16 Image processing device

Publications (1)

Publication Number Publication Date
WO2023112259A1 2023-06-22

Family

ID=86773866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/046526 WO2023112259A1 (fr) 2021-12-16 2021-12-16 Image processing device

Country Status (2)

Country Link
JP (1) JPWO2023112259A1 (fr)
WO (1) WO2023112259A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11330799A (ja) * 1998-05-21 1999-11-30 Sony Corp Component mounting device
WO2005022901A1 (fr) * 2003-08-29 2005-03-10 Nikon Corporation Imaging system diagnosis device, program, program product, and imaging device
JP2012173045A (ja) * 2011-02-18 2012-09-10 Jfe Steel Corp Evaluation device and evaluation method for a surface inspection apparatus
WO2016092651A1 (fr) * 2014-12-10 2016-06-16 富士機械製造株式会社 Component mounting device
WO2019198220A1 (fr) * 2018-04-13 2019-10-17 株式会社Fuji Maintenance management device, mounting device, and maintenance management method

Also Published As

Publication number Publication date
JPWO2023112259A1 (fr) 2023-06-22

Similar Documents

Publication Publication Date Title
US8527082B2 (en) Component mounting method, component mounting apparatus, method for determining mounting conditions, and apparatus and program for determining mounting conditions
JP3966189B2 Board inspection method and board inspection apparatus using the method
CN1527126A Image processing system, projector, and image processing method
JP2009004754A Component mounting method, component mounter, mounting condition determination method, mounting condition determination device, and program
WO2015040667A1 Mounting inspection device
CN107006148B Component mounting device and component mounting system
WO2023112259A1 Image processing device
WO2013069506A1 Printing machine
JP4342199B2 Component suction position correction device in a component mounter
CN115461612A Appearance inspection device and appearance inspection method
US8780194B2 Component presence/absence judging apparatus and method
JP2016194434A Inspection system and inspection method
JP2021056004A Image determination device and image determination method
JP2009216647A Defect inspection method and defect inspection device
JP4911121B2 Backup pin installation method
JPH11344449A Appearance inspection method
JP5713441B2 Component mounting system
JP5881237B2 Inspection device, processing device, information processing device, object manufacturing device, and manufacturing method thereof
JP7333408B2 Image processing device, component mounting system, and image processing method
JP2010141209A Board inspection device and board inspection method
JP7016846B2 Component mounting device
JP2007071785A Projector inspection method
JP5003610B2 Board inspection method
EP4092408A1 Inspection device and method
US11393366B2 (en) Projection calibrations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21968174

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023567437

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE