WO2023112259A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2023112259A1
WO2023112259A1 (PCT/JP2021/046526)
Authority
WO
WIPO (PCT)
Prior art keywords
area
camera
image
predetermined
cpu
Prior art date
Application number
PCT/JP2021/046526
Other languages
French (fr)
Japanese (ja)
Inventor
康樹 野上
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji filed Critical 株式会社Fuji
Priority to PCT/JP2021/046526 priority Critical patent/WO2023112259A1/en
Publication of WO2023112259A1 publication Critical patent/WO2023112259A1/en

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages

Definitions

  • This specification relates to technology for processing images from cameras used in the mounting process of mounting components on boards.
  • Patent Document 1 discloses a control device that processes an image obtained by a camera used in a component mounter.
  • The control device identifies, within the entire area of the image, any area in which normal imaging cannot be performed as a non-imaging area, and cancels part holding by any nozzle, among the plurality of nozzles, that is scheduled to appear in the identified non-imaging area. Even if a problem occurs in the camera, another nozzle is used so that the mounting process of mounting components on the board can continue.
  • In the above patent document, a rectangular area that circumscribes, in the vertical and horizontal directions, the dropped-part area created when a part falls onto the camera is defined as the non-imaging area. The non-imaging area is therefore larger than the dropped-part area, so the camera malfunction cannot be evaluated accurately. Moreover, the above patent document describes a technique for continuing the mounting process even when a camera problem occurs, not a technique for evaluating a camera problem and notifying the user of it. This specification provides techniques for evaluating camera defects and notifying users of such defects.
  • The image processing apparatus disclosed in this specification comprises: a processing execution unit that executes an inspection process for inspecting the state of a camera used in a mounting process for mounting a component on a board, the inspection process including a process of causing the camera to image a predetermined reference plate different from the component and the board; and a notification unit that executes a predetermined notification operation when a non-reference area having at least a predetermined area exists in the image captured by the camera during the inspection process, the non-reference area being a blob area in which a plurality of connected pixels each have a pixel value whose difference from the pixel value representing the reference plate in the image is at least a predetermined value.
  • Camera malfunctions include, for example, dust adhering to the camera lens and failure of camera parts. Such a defect may appear in the image in a color different from that of the object being imaged, or the luminance (as a pixel value) of the defective portion may differ from that of the object. Moreover, if the region corresponding to the defect occupies a relatively large area of the image, the target may not be imaged correctly and the mounting process may fail.
  • The non-reference area is a region whose color differs from that of the reference plate and whose area is at least the predetermined area. If a non-reference area exists in the image, the camera can be evaluated as malfunctioning. According to the above configuration, the notification operation is executed when the camera can be evaluated as malfunctioning, so the user can be informed of the camera malfunction.
  • FIG. 1 shows a side view schematically illustrating the configuration of the mounting apparatus.
  • FIG. 2 shows a block diagram of the mounting apparatus.
  • FIG. 3 shows a flowchart of the first camera inspection process according to the first embodiment.
  • FIG. 4 shows a flowchart of the second camera inspection process according to the first embodiment.
  • FIG. 5 shows a flowchart of the first camera inspection process according to the second embodiment.
  • the notification operation may not be executed when the non-reference area does not exist in the image.
  • the reference plate may be a white plate.
  • the predetermined area may be equal to or less than the area of the smallest part.
  • Images of dust adhering to a camera lens and images indicating a failure of camera parts are usually smaller than images of parts. According to the above configuration, a blob area no larger than the area of the smallest part can be appropriately determined to be a non-reference area.
  • the camera can capture an image of a nozzle that picks up the component and conveys it to the board, and in the inspection process, the reference plate may be picked up by the nozzle.
  • In the image, it may be determined that the non-reference area exists when a blob area of at least a first area (the predetermined area) exists in a central region showing the center of the camera lens, or when a blob area of at least a second area (the predetermined area), smaller than the first area, exists in a peripheral region showing the periphery of the lens. With this configuration, even if the image indicating dust or a failure at the periphery of the lens is smaller than the image indicating dust or a failure at the center, executing the notification operation can prompt the user to deal with the dust or failure at the periphery of the lens.
  • the camera may capture an image of the substrate supported on a predetermined support, and the reference plate may be supported by the support during the inspection process.
  • the mounting apparatus 10 is an apparatus capable of executing a mounting process for mounting components on a board.
  • a component is an electronic component such as a resistor.
  • The mounting apparatus 10 includes a moving device 12, a nozzle 14, a display 16, a bottom parts camera 20, a side parts camera 22, a mark camera 24, a control unit 30, a support base 50, and a parts feeder 60.
  • The nozzle 14, the side parts camera 22, and the mark camera 24 are attached to the moving device 12.
  • the control unit 30 has a CPU 32 and a memory 34 .
  • the CPU 32 can execute various processes according to programs 40 stored in the memory 34 .
  • The CPU 32 can, for example, analyze the images captured by the cameras 20, 22, and 24.
  • the support base 50 is a base that supports a substrate that is transported by a transport device (for example, a conveyor, not shown).
  • the component feeder 60 is a device that supplies components into the mounting apparatus 10 .
  • the nozzle 14 is a device that picks up the parts supplied by the parts feeder 60 .
  • the display 16 is a device capable of displaying various information.
  • The moving device 12 is a device movable in the horizontal directions (the left-right direction in FIG. 1 and the direction orthogonal to the plane of the drawing).
  • the moving device 12 moves above the component feeder 60, for example.
  • the nozzle 14 of the moving device 12 picks up the component supplied by the component feeder 60 .
  • the moving device 12 moves from the upper side of the component feeder 60 to the upper side of the support base 50 after the nozzle 14 picks up the component.
  • The nozzle 14 mounts the component it has picked up onto the board supported by the support base 50, thereby carrying out the mounting process.
  • the bottom parts camera 20 is arranged between the parts feeder 60 and the support base 50 .
  • The bottom parts camera 20 can image, from below, the nozzle 14 moving between the parts feeder 60 and the support base 50. By analyzing the image captured by the bottom parts camera 20, the position of the part held by the nozzle 14 relative to the nozzle 14 can be obtained. Based on that position, it is inspected whether the component has been correctly picked up by the nozzle 14.
  • the side parts camera 22 is arranged adjacent to the nozzle 14 .
  • the side part camera 22 can image the part sucked by the nozzle 14 from the side.
  • By analyzing the image captured by the side parts camera 22, the orientation of the part picked up by the nozzle 14 can be obtained. Based on that orientation, it is inspected whether the component has been correctly picked up by the nozzle 14.
  • the mark camera 24 is arranged adjacent to the nozzle 14 .
  • The mark camera 24 can image, from above, the board supported by the support base 50.
  • By analyzing the image captured by the mark camera 24, the appearance of the board on which the components are mounted can be inspected.
  • The board's identification number can also be obtained from a mark (for example, a bar code) printed on the board.
  • the images captured by each camera 20-24 are monochrome. Note that in a modified example, the images captured by each of the cameras 20-24 may be in color.
  • Dirt such as dust may adhere to the lens of the camera such as the bottom part camera 20 .
  • Dust adhering to the lens can usually be removed by a blower (not shown) of the mounting device 10 .
  • Some dust attached to the lens cannot be completely removed by the blower. Such dust is removed manually, for example by the user of the mounting apparatus 10.
  • To prompt the user to remove dust adhering to a lens, the CPU 32 executes, in accordance with the program 40, the first camera inspection process (see FIG. 3) and the second camera inspection process (see FIG. 4).
  • the first camera inspection process is a process for inspecting whether or not dust adheres to the bottom parts camera 20 .
  • the first camera inspection process is performed before starting the mounting process.
  • the first camera inspection process is triggered by an instruction from the user.
  • the first camera inspection process may be performed during the mounting process.
  • the first camera inspection process may be automatically executed without an instruction from the user.
  • the CPU 32 causes the nozzle 14 to pick up the reference plate 100 (see FIG. 1).
  • the reference plate 100 is a white plate.
  • The reference plate 100 is, for example, prepared in advance at a predetermined position inside the mounting apparatus 10 (for example, next to the component feeder 60; see FIG. 1).
  • The CPU 32 moves the moving device 12 above the predetermined position and causes the nozzle 14 to pick up the reference plate 100 placed there.
  • The CPU 32 moves the moving device 12 above the bottom parts camera 20 and causes the bottom parts camera 20 to image the reference plate 100 held by the nozzle 14. The CPU 32 thereby acquires from the bottom parts camera 20 an image showing the reference plate 100 (hereinafter referred to as the "first processed image").
  • In the first processed image, a blob region composed of pixels whose pixel values differ from those representing the reference plate 100 indicates dust adhering to the lens of the bottom parts camera 20.
  • If no dust adheres to the lens, no blob area is included in the first processed image.
  • A blob area is an area in which a plurality of connected pixels have pixel values indicating a color (or brightness) different from the pixel values indicating the white of the reference plate 100 (for example, 225 to 255).
  • the CPU 32 determines whether the pixel value of each of the plurality of pixels forming the first processed image is smaller than a predetermined threshold value (eg "220").
  • a pixel value smaller than a predetermined threshold means that the color of the pixel indicated by the pixel value is different from the white color of the reference plate 100 .
  • the CPU 32 uses the result of determination at S14 (see the graph at the bottom of the page) to determine whether or not the first processed image includes one or more blob areas.
  • Whether two pixels having pixel values indicating a different color are connected is determined, for example, using the numbers that identify the pixels (for example, the numbers on the horizontal axis of the graph at the bottom of the page). If the CPU 32 determines that one or more blob areas are included in the first processed image (YES in S16), the process proceeds to S18 and the subsequent steps. On the other hand, when the CPU 32 determines that the first processed image does not include any blob area (NO in S16), the CPU 32 skips the processes from S18 onward and ends the process of FIG. 3.
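As a concrete illustration of S14 and S16, the thresholding and blob-grouping steps described above can be sketched as follows. This is a minimal sketch, not the patent's implementation: the function name, the grid representation, and the use of 4-connectivity are assumptions; only the example threshold of 220 comes from the text.

```python
from collections import deque

# Pixel values below this differ from the white reference plate
# (example value from the text).
THRESHOLD = 220

def find_blob_areas(image):
    """Return a list of blob regions, each a list of (row, col) pixels.

    A blob is a maximal group of 4-connected pixels whose values fall
    below THRESHOLD (S14), collected by breadth-first search (S16).
    """
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] < THRESHOLD for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs
```

In this representation the area of a blob (in pixels) is simply the length of its pixel list, which is what the later area comparisons operate on.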
  • the CPU 32 determines whether or not the area of at least one blob area among the one or more blob areas is larger than a predetermined value.
  • the predetermined value is set to an area equal to or smaller than the smallest component among various types of components supplied by the component feeder 60 .
  • An image of dust on a camera lens is usually smaller than an image of a part. With this configuration, a blob region no larger than the area of the smallest component can be appropriately determined to be a region indicating dust adhering to the lens.
  • the predetermined value may be set independently of the sizes of various types of parts supplied by the parts feeder 60 .
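The area comparison of S18 can be sketched as below, representing each blob as a list of its pixel coordinates so that its area is its pixel count. This is a hypothetical sketch: the function name is invented, and using the smallest component's area as the predetermined value is only the example choice given in the text.

```python
def needs_dust_alert(blobs, smallest_component_area):
    """Sketch of S18: True if any blob is large enough to indicate lens dust.

    Each blob is a list of (row, col) pixel coordinates, so its area in
    pixels is its length. The predetermined value here is the smallest
    component's area, following the example in the text.
    """
    return any(len(blob) > smallest_component_area for blob in blobs)
```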
  • When the CPU 32 determines that the area of at least one blob area is larger than the predetermined value (YES in S18), the process proceeds to S20.
  • the CPU 32 causes the display 16 to display an instruction to remove dust adhering to the lens of the bottom parts camera 20 .
  • When the CPU 32 determines that the areas of all of the one or more blob areas are equal to or less than the predetermined value (NO in S18), the CPU 32 proceeds to S22.
  • a blob area having an area smaller than a predetermined value is highly likely to be, for example, noise from the bottom part camera 20 rather than dust adhering to the lens.
  • In S22, the CPU 32 corrects the pixel values of the pixels forming the blob region in order to remove noise from the bottom parts camera 20. For example, the CPU 32 corrects those pixel values using their average value.
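The correction in S22 might look like the following sketch. The function name and the in-place update are assumptions; the averaging rule itself follows the text.

```python
def smooth_small_blob(image, blob):
    """Sketch of S22: suppress camera noise by replacing a small blob's
    pixel values with the average of those values, as the text describes.
    `blob` is a list of (row, col) coordinates into `image`.
    """
    avg = round(sum(image[r][c] for r, c in blob) / len(blob))
    for r, c in blob:
        image[r][c] = avg
```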
  • the second camera inspection process is a process for inspecting whether or not dust adheres to the mark camera 24 .
  • the trigger for the second camera inspection process is similar to the trigger for the first camera inspection process.
  • the second camera inspection process is the same as the first camera inspection process except that the processes of S100 and S102 are executed instead of S10 and S12 of FIG.
  • The CPU 32 causes the conveying device to convey the reference plate 200, a white plate similar to the reference plate 100. The reference plate 200 is thereby supported by the support base 50.
  • the reference plate 200 is, for example, put into the transport device by a user.
  • The CPU 32 moves the moving device 12 above the support base 50 and causes the mark camera 24 to image the reference plate 200 supported by the support base 50. The CPU 32 thereby acquires from the mark camera 24 an image showing the reference plate 200 (hereinafter referred to as the "second processed image").
  • the CPU 32 executes the processes of S14 to S22.
  • The second processed image is used in S14 to S22 of the second camera inspection process. For example, when the CPU 32 determines that the area of at least one blob area included in the second processed image is larger than the predetermined value (YES in S18), the CPU 32 causes the display 16 to display an instruction to remove dust adhering to the lens of the mark camera 24 (S20). The presence of dust on the lens of the mark camera 24 can thus be evaluated and reported to the user.
  • the control unit 30 of the mounting apparatus 10 is an example of the "image processing apparatus".
  • In the first camera inspection process of FIG. 3, the bottom parts camera 20, the reference plate 100, and the predetermined value in S18 are examples of the "camera", the "reference plate", and the "predetermined area", respectively.
  • In the second camera inspection process of FIG. 4, the mark camera 24, the reference plate 200, and the predetermined value in S18 are likewise examples of the "camera", the "reference plate", and the "predetermined area".
  • In the first camera inspection process according to the second embodiment (see FIG. 5), when the CPU 32 determines that one or more blob areas are included in the first processed image (YES in S16), the process proceeds to S200.
  • the CPU 32 selects one blob area (hereinafter referred to as "target blob area”) from one or more blob areas.
  • the CPU 32 determines whether or not the target blob area is included in the central area including the center of the lens of the bottom parts camera 20 in the first processed image.
  • The position of the target blob area is determined, for example, using the numbers that identify the pixels forming the target blob area (for example, the numbers on the horizontal axis of the graph at the bottom of the page of FIG. 3).
  • When the CPU 32 determines that the target blob area is included in the central area (YES in S202), the process proceeds to S204. Whether each pixel of the first processed image belongs to the central region or to the peripheral region is predetermined and stored in the memory 34.
  • When a blob area spans both regions, the region containing the larger portion of the blob area (or the larger number of its pixels) may be determined to be the region that includes the blob area.
  • the CPU 32 determines whether or not the area of the target blob area is larger than the first predetermined value.
  • When the CPU 32 determines that the area of the target blob area is larger than the first predetermined value (YES in S204), the process proceeds to S210.
  • In S210, the CPU 32 decides to display an instruction to remove dust adhering to the lens of the bottom parts camera 20. Note that the instruction is not displayed until the determinations of S202 and S204 have been performed for all of the one or more blob areas.
  • S212 is the same as S22 in FIG. After completing S210 or S212, the CPU 32 proceeds to S230.
  • When the CPU 32 determines that the target blob area is included in the peripheral area (NO in S202), the process proceeds to S214.
  • the CPU 32 determines whether or not the area of the target blob area is larger than the second predetermined value.
  • the second predetermined value is a value smaller than the first predetermined value of S204.
  • When the CPU 32 determines that the area of the target blob area is equal to or less than the second predetermined value (NO in S214), the CPU 32 proceeds to S222.
  • S222 is similar to S212.
  • When S220 or S222 ends, the CPU 32 proceeds to S230.
  • the CPU 32 determines whether or not there is an unselected blob area among the one or more blob areas. When the CPU 32 determines that there is an unselected blob area among one or more blob areas (YES in S230), the CPU 32 returns to S200 and selects another blob area. On the other hand, when the CPU 32 determines that there is no unselected blob area among the one or more blob areas (NO in S230), the process proceeds to S232.
  • the CPU 32 determines whether or not it was decided at S210 or S220 to display an instruction to remove dust adhering to the lens.
  • When the CPU 32 determines that display of the instruction has been decided (YES in S232), the process proceeds to S240.
  • When the CPU 32 determines that display of the instruction has not been decided (NO in S232), the CPU 32 skips S240 and ends the processing of FIG. 5.
  • S240 is the same as S20 in FIG. When S240 ends, the CPU 32 ends the processing of FIG.
  • The second predetermined value of S214 is smaller than the first predetermined value of S204. The user is therefore instructed to remove dust (S220) even when a blob area in the peripheral area is smaller than one in the central area. That is, even if the dust adhering to the periphery of the lens is smaller than the dust adhering to the center of the lens, the user can be prompted to remove it. For example, dust adhering to the center of the lens is easily removed by the blower of the mounting apparatus 10, while dust adhering to the periphery may not be completely removed by the blower; the user can thus be urged to remove dust that the blower cannot remove.
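The region-dependent comparison of the second embodiment (S202 to S214) could be sketched as follows. The majority-of-pixels membership rule is one of the options the text mentions; the function and parameter names are invented for illustration.

```python
def blob_exceeds_region_limit(blob, central_pixels, first_value, second_value):
    """Sketch of S202/S204/S214: a blob in the central region is compared
    against the first predetermined value; a blob in the peripheral region
    against the smaller second value. Membership is decided by which region
    contains the majority of the blob's pixels (one option in the text).
    """
    in_center = sum(p in central_pixels for p in blob) * 2 > len(blob)
    limit = first_value if in_center else second_value
    return len(blob) > limit
```

Passing a smaller `second_value` for the periphery is what lets the process flag peripheral dust that the blower tends to miss, even when the blob is small.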
  • the first predetermined value of S204 and the second predetermined value of S214 in FIG. 5 are examples of the "first area" and the "second area”, respectively.
  • the “image processing device” is not limited to the control unit 30 of the mounting device 10 , and may be an external device (for example, a server) provided separately from the mounting device 10 .
  • the "camera state” is not limited to dust adhering to the camera lens, and may be, for example, a malfunction of camera parts.
  • the "notification operation” is an operation for notifying the user of a failure of a part of the camera. For example, the user can take action to replace the failed component.
  • the "notification operation” is not limited to the display of the instruction in S20 of FIG.
  • the first camera inspection process in FIG. 3 may be employed to inspect the state of the side parts camera 22 (adhesion of dust, failure of parts, etc.).
  • the side part camera 22 is an example of a "camera”.
  • the color of the "reference plate” is not limited to white, and may be, for example, a color other than white (eg, gray).
  • either one of the first camera inspection process in FIG. 3 and the second camera inspection process in FIG. 4 may not be executed.
  • the determination of the pixel value in S14 of FIG. 3 is not limited to the configuration of the above embodiment.
  • The CPU 32 may calculate a difference value between the pixel value representing the reference plate 100 and the pixel value of each of the plurality of pixels forming the first processed image.
  • the CPU 32 may determine whether or not the calculated difference value is greater than or equal to a specific value that is predetermined and stored in the memory 34 .
  • Pixel values indicating the reference plate 100 are stored in advance in the memory 34, for example.
  • the CPU 32 may determine an area in which a plurality of pixels having pixel values with a difference value equal to or greater than a specific value are connected to each other as a blob area.
  • an area in which a plurality of pixels having pixel values whose difference values are equal to or greater than a specific value are connected to each other is an example of the "non-reference area".
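The modified determination described above can be sketched as a difference mask. The reference pixel value and the specific value are parameters here; the function name is hypothetical.

```python
def diff_mask(image, reference_value, specific_value):
    """Sketch of the modified determination: flag each pixel whose absolute
    difference from the stored pixel value representing the reference plate
    is at least the specific value. Connected flagged pixels would then form
    the blob (non-reference) areas.
    """
    return [[abs(v - reference_value) >= specific_value for v in row]
            for row in image]
```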

Abstract

This image processing device comprises: a process execution unit that executes an inspection process for inspecting the state of a camera that is utilized in a mounting step for mounting a component on a substrate, the inspection process including a process for causing the camera to image a predetermined reference plate different from the component and the substrate; and a notification unit that executes a predetermined notification operation if an out-of-reference region having an area greater than or equal to a predetermined area is present in an image captured by the camera in the inspection process, the out-of-reference region being a blob region in which a plurality of pixels are present, the plurality of pixels being connected and having pixel values of which a difference from a pixel value indicating the reference plate in the image is greater than or equal to a predetermined value.

Description

Image processing device
 This specification relates to technology for processing images from cameras used in a mounting process for mounting components on boards.
 Patent Document 1 discloses a control device that processes an image obtained by a camera used in a component mounter. The control device identifies, within the entire area of the image, any area in which normal imaging cannot be performed as a non-imaging area, and cancels part holding by any nozzle, among the plurality of nozzles, that is scheduled to appear in the identified non-imaging area. Even if a problem occurs in the camera, another nozzle is used so that the mounting process of mounting components on the board can continue.
JP 2018-098280 A
 In the above patent document, a rectangular area that circumscribes, in the vertical and horizontal directions, the dropped-part area created when a part falls onto the camera is defined as the non-imaging area. The non-imaging area is therefore larger than the dropped-part area, so the camera malfunction cannot be evaluated accurately. Moreover, the above patent document describes a technique for continuing the mounting process even when a camera problem occurs, not a technique for evaluating a camera problem and notifying the user of it. This specification provides techniques for evaluating camera defects and notifying users of such defects.
 The image processing apparatus disclosed in this specification comprises: a processing execution unit that executes an inspection process for inspecting the state of a camera used in a mounting process for mounting a component on a board, the inspection process including a process of causing the camera to image a predetermined reference plate different from the component and the board; and a notification unit that executes a predetermined notification operation when a non-reference area having at least a predetermined area exists in the image captured by the camera during the inspection process, the non-reference area being a blob area in which a plurality of connected pixels each have a pixel value whose difference from the pixel value representing the reference plate in the image is at least a predetermined value.
 Camera malfunctions include, for example, dust adhering to the camera lens and failure of camera parts. Such a defect may appear in the image in a color different from that of the object being imaged, or the luminance (as a pixel value) of the defective portion may differ from that of the object. Moreover, if the region corresponding to the defect occupies a relatively large area of the image, the target may not be imaged correctly and the mounting process may fail. The non-reference area is a region whose color differs from that of the reference plate and whose area is at least the predetermined area. If a non-reference area exists in the image, the camera can be evaluated as malfunctioning. According to the above configuration, the notification operation is executed when the camera can be evaluated as malfunctioning, so the user can be informed of the camera malfunction.
 FIG. 1 shows a side view schematically illustrating the configuration of the mounting apparatus. FIG. 2 shows a block diagram of the mounting apparatus. FIG. 3 shows a flowchart of the first camera inspection process according to the first embodiment. FIG. 4 shows a flowchart of the second camera inspection process according to the first embodiment. FIG. 5 shows a flowchart of the first camera inspection process according to the second embodiment.
 The main features of the embodiments described below are listed here. The technical elements described below are independent technical elements that exhibit technical usefulness alone or in various combinations, and are not limited to the combinations recited in the claims as filed.
(Feature 1) The notification operation need not be executed when no non-reference area exists in the image.
 According to such a configuration, unnecessary execution of the notification operation can be suppressed.
(Feature 2) The reference plate may be a white plate.
(Feature 3) The predetermined area may be equal to or less than the area of the smallest component.
 Images of dust adhering to a camera lens and images indicating a failure of camera parts are usually smaller than images of components. According to the above configuration, a blob area no larger than the area of the smallest component can be appropriately determined to be a non-reference area.
(Feature 4) The camera may be capable of imaging a nozzle that picks up the component and conveys it to the board, and in the inspection process the reference plate may be picked up by the nozzle.
 According to such a configuration, the user can be notified of a malfunction of a camera capable of imaging the nozzle.
(Feature 5) It may be determined that the non-reference area exists in the image when a blob area of at least a first area (the predetermined area) exists in a central region of the image showing the center of the camera lens, and it may be determined that the non-reference area exists in the image when a blob area of at least a second area (the predetermined area), which is smaller than the first area, exists in a peripheral region of the image showing the periphery of the camera lens.
 According to such a configuration, even if the image indicating dust or a failure at the periphery of the lens is smaller than the image indicating dust or a failure at the center of the lens, executing the notification operation can prompt the user to deal with the dust or failure at the periphery of the lens.
(Feature 6) The camera may be capable of imaging the board supported on a predetermined support base, and during the inspection process the reference plate may be supported by the support base.
 このような構成によれば、支持台の上に支持されている基板を撮像可能なカメラの不具合をユーザに知らせることができる。 According to such a configuration, it is possible to notify the user of a malfunction of the camera capable of imaging the substrate supported on the support base.
(第1実施例)
(実装装置10の構成;図1、図2)
 実装装置10は、部品を基板に実装する実装工程を実行可能な装置である。部品は、抵抗器等の電子部品である。実装装置10は、移動装置12、ノズル14、ディスプレイ16、下面パーツカメラ20、側面パーツカメラ22、マークカメラ24、制御部30、支持台50、及び、部品フィーダ60を備える。ノズル14、側面パーツカメラ22、及び、マークカメラ24は、移動装置12に取り付けられている。
(First embodiment)
(Configuration of mounting apparatus 10; FIGS. 1 and 2)
The mounting apparatus 10 is an apparatus capable of executing a mounting process for mounting components on a board. A component is an electronic component such as a resistor. The mounting apparatus 10 includes a moving device 12, a nozzle 14, a display 16, a bottom parts camera 20, a side parts camera 22, a mark camera 24, a control unit 30, a support base 50, and a parts feeder 60. The nozzle 14, the side parts camera 22, and the mark camera 24 are attached to the moving device 12.
 各部12~24、60は、制御部30によって制御される。制御部30は、CPU32と、メモリ34と、を備える。CPU32は、メモリ34に記憶されているプログラム40に従って様々な処理を実行可能である。CPU32は、例えば、各カメラ20、22、24によって撮像された画像を分析可能である。 Each of the units 12 to 24 and 60 is controlled by the control unit 30. The control unit 30 includes a CPU 32 and a memory 34. The CPU 32 can execute various processes according to a program 40 stored in the memory 34. For example, the CPU 32 can analyze the images captured by the cameras 20, 22, and 24.
 支持台50は、搬送装置(例えばコンベア、図示省略)によって搬送される基板を支持する台である。部品フィーダ60は、実装装置10内に部品を供給する装置である。 The support base 50 is a base that supports a substrate that is transported by a transport device (for example, a conveyor, not shown). The component feeder 60 is a device that supplies components into the mounting apparatus 10 .
 ノズル14は、部品フィーダ60によって供給される部品を吸着する装置である。ディスプレイ16は、様々の情報を表示可能な装置である。 The nozzle 14 is a device that picks up the parts supplied by the parts feeder 60 . The display 16 is a device capable of displaying various information.
 移動装置12は、水平方向(紙面左右方向及び紙面直交方向)に移動可能な装置である。移動装置12は、例えば、部品フィーダ60の上側に移動する。移動装置12のノズル14は、部品フィーダ60によって供給された部品を吸着する。移動装置12は、ノズル14が部品を吸着した後に、部品フィーダ60の上側から支持台50の上側に移動する。ノズル14は、ノズル14に吸着されている部品を支持台50に支持されている基板に実装する。これにより、実装工程が実現される。 The moving device 12 is a device that can move in the horizontal direction (horizontal direction and orthogonal direction to the paper surface). The moving device 12 moves above the component feeder 60, for example. The nozzle 14 of the moving device 12 picks up the component supplied by the component feeder 60 . The moving device 12 moves from the upper side of the component feeder 60 to the upper side of the support base 50 after the nozzle 14 picks up the component. The nozzle 14 mounts the component sucked by the nozzle 14 on the board supported by the support table 50 . This implements the mounting process.
 下面パーツカメラ20は、部品フィーダ60と支持台50との間に配置されている。下面パーツカメラ20は、部品フィーダ60と支持台50との間を移動するノズル14を下側から撮像可能である。下面パーツカメラ20によって撮像された画像を分析することによって、ノズル14に吸着された部品のノズル14内の位置を取得することができる。当該位置に基づいて、部品がノズル14に正しく吸着されているのか否かが検査される。 The bottom parts camera 20 is arranged between the parts feeder 60 and the support base 50. The bottom parts camera 20 can image, from below, the nozzle 14 moving between the parts feeder 60 and the support base 50. By analyzing an image captured by the bottom parts camera 20, the position, within the nozzle 14, of the component picked up by the nozzle 14 can be obtained. Based on that position, it is inspected whether or not the component is correctly held by the nozzle 14.
 側面パーツカメラ22は、ノズル14に隣接して配置されている。側面パーツカメラ22は、ノズル14に吸着されている部品を側面から撮像可能である。側面パーツカメラ22によって撮像された画像を分析することによって、ノズル14に吸着された部品の姿勢を取得することができる。当該姿勢に基づいて、部品がノズル14に正しく吸着されているのか否かが検査される。 The side parts camera 22 is arranged adjacent to the nozzle 14. The side parts camera 22 can image, from the side, the component picked up by the nozzle 14. By analyzing an image captured by the side parts camera 22, the orientation of the component picked up by the nozzle 14 can be obtained. Based on that orientation, it is inspected whether or not the component is correctly held by the nozzle 14.
 マークカメラ24は、ノズル14に隣接して配置されている。マークカメラ24は、支持台50に支持されている基板を上側から撮像可能である。マークカメラ24によって撮像された画像を分析することによって、部品が実装された基板の外観が検査される。また、マークカメラ24によって撮像された画像を分析することによって、基板に印字されているマーク(例えばバーコード)から、基板の識別番号を取得可能である。 The mark camera 24 is arranged adjacent to the nozzle 14 . The mark camera 24 can image the substrate supported by the support table 50 from above. By analyzing the image captured by the mark camera 24, the appearance of the board on which the components are mounted is inspected. Also, by analyzing the image captured by the mark camera 24, the board identification number can be obtained from the mark (for example, bar code) printed on the board.
 各カメラ20~24によって撮像される画像は、モノクロである。なお、変形例では、各カメラ20~24によって撮像される画像は、カラーであってもよい。 The images captured by each camera 20-24 are monochrome. Note that in a modified example, the images captured by each of the cameras 20-24 may be in color.
 下面パーツカメラ20等のカメラのレンズには、塵埃等のゴミが付着し得る。レンズに付着したゴミは、通常、実装装置10のブロア(図示省略)によって除去可能である。しかし、レンズに付着したゴミがブロアによって除去しきれない状況が想定される。ブロアによって除去しきれないゴミは、例えば、実装装置10のユーザによって手作業で除去される。 Dirt such as dust may adhere to the lens of the camera such as the bottom part camera 20 . Dust adhering to the lens can usually be removed by a blower (not shown) of the mounting device 10 . However, a situation is assumed in which the dust attached to the lens cannot be completely removed by the blower. Dust that cannot be removed by the blower is manually removed by the user of the mounting apparatus 10, for example.
 本実施例では、CPU32は、プログラム40に従って、レンズに付着したゴミの除去をユーザに促すために、第1のカメラ検査処理(図3参照)と第2のカメラ検査処理(図4参照)を実行する。 In this embodiment, in order to prompt the user to remove dust adhering to a lens, the CPU 32 executes a first camera inspection process (see FIG. 3) and a second camera inspection process (see FIG. 4) according to the program 40.
(第1のカメラ検査処理;図3)
 第1のカメラ検査処理は、下面パーツカメラ20にゴミが付着しているか否かを検査するための処理である。例えば、第1のカメラ検査処理は、実装工程の開始前に実行される。例えば、第1のカメラ検査処理は、ユーザからの指示をトリガとして実行される。なお、変形例では、実装工程の途中で第1のカメラ検査処理が実行されてもよい。また、第1のカメラ検査処理は、ユーザからの指示無しに自動的に実行されてもよい。
(First camera inspection process; FIG. 3)
The first camera inspection process is a process for inspecting whether or not dust adheres to the bottom parts camera 20 . For example, the first camera inspection process is performed before starting the mounting process. For example, the first camera inspection process is triggered by an instruction from the user. Note that, in a modified example, the first camera inspection process may be performed during the mounting process. Also, the first camera inspection process may be automatically executed without an instruction from the user.
 S10では、CPU32は、基準板100(図1参照)の吸着をノズル14に実行させる。基準板100は、白色の板である。基準板100は、例えば、実装装置10内の所定の位置(例えば部品フィーダ60の隣)に予め準備されている。CPU32は、移動装置12を所定の位置の上側まで移動させ、所定の位置に配置されている基準板100をノズル14に吸着させる。 In S10, the CPU 32 causes the nozzle 14 to pick up the reference plate 100 (see FIG. 1). The reference plate 100 is a white plate and is prepared in advance at a predetermined position inside the mounting apparatus 10 (for example, next to the parts feeder 60). The CPU 32 moves the moving device 12 above the predetermined position and causes the nozzle 14 to pick up the reference plate 100 placed at the predetermined position.
 S12では、CPU32は、移動装置12を下面パーツカメラ20の上側まで移動させ、ノズル14に吸着されている基準板100を下面パーツカメラ20に撮像させる。これにより、CPU32は、下面パーツカメラ20から基準板100を示す画像(以下では、「第1の処理画像」と記載)を取得する。例えば、下面パーツカメラ20のレンズにゴミが付着している場合には、図3に示すように、基準板100の白色とは異なる色(例えばグレー)を示すブロブ領域が第1の処理画像に含まれる。すなわち、下面パーツカメラ20のレンズにゴミが付着している場合には、第1の処理画像において、基準板100の白色を示す画素値としての輝度を有する画素と、基準板100を示す画素値とは異なる画素値としての輝度を有する画素が存在する。基準板100を示す画素値とは異なる画素値を有する画素からなるブロブ領域は、下面パーツカメラ20のレンズに付着しているゴミを示す。一方、下面パーツカメラ20のレンズにゴミが付着していない場合には、ブロブ領域が第1の処理画像に含まれない。ブロブ領域は、基準板100の白色を示す画素値(例えば「225~255」)とは異なる色(又は輝度)を示す画素値を有する複数の画素が繋がって存在する領域である。 In S12, the CPU 32 moves the moving device 12 above the bottom parts camera 20 and causes the bottom parts camera 20 to image the reference plate 100 picked up by the nozzle 14. The CPU 32 thereby acquires, from the bottom parts camera 20, an image showing the reference plate 100 (hereinafter, the "first processed image"). For example, when dust adheres to the lens of the bottom parts camera 20, the first processed image includes, as shown in FIG. 3, a blob area showing a color (for example, gray) different from the white of the reference plate 100. That is, when dust adheres to the lens of the bottom parts camera 20, the first processed image contains both pixels whose luminance corresponds to the pixel value of the white reference plate 100 and pixels whose luminance corresponds to a pixel value different from that of the reference plate 100. A blob area composed of pixels having pixel values different from the pixel value of the reference plate 100 indicates dust adhering to the lens of the bottom parts camera 20. On the other hand, when no dust adheres to the lens of the bottom parts camera 20, the first processed image includes no blob area. A blob area is an area in which a plurality of connected pixels have pixel values indicating a color (or luminance) different from the pixel value indicating the white of the reference plate 100 (for example, "225 to 255").
 S14では、CPU32は、第1の処理画像を構成する複数個の画素のそれぞれについて、当該画素の画素値が所定の閾値(例えば「220」)よりも小さいか否かを判断する。画素値が所定の閾値よりも小さいことは、当該画素値によって示される画素の色が、基準板100の白色とは異なる色であることを意味する。CPU32は、第1の処理画像を構成する複数個の画素のそれぞれについて、上記の判断が終了すると、S16に進む。 In S14, the CPU 32 determines whether the pixel value of each of the plurality of pixels forming the first processed image is smaller than a predetermined threshold value (eg "220"). A pixel value smaller than a predetermined threshold means that the color of the pixel indicated by the pixel value is different from the white color of the reference plate 100 . After completing the above determination for each of the plurality of pixels forming the first processed image, the CPU 32 proceeds to S16.
 S16では、CPU32は、S14の判断の結果(紙面下側のグラフ参照)を利用して、第1の処理画像に1個以上のブロブ領域が含まれるのか否かを判断する。ここで、異なる色を示す画素値を有する2個の画素が繋がっているか否かは、例えば、各画素を識別する番号(例えば紙面下側のグラフの横軸に記載されている番号)を利用して判断される。CPU32は、第1の処理画像に1個以上のブロブ領域が含まれると判断する場合(S16でYES)に、S18以降の処理に進む。一方、CPU32は、第1の処理画像にブロブ領域が含まれないと判断する場合(S16でNO)に、S18以降の処理をスキップして、図3の処理を終了する。 In S16, the CPU 32 uses the result of the determination in S14 (see the graph at the bottom of the figure) to determine whether or not the first processed image includes one or more blob areas. Whether two pixels having pixel values indicating different colors are connected is determined, for example, using the numbers identifying the pixels (for example, the numbers on the horizontal axis of the graph at the bottom of the figure). When the CPU 32 determines that the first processed image includes one or more blob areas (YES in S16), the process proceeds to S18 and the subsequent steps. On the other hand, when the CPU 32 determines that the first processed image includes no blob area (NO in S16), the CPU 32 skips S18 and the subsequent steps and ends the process of FIG. 3.
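The determinations of S14 and S16 amount to thresholding followed by connected-component grouping. The following is a minimal illustrative sketch, not the patent's implementation; the function name, the flat row-major pixel list, 4-connectivity, and the threshold value 220 are assumptions made for illustration:

```python
from collections import deque

def find_blobs(pixels, width, height, threshold=220):
    """Group connected below-threshold pixels into blob areas (S14/S16 sketch).

    `pixels` is a flat list of grayscale values (0-255) in row-major order,
    mirroring the single pixel-index axis of the graph in FIG. 3.
    """
    flagged = [v < threshold for v in pixels]          # S14: not "white enough"
    seen = [False] * len(pixels)
    blobs = []
    for start in range(len(pixels)):
        if not flagged[start] or seen[start]:
            continue
        blob, queue = [], deque([start])
        seen[start] = True
        while queue:                                   # BFS over connected pixels
            i = queue.popleft()
            blob.append(i)
            x, y = i % width, i // width
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < width and 0 <= ny < height:
                    j = ny * width + nx
                    if flagged[j] and not seen[j]:
                        seen[j] = True
                        queue.append(j)
        blobs.append(blob)
    return blobs
```

With this sketch, two vertically adjacent dark pixels form a single blob, while an isolated dark pixel elsewhere forms a second, separate blob.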
 S18では、CPU32は、1個以上のブロブ領域のうちの少なくとも1個のブロブ領域の面積が所定値より大きいのか否かを判断する。ここで、所定値は、部品フィーダ60によって供給される様々な種類の部品のうち、最も小さい部品以下の面積に設定される。カメラのレンズに付着するゴミの画像は、通常、部品を示す画像よりも小さい。このような構成によれば、最も小さい部品の面積以下のブロブ領域を、レンズに付着したゴミを示す領域として適切に判断することができる。なお、変形例では、所定値は、部品フィーダ60によって供給される様々な種類の部品の大きさとは無関係に設定されてもよい。 In S18, the CPU 32 determines whether or not the area of at least one blob area among the one or more blob areas is larger than a predetermined value. Here, the predetermined value is set to an area equal to or smaller than the smallest component among various types of components supplied by the component feeder 60 . The image of dust on a camera lens is usually smaller than the image showing the part. According to such a configuration, it is possible to appropriately determine a blob region that is equal to or smaller than the area of the smallest component as a region indicating dust adhering to the lens. It should be noted that, in a modified example, the predetermined value may be set independently of the sizes of various types of parts supplied by the parts feeder 60 .
 CPU32は、少なくとも1個のブロブ領域の面積が所定値より大きいと判断する場合(S18でYES)に、S20に進む。S20では、CPU32は、下面パーツカメラ20のレンズに付着したゴミを除去する指示をディスプレイ16に表示させる。これにより、下面パーツカメラ20のレンズにゴミが付着していることを評価して、ゴミの付着をユーザに知らせることができる。 When the CPU 32 determines that the area of at least one blob area is larger than the predetermined value (YES in S18), the process proceeds to S20. In S20, the CPU 32 causes the display 16 to display an instruction to remove the dust adhering to the lens of the bottom parts camera 20. This makes it possible to determine that dust adheres to the lens of the bottom parts camera 20 and to notify the user of the dust.
 一方、CPU32は、1個以上のブロブ領域の全てについて、当該ブロブ領域の面積が所定値以下と判断する場合(S18でNO)に、S22に進む。所定値よりも小さい面積を有するブロブ領域は、レンズに付着したゴミではなく、例えば、下面パーツカメラ20のノイズである可能性が高い。S22では、CPU32は、下面パーツカメラ20のノイズを除去するためにブロブ領域を構成する複数個の画素を示す複数個の画素値を補正する。例えば、CPU32は、当該複数個の画素値の平均値を利用して、当該複数個の画素値を補正する。S20又はS22が終了すると、CPU32は、図3の処理を終了する。 On the other hand, when the CPU 32 determines, for every one of the one or more blob areas, that the area of the blob area is equal to or smaller than the predetermined value (NO in S18), the process proceeds to S22. A blob area having an area smaller than the predetermined value is highly likely to be, for example, noise of the bottom parts camera 20 rather than dust adhering to the lens. In S22, the CPU 32 corrects the pixel values of the pixels forming the blob area in order to remove the noise of the bottom parts camera 20. For example, the CPU 32 corrects those pixel values using their average value. When S20 or S22 ends, the CPU 32 ends the process of FIG. 3.
 なお、1個以上のブロブ領域の全てについて、当該ブロブ領域の面積が所定値よりも小さいと判断される場合(S18でNO)に、S20の処理は実行されない。S20の処理(即ちユーザへの報知)が不必要に実行されることを抑制することができる。 Note that when it is determined, for every one of the one or more blob areas, that the area of the blob area is smaller than the predetermined value (NO in S18), the process of S20 is not executed. This prevents the process of S20 (that is, the notification to the user) from being executed unnecessarily.
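The branch of S18 and the correction of S22 can be sketched as follows. This is only an illustrative reading: replacing a small blob's pixels with their average is one plausible interpretation of "corrects ... using the average value", and the function name and return convention are assumptions:

```python
def handle_blobs(pixels, blobs, smallest_part_area):
    """S18-S22 sketch: request the user notification when any blob exceeds
    the smallest-part area; otherwise smooth each small blob as camera noise.

    `blobs` is a list of pixel-index lists (e.g. from a blob search over the
    first processed image); `pixels` is modified in place.
    Returns True when the dust-removal instruction should be displayed (S20).
    """
    if any(len(blob) > smallest_part_area for blob in blobs):
        return True                                # S18 YES -> S20: notify the user
    for blob in blobs:                             # S18 NO -> S22: noise correction
        mean = round(sum(pixels[i] for i in blob) / len(blob))
        for i in blob:
            pixels[i] = mean                       # correct using the average value
    return False
```

Setting `smallest_part_area` from the smallest component supplied by the parts feeder 60 matches the predetermined value described for S18.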
(第2のカメラ検査処理;図4)
 第2のカメラ検査処理は、マークカメラ24にゴミが付着しているか否かを検査するための処理である。例えば、第2のカメラ検査処理のトリガは、第1のカメラ検査処理のトリガと同様である。第2のカメラ検査処理は、図3のS10、S12に代えて、S100、S102の処理が実行される点を除いて、第1のカメラ検査処理と同様である。
(Second camera inspection process; FIG. 4)
The second camera inspection process is a process for inspecting whether or not dust adheres to the mark camera 24 . For example, the trigger for the second camera inspection process is similar to the trigger for the first camera inspection process. The second camera inspection process is the same as the first camera inspection process except that the processes of S100 and S102 are executed instead of S10 and S12 of FIG.
 S100では、CPU32は、基準板100と同じ白色の基準板200を搬送装置に搬送させる。これにより、基準板200が支持台50に支持される。基準板200は、例えば、ユーザによって搬送装置に投入される。 At S100, the CPU 32 causes the transport device to transport the white reference plate 200, which is the same as the reference plate 100. Thereby, the reference plate 200 is supported by the support base 50 . The reference plate 200 is, for example, put into the transport device by a user.
 S102では、CPU32は、移動装置12を支持台50の上側まで移動させ、支持台50に支持されている基準板200をマークカメラ24に撮像させる。これにより、CPU32は、マークカメラ24から基準板200を示す画像(以下では、「第2の処理画像」と記載)を取得する。 In S102, the CPU 32 moves the moving device 12 to the upper side of the support base 50, and causes the mark camera 24 to image the reference plate 200 supported by the support base 50. As a result, the CPU 32 acquires an image representing the reference plate 200 from the mark camera 24 (hereinafter referred to as a “second processed image”).
 S102が終了すると、CPU32は、S14~S22の処理を実行する。第2のカメラ検査処理のS14~S22では、第2の処理画像が利用される。例えば、CPU32は、第2の処理画像に含まれる少なくとも1個のブロブ領域の面積が所定値より大きいと判断する場合(S18でYES)に、マークカメラ24のレンズに付着したゴミを除去する指示をディスプレイ16に表示させる(S20)。これにより、マークカメラ24のレンズにゴミが付着していることを評価して、ゴミの付着をユーザに知らせることができる。 When S102 ends, the CPU 32 executes the processes of S14 to S22. The second processed image is used in S14 to S22 of the second camera inspection process. For example, when the CPU 32 determines that the area of at least one blob area included in the second processed image is larger than a predetermined value (YES in S18), the CPU 32 instructs to remove dust adhering to the lens of the mark camera 24. is displayed on the display 16 (S20). Accordingly, it is possible to evaluate the presence of dust on the lens of the mark camera 24 and notify the user of the presence of dust.
(対応関係)
 実装装置10の制御部30が、「画像処理装置」の一例である。図3の第1のカメラ検査処理において、下面パーツカメラ20、基準板100、図3のS18の所定値が、それぞれ、「カメラ」、「所定の基準板」、「所定の面積」の一例である。第1のカメラ検査処理において、図3のS10~S18の処理を実行するCPU32、図3のS20の処理を実行するCPU32が、それぞれ、「処理実行部」、「報知部」の一例である。図4の第2のカメラ検査処理において、マークカメラ24、基準板200、図4のS18の所定値が、それぞれ、「カメラ」、「所定の基準板」、「所定の面積」の一例である。第2のカメラ検査処理において、図4のS100、S102、S14~S18の処理を実行するCPU32、図4のS20の処理を実行するCPU32が、それぞれ、「処理実行部」、「報知部」の一例である。
(correspondence relationship)
The control unit 30 of the mounting apparatus 10 is an example of the "image processing device". In the first camera inspection process of FIG. 3, the bottom parts camera 20, the reference plate 100, and the predetermined value in S18 of FIG. 3 are examples of the "camera", the "predetermined reference plate", and the "predetermined area", respectively. In the first camera inspection process, the CPU 32 executing S10 to S18 of FIG. 3 and the CPU 32 executing S20 of FIG. 3 are examples of the "processing execution unit" and the "notification unit", respectively. In the second camera inspection process of FIG. 4, the mark camera 24, the reference plate 200, and the predetermined value in S18 of FIG. 4 are examples of the "camera", the "predetermined reference plate", and the "predetermined area", respectively. In the second camera inspection process, the CPU 32 executing S100, S102, and S14 to S18 of FIG. 4 and the CPU 32 executing S20 of FIG. 4 are examples of the "processing execution unit" and the "notification unit", respectively.
(第2実施例)
 本実施例では、下面パーツカメラ20にゴミが付着しているか否かを検査する第1のカメラ検査処理の内容が異なる点を除いて、第1実施例と同様である。
(Second embodiment)
This embodiment is the same as the first embodiment, except that the content of the first camera inspection process for inspecting whether or not dust adheres to the bottom parts camera 20 is different.
 (第1のカメラ検査処理;図5)
 S10~S16は、第1実施例と同様である。CPU32は、第1の処理画像にブロブ領域が含まれないと判断する場合(S16でNO)に、後述するS200以降の処理をスキップして、図5の処理を終了する。
(First camera inspection process; FIG. 5)
S10 to S16 are the same as in the first embodiment. When the CPU 32 determines that the first processed image does not include the blob area (NO in S16), the CPU 32 skips the processing after S200 described later and ends the processing in FIG.
 CPU32は、第1の処理画像に1個以上のブロブ領域が含まれると判断する場合(S16でYES)に、S200に進む。S200では、CPU32は、1個以上のブロブ領域の中から1個のブロブ領域(以下では、「対象のブロブ領域」と記載)を選択する。 When the CPU 32 determines that one or more blob areas are included in the first processed image (YES in S16), the process proceeds to S200. In S200, the CPU 32 selects one blob area (hereinafter referred to as "target blob area") from one or more blob areas.
 続くS202では、CPU32は、対象のブロブ領域が、第1の処理画像のうち、下面パーツカメラ20のレンズの中央を含む中央領域に含まれるのか否かを判断する。ここで、対象のブロブ領域の位置は、例えば、対象のブロブ領域を構成する画素を識別する番号(例えば図3の紙面下側のグラフの横軸に記載されている番号)を利用して判断される。CPU32は、対象のブロブ領域が中央領域に含まれると判断する場合(S202でYES)に、S204に進む。第1の処理画像における各画素が中央領域に属するのか、周縁領域に属するのか、については予め定められ、メモリ34に記憶されている。なお、対象のブロブ領域を構成する複数の画素が、中央領域と周縁領域にまたがっている場合には、当該ブロブ領域の面積をより大きく含む領域、または当該ブロブ領域の画素数をより多く含む領域を、当該ブロブ領域を含む領域と判断してもよい。 In subsequent S202, the CPU 32 determines whether or not the target blob area is included in a central area of the first processed image that includes the center of the lens of the bottom parts camera 20. The position of the target blob area is determined, for example, using the numbers identifying the pixels forming the target blob area (for example, the numbers on the horizontal axis of the graph at the bottom of FIG. 3). When the CPU 32 determines that the target blob area is included in the central area (YES in S202), the process proceeds to S204. Whether each pixel of the first processed image belongs to the central area or to the peripheral area is predetermined and stored in the memory 34. When the pixels forming the target blob area straddle the central area and the peripheral area, the area containing the larger portion of the blob area, or the larger number of its pixels, may be determined to be the area containing the blob area.
 S204では、CPU32は、対象のブロブ領域の面積が第1の所定値より大きいのか否かを判断する。CPU32は、対象ブロブ領域の面積が第1の所定値より大きいと判断する場合(S204でYES)に、S210に進む。S210では、CPU32は、下面パーツカメラ20のレンズに付着したゴミを除去する指示の表示を決定する。なお、当該指示の表示は、1個以上のブロブ領域の全てについてS202、S204の判断が実行されるまで実行されない。 In S204, the CPU 32 determines whether or not the area of the target blob area is larger than a first predetermined value. When the CPU 32 determines that the area of the target blob area is larger than the first predetermined value (YES in S204), the process proceeds to S210. In S210, the CPU 32 decides to display an instruction to remove the dust adhering to the lens of the bottom parts camera 20. The instruction is not displayed until the determinations of S202 and S204 have been executed for all of the one or more blob areas.
 また、CPU32は、対象ブロブ領域の面積が第1の所定値以下であると判断する場合(S204でNO)に、S212に進む。S212は、図3のS22と同様である。S210又はS212が終了すると、CPU32は、S230に進む。 Also, when the CPU 32 determines that the area of the target blob area is equal to or less than the first predetermined value (NO in S204), the process proceeds to S212. S212 is the same as S22 in FIG. After completing S210 or S212, the CPU 32 proceeds to S230.
 また、CPU32は、対象のブロブ領域が中央領域に含まれず、下面パーツカメラ20のレンズの周縁を示す周縁領域に含まれると判断する場合(S202でNO)に、S214に進む。S214では、CPU32は、対象のブロブ領域の面積が第2の所定値より大きいのか否かを判断する。ここで、第2の所定値は、S204の第1の所定値よりも小さい値である。CPU32は、対象ブロブ領域の面積が第2の所定値より大きいと判断する場合(S214でYES)に、S220に進む。S220は、S210と同様である。 Also, when the CPU 32 determines that the target blob area is not included in the central area but is included in the peripheral area indicating the peripheral edge of the lens of the bottom part camera 20 (NO in S202), the process proceeds to S214. In S214, the CPU 32 determines whether or not the area of the target blob area is larger than the second predetermined value. Here, the second predetermined value is a value smaller than the first predetermined value of S204. When the CPU 32 determines that the area of the target blob area is larger than the second predetermined value (YES in S214), the process proceeds to S220. S220 is similar to S210.
 また、CPU32は、対象ブロブ領域の面積が第2の所定値以下であると判断する場合(S214でNO)に、S222に進む。S222は、S212と同様である。S220又はS222が終了すると、CPU32は、S230に進む。 Also, when the CPU 32 determines that the area of the target blob area is equal to or less than the second predetermined value (NO in S214), the CPU 32 proceeds to S222. S222 is similar to S212. When S220 or S222 ends, the CPU 32 proceeds to S230.
 S230では、CPU32は、1個以上のブロブ領域の中に未選択のブロブ領域が存在するのか否かを判断する。CPU32は、1個以上のブロブ領域の中に未選択のブロブ領域が存在すると判断する場合(S230でYES)に、S200に戻り、他のブロブ領域を選択する。一方、CPU32は、1個以上のブロブ領域の中に未選択のブロブ領域が存在しないと判断する場合(S230でNO)に、S232に進む。 At S230, the CPU 32 determines whether or not there is an unselected blob area among the one or more blob areas. When the CPU 32 determines that there is an unselected blob area among one or more blob areas (YES in S230), the CPU 32 returns to S200 and selects another blob area. On the other hand, when the CPU 32 determines that there is no unselected blob area among the one or more blob areas (NO in S230), the process proceeds to S232.
 S232では、CPU32は、S210又はS220において、レンズに付着したゴミを除去する指示の表示が決定されたのか否かを判断する。CPU32は、当該指示の表示が決定されたと判断する場合(S232でYES)に、S240に進む。一方、CPU32は、当該指示の表示が決定されていないと判断する場合(S232でNO)に、S240をスキップして、図5の処理を終了する。 At S232, the CPU 32 determines whether or not it was decided at S210 or S220 to display an instruction to remove dust adhering to the lens. When the CPU 32 determines that display of the instruction has been determined (YES in S232), the process proceeds to S240. On the other hand, when the CPU 32 determines that display of the instruction has not been determined (NO in S232), the CPU 32 skips S240 and ends the processing of FIG.
 S240は、図3のS20と同様である。S240が終了すると、CPU32は、図5の処理を終了する。 S240 is the same as S20 in FIG. When S240 ends, the CPU 32 ends the processing of FIG.
 本実施例において、S214の第2の所定値は、S204の第1の所定値よりも小さい。そして、中央領域に含まれるブロブ領域と比較して、周縁領域に含まれるブロブ領域が小さい場合であっても、ゴミの除去をユーザに指示する(S220)。即ち、レンズの中央に付着するゴミと比較して、レンズの周縁に付着するゴミが小さい場合であっても、レンズの周縁に付着するゴミの除去をユーザに促すことができる。例えば、レンズの中央に付着したゴミは、実装装置10のブロアによって除去され易いが、レンズの周縁に付着したゴミは、当該ブロアで除去しきれない可能性がある。ブロアで除去しきれないゴミの除去をユーザに促すことができる。 In this embodiment, the second predetermined value of S214 is smaller than the first predetermined value of S204. Then, even if the blob area included in the peripheral area is smaller than the blob area included in the central area, the user is instructed to remove dust (S220). That is, even if the dust adhering to the periphery of the lens is smaller than the dust adhering to the center of the lens, the user can be prompted to remove the dust adhering to the periphery of the lens. For example, dust adhering to the center of the lens is easily removed by the blower of the mounting apparatus 10, but dust adhering to the periphery of the lens may not be completely removed by the blower. The user can be urged to remove dust that cannot be removed by the blower.
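The region-dependent thresholds of the second embodiment (S202 to S214) can be sketched as follows. The per-pixel region table stored in the memory 34 is represented here by a set of central-area pixel indices, and assigning a straddling blob by pixel majority, with ties going to the central area, is an assumed reading of the note in S202:

```python
def blob_needs_notification(blob, central_pixels, first_value, second_value):
    """S202-S214 sketch: a blob in the central area is compared against the
    first predetermined value; a blob in the peripheral area is compared
    against the smaller second predetermined value.

    `blob` is a list of pixel indices; `central_pixels` is the set of
    indices predetermined to belong to the central area.
    """
    assert second_value < first_value              # per the second embodiment
    n_central = sum(1 for i in blob if i in central_pixels)
    if 2 * n_central >= len(blob):                 # S202 YES: treat as central
        return len(blob) > first_value             # S204: first predetermined value
    return len(blob) > second_value                # S214: second predetermined value
```

Because `second_value` is smaller, a peripheral blob too small to trigger the central-area check can still lead to the dust-removal instruction, matching the blower rationale described above.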
(対応関係)
 図5のS10~S16、S200~230の処理を実行するCPU32、図5のS240の処理を実行するCPU32が、それぞれ、「処理実行部」、「報知部」の一例である。図5のS204の第1の所定値、S214の第2の所定値が、それぞれ、「第1の面積」、「第2の面積」の一例である。
(correspondence relationship)
The CPU 32 executing the processes of S10 to S16 and S200 to S230 of FIG. 5 and the CPU 32 executing the process of S240 of FIG. 5 are examples of the "processing execution unit" and the "notification unit", respectively. The first predetermined value in S204 and the second predetermined value in S214 of FIG. 5 are examples of the "first area" and the "second area", respectively.
 実施例で説明した画像処理装置に関する留意点を述べる。「画像処理装置」は、実装装置10の制御部30に限らず、例えば、実装装置10とは別体に設けられている外部の装置(例えばサーバ)であってもよい。 Points to note regarding the image processing apparatus described in the embodiment will be described. The “image processing device” is not limited to the control unit 30 of the mounting device 10 , and may be an external device (for example, a server) provided separately from the mounting device 10 .
 「カメラの状態」は、カメラのレンズにゴミが付着することに限らず、例えば、カメラの部品の故障であってもよい。本変形例では、「報知動作」は、カメラの部品の故障をユーザに報知するための動作である。例えば、ユーザは、故障した部品を取り換える対処をすることができる。 The "camera state" is not limited to dust adhering to the camera lens, and may be, for example, a malfunction of camera parts. In this modified example, the "notification operation" is an operation for notifying the user of a failure of a part of the camera. For example, the user can take action to replace the failed component.
 「報知動作」は、図3のS20の指示の表示に限らず、例えば、警告音の出力、警告灯の点灯であってもよい。 The "notification operation" is not limited to the display of the instruction in S20 of FIG. 3, and may be, for example, output of a warning sound or lighting of a warning lamp.
 図3の第1のカメラ検査処理は、側面パーツカメラ22の状態(ゴミの付着、部品の故障等)を検査することに採用してもよい。本変形例では、側面パーツカメラ22が、「カメラ」の一例である。 The first camera inspection process in FIG. 3 may be employed to inspect the state of the side parts camera 22 (adhesion of dust, failure of parts, etc.). In this modified example, the side part camera 22 is an example of a "camera".
 「基準板」の色は、白色に限らず、例えば、白色とは異なる他の色(例えばグレー)であってもよい。 The color of the "reference plate" is not limited to white, and may be, for example, a color other than white (eg, gray).
 第1実施例において、図3の第1のカメラ検査処理及び図4の第2のカメラ検査処理のいずれか一方が実行されなくてもよい。 In the first embodiment, either one of the first camera inspection process in FIG. 3 and the second camera inspection process in FIG. 4 may not be executed.
 また、図3のS14の画素値の判断は、上記の実施例の構成に限らない。例えば、CPU32は、第1の処理画像を構成する複数個の画素のそれぞれについて、基準板100を示す画素値との差分値を算出してもよい。CPU32は、算出済みの差分値が、予め定められメモリ34に記憶されている特定の値以上であるか否かを判断してもよい。基準板100を示す画素値は、例えば、メモリ34に予め記憶されている。続くS16では、CPU32は、差分値が特定の値以上である画素値を有する複数の画素が繋がって存在する領域をブロブ領域として判断してもよい。本変形例では、差分値が特定の値以上である画素値を有する複数の画素が繋がって存在する領域が、「基準外領域」の一例である。 Also, the determination of the pixel value in S14 of FIG. 3 is not limited to the configuration of the above embodiment. For example, the CPU 32 may calculate a difference value between a pixel value representing the reference plate 100 and each of a plurality of pixels forming the first processed image. The CPU 32 may determine whether or not the calculated difference value is greater than or equal to a specific value that is predetermined and stored in the memory 34 . Pixel values indicating the reference plate 100 are stored in advance in the memory 34, for example. In subsequent S16, the CPU 32 may determine an area in which a plurality of pixels having pixel values with a difference value equal to or greater than a specific value are connected to each other as a blob area. In this modified example, an area in which a plurality of pixels having pixel values whose difference values are equal to or greater than a specific value are connected to each other is an example of the "non-reference area".
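The difference-based variant of S14 described above can be sketched in a few lines; the use of an absolute difference and the illustrative values below are assumptions beyond the text:

```python
def flag_non_reference(pixels, reference_value, specific_value):
    """Modified S14 sketch: flag each pixel whose difference from the stored
    reference-plate pixel value is at least the specific value in memory 34.

    `pixels` is a flat list of grayscale values; connected runs of flagged
    pixels would then be grouped into blob areas in S16.
    """
    return [abs(v - reference_value) >= specific_value for v in pixels]
```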
 本明細書または図面に説明した技術要素は、単独であるいは各種の組合せによって技術的有用性を発揮するものであり、出願時請求項記載の組合せに限定されるものではない。また、本明細書または図面に例示した技術は複数目的を同時に達成するものであり、そのうちの一つの目的を達成すること自体で技術的有用性を持つものである。 The technical elements described in this specification or drawings demonstrate technical usefulness either alone or in various combinations, and are not limited to the combinations described in the claims at the time of filing. In addition, the techniques exemplified in this specification or drawings achieve multiple purposes at the same time, and achieving one of them has technical utility in itself.
10  :実装装置
12  :移動装置
14  :ノズル
16  :ディスプレイ
20  :下面パーツカメラ
22  :側面パーツカメラ
24  :マークカメラ
30  :制御部
32  :CPU
34  :メモリ
40  :プログラム
50  :支持台
60  :部品フィーダ
100、200 :基準板
10: Mounting Device 12: Moving Device 14: Nozzle 16: Display 20: Bottom Parts Camera 22: Side Parts Camera 24: Mark Camera 30: Control Unit 32: CPU
34: Memory 40: Program 50: Support table 60: Parts feeders 100, 200: Reference plate

Claims (7)

  1.  部品を基板に実装する実装工程で利用されるカメラの状態を検査する検査処理を実行する処理実行部であって、前記検査処理は、前記部品及び前記基板とは異なる所定の基準板を前記カメラに撮像させる処理を含む、前記処理実行部と、
     前記検査処理において前記カメラによって撮像された画像内に所定の面積以上の領域である基準外領域が存在する場合に、所定の報知動作を実行する報知部であって、前記基準外領域は、前記画像のうちの前記基準板を示す画素値との差が所定値以上である画素値を有する複数の画素が繋がって存在するブロブ領域である、前記報知部と、
     を備える、画像処理装置。
    A processing execution unit configured to execute an inspection process of inspecting a state of a camera used in a mounting process of mounting a component on a board, the inspection process including a process of causing the camera to image a predetermined reference plate different from the component and the board; and
    A notification unit configured to execute a predetermined notification operation when a non-reference area, which is an area having at least a predetermined area, exists in an image captured by the camera in the inspection process, the non-reference area being a blob area in which a plurality of connected pixels have pixel values whose difference from a pixel value indicating the reference plate in the image is equal to or larger than a predetermined value,
    An image processing device comprising:
  2.  前記画像内に前記基準外領域が存在しない場合に、前記報知動作は実行されない、請求項1に記載の画像処理装置。 The image processing apparatus according to claim 1, wherein the notification operation is not executed when the non-reference area does not exist in the image.
  3.  前記基準板は、白色の板である、請求項1又は2に記載の画像処理装置。 The image processing apparatus according to claim 1 or 2, wherein the reference plate is a white plate.
  4.  前記所定の面積は、最も小さい前記部品の面積以下である、請求項1から3のいずれか一項に記載の画像処理装置。 The image processing device according to any one of claims 1 to 3, wherein the predetermined area is equal to or smaller than the area of the smallest part.
  5.  前記カメラは、前記部品を吸着して前記基板まで搬送するノズルを撮像可能であり、
     前記検査処理において、前記基準板は前記ノズルに吸着される、請求項1から4のいずれか一項に記載の画像処理装置。
    The camera is capable of capturing an image of a nozzle that picks up the component and conveys it to the substrate,
    The image processing device according to any one of claims 1 to 4, wherein, in the inspection process, the reference plate is picked up by the nozzle.
  6.  前記画像のうち、前記カメラのレンズの中央を示す中央領域において、前記所定の面積である第1の面積以上の前記ブロブ領域が存在する場合に、前記画像内に前記基準外領域が存在すると判断され、
     前記画像のうち、前記カメラのレンズの周縁を示す周縁領域において、前記所定の面積である第2の面積であって、前記第1の面積より小さい前記第2の面積以上の前記ブロブ領域が存在する場合に、前記画像内に前記基準外領域が存在すると判断される、請求項5に記載の画像処理装置。
    wherein it is determined that the non-reference area exists in the image when a blob area whose size is equal to or larger than a first area serving as the predetermined area exists in a central area of the image indicating the center of the lens of the camera, and
    In the image, the blob area having a second area which is the predetermined area and equal to or larger than the second area smaller than the first area exists in a peripheral edge area indicating a peripheral edge of the lens of the camera. 6. The image processing apparatus according to claim 5, wherein it is determined that said non-reference region exists in said image when said non-reference region exists.
  7.  The image processing device according to any one of claims 1 to 4, wherein the camera is capable of capturing an image of the board supported on a predetermined support base, and in the inspection process the reference plate is supported by the support base.
PCT/JP2021/046526 2021-12-16 2021-12-16 Image processing device WO2023112259A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/046526 WO2023112259A1 (en) 2021-12-16 2021-12-16 Image processing device

Publications (1)

Publication Number Publication Date
WO2023112259A1 true WO2023112259A1 (en) 2023-06-22

Family

ID=86773866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/046526 WO2023112259A1 (en) 2021-12-16 2021-12-16 Image processing device

Country Status (1)

Country Link
WO (1) WO2023112259A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11330799A (en) * 1998-05-21 1999-11-30 Sony Corp Component mounting unit
WO2005022901A1 (en) * 2003-08-29 2005-03-10 Nikon Corporation Imaging system diagnosis device, imaging system diagnosis program, imaging system diagnosis program product, and imaging device
JP2012173045A (en) * 2011-02-18 2012-09-10 Jfe Steel Corp Evaluation apparatus for surface checkup apparatuses and evaluation method for surface checkup apparatuses
WO2016092651A1 (en) * 2014-12-10 2016-06-16 富士機械製造株式会社 Component-mounting device
WO2019198220A1 (en) * 2018-04-13 2019-10-17 株式会社Fuji Maintenance management device, mounting device, and maintenance management method

Similar Documents

Publication Publication Date Title
JP4767995B2 (en) Component mounting method, component mounting machine, mounting condition determining method, mounting condition determining apparatus, and program
US8527082B2 (en) Component mounting method, component mounting apparatus, method for determining mounting conditions, and apparatus and program for determining mounting conditions
JP3966189B2 (en) Substrate inspection method and substrate inspection apparatus using the method
CN1527126A (en) Image processing system, projector and image processing method
WO2015040667A1 (en) Mounting inspection device
WO2023112259A1 (en) Image processing device
JP4342199B2 (en) Component adsorption position correction device for component mounting machine
JP7149723B2 (en) Image management method and image management device
JP6794529B2 (en) Surface mounter
US8780194B2 (en) Component presence/absence judging apparatus and method
JP6230820B2 (en) Board work machine
JP4520324B2 (en) Inspection result notification device and mounting system
JP2009216647A (en) Defect inspection method and defect inspection device
JP5713441B2 (en) Component mounting system
JP4549662B2 (en) Solder inspection apparatus and solder inspection method
JP2017034202A (en) Inspection device, mounting device, inspection method and program
JP7061220B2 (en) Component mounting device
JP5881237B2 (en) Inspection apparatus, processing apparatus, information processing apparatus, object manufacturing apparatus, and manufacturing method thereof
JP2009295776A (en) Method for installing backup pin
WO2021090395A1 (en) Image processing device, component mounting system, and image processing method
JP2007071785A (en) Method for inspecting projector
JP5003610B2 (en) Board inspection method
JP2002312766A (en) Soldering state inspection device
EP4092408A1 (en) Inspection device and inspection method
JP2801337B2 (en) Electronic component mounting machine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21968174

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023567437

Country of ref document: JP