WO2021191727A1 - Image processing apparatus and image processing method

Image processing apparatus and image processing method

Info

Publication number: WO2021191727A1
Authority: WO (WIPO (PCT))
Application number: PCT/IB2021/052150
Other languages: French (fr)
Prior art keywords: image, unit, master, reference image, processing apparatus
Inventors: Shotaro KOMOTO, Keiji Ohmura, Akito TAJIMA
Applicant: Ricoh Company, Ltd.

Classifications

    • G06T7/001 Industrial image inspection using an image reference approach (under G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g., flaw detection; G06T7/0004 Industrial image inspection)
    • G06T2207/10016 Video; image sequence (image acquisition modality)
    • G06T2207/20076 Probabilistic image processing (special algorithmic details)
    • G06T2207/20092 Interactive image processing based on input by user (special algorithmic details)
    • G06T2207/30136 Metal (under G06T2207/30108 Industrial image inspection)

Definitions

  • The master setting unit 303 is a functional unit that sets a master image for each of the detection frames 420. The master image is a reference image to be compared and collated with the object included in the target image. A plurality of master images is used for the comparison and collation to inspect target parts that vary in position or orientation and therefore vary in image appearance.
  • The master setting unit 303 has a function of causing the external storage device 104 to retain master image information corresponding to the set master image.
  • The automatic master adding unit 304 is a functional unit that automatically adds a master image based on an initial master image (serving as an initial reference image). The initial master image serves as the reference for the plurality of master images.
  • The automatic master adding unit 304 has a function of calculating a collation value indicating the degree of matching between a target image and the initial master image, based on template matching between the two. The automatic master adding unit 304 is an example of a collation value calculating unit.
  • The collation value is a numerical value that represents, e.g., the image similarity, indicating how similar two images are when they are compared and collated. One method of calculating the image similarity is, for example, a histogram comparison, as sketched below.
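As a rough illustration only, the following Python sketch computes such a collation value with OpenCV, assuming same-size grayscale images; the function names are hypothetical and the patent does not prescribe a particular formula.

```python
import cv2
import numpy as np

def collation_value_histogram(target: np.ndarray, master: np.ndarray) -> float:
    """Collation value from grayscale histogram correlation (about -1.0 to 1.0)."""
    hist_t = cv2.calcHist([target], [0], None, [256], [0, 256])
    hist_m = cv2.calcHist([master], [0], None, [256], [0, 256])
    cv2.normalize(hist_t, hist_t)
    cv2.normalize(hist_m, hist_m)
    return cv2.compareHist(hist_t, hist_m, cv2.HISTCMP_CORREL)

def collation_value_template(target: np.ndarray, master: np.ndarray) -> float:
    """Best normalized cross-correlation score of the master within the target."""
    scores = cv2.matchTemplate(target, master, cv2.TM_CCOEFF_NORMED)
    return float(scores.max())
```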
  • The automatic master adding unit 304 also has a function of extracting, as a candidate for the master image (hereinafter simply referred to as a master image candidate), a target image having a calculated collation value equal to or greater than a first threshold and equal to or less than a second threshold. The automatic master adding unit 304 is an example of an extracting unit.
  • The first threshold sets a lower limit of the collation value; the second threshold sets an upper limit and is set higher than the first threshold. Both thresholds are retained in, e.g., the external storage device 104.
  • The automatic master adding unit 304 causes the external storage device 104 to temporarily retain the extracted master image candidates.
  • The automatic master adding unit 304 also has a function of allowing a user to select a master image candidate to be registered as a master image from among a plurality of master image candidates. Specifically, the automatic master adding unit 304 reads out the plurality of master image candidates from, e.g., the external storage device 104 and displays them in a list on the display 105. The display 105 is an example of a presentation device. Such a process, in which the image processing apparatus 3 automatically presents the master image candidates, is referred to as "semi-automatic setting."
  • As illustrated in FIG. 8, the automatic master adding unit 304 sorts a list of master image candidates on a sorting screen 410 displayed on the display 105 into two groups: the master image candidates to be registered as master images and those not to be registered. Numbers assigned to the respective master image candidates are vertically arranged in the center of the sorting screen 410. The user drags a desired number with a pointer 411 of the mouse 108, moves the number to one of a "use" area 412 and a "not use" area 413 arranged laterally on the sorting screen 410 in FIG. 8, and drops the number, thus registering or not registering the corresponding master image candidate as a master image.
  • The "use" area 412 is an area to which master image candidates to be registered as master images are allocated. FIG. 8 illustrates an example in which master image candidates No. 5 and No. 4 are selected as master images. The "not use" area 413 is an area to which master image candidates not to be registered as master images are allocated. FIG. 8 illustrates an example in which a master image candidate No. 8 is not registered as a master image. The mouse 108 and the keyboard 107 are examples of an operation device.
  • The master image candidate selected for registration is registered as a new master image and used for comparison and collation with a target image, as modeled below. By contrast, the master image candidate not selected for registration is discarded together with its image data. The automatic master adding unit 304 is an example of a registering unit.
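Outside of any GUI, the use/not-use sorting might be modeled as in the following sketch; the names and the selection interface are assumptions, not the apparatus's actual implementation.

```python
def register_selected(candidates: dict, use_numbers: set) -> list:
    """Register the candidates dropped into the "use" area; discard the rest.

    candidates: candidate number -> target image extracted earlier.
    use_numbers: numbers the user moved into the "use" area 412.
    """
    registered = [img for no, img in candidates.items() if no in use_numbers]
    candidates.clear()  # candidates not selected are discarded with their image data
    return registered

# FIG. 8 example: No. 5 and No. 4 are used, No. 8 is not.
new_masters = register_selected({4: "img4", 5: "img5", 8: "img8"}, {4, 5})
```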
  • The determining unit 305 is a functional unit that determines whether an object to be inspected is normal or abnormal, based on template matching between a target image and a master image. The determining unit 305 is an example of a determining unit.
  • The determining unit 305 causes the external storage device 104 to retain a use history of each master image used for the determination. The external storage device 104 is an example of a storage device. The use history of a master image is, for example, the use history 405 illustrated in FIG. 9. The use history 405 is linked with each master image (e.g., M3, M5, and M1) retained in the external storage device 104 and is retained together with it.
  • The determining unit 305 has a function of acquiring a plurality of registered master images in descending order of the use history 405 and determining whether the object is normal or abnormal. Specifically, for example, the determining unit 305 sorts the plurality of master images retained in the external storage device 104 in descending order of the use history 405 and performs the determination starting from the master image with the greatest use history 405, as in the sketch below.
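A minimal sketch of that ordering, assuming each master image carries a hit counter corresponding to the use history 405 (names hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Master:
    name: str      # e.g., "M3", "M5", "M1" as in FIG. 9
    image: object  # master image data
    hits: int = 0  # use history 405: how often this master produced a match

def by_use_history(masters: list) -> list:
    """Sort so that the most frequently used master is collated first."""
    return sorted(masters, key=lambda m: m.hits, reverse=True)
```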
  • The setting data retaining unit 306 is a functional unit that retains, e.g., the detection frame information, the master image information, and the use history information of each master image. The setting data retaining unit 306 is an example of a retaining unit and is implemented by the external storage device 104 illustrated in FIG. 2.
  • The determination data output unit 307 is a functional unit that outputs determination result data and controls display of the display 105. The determination data output unit 307 displays "OK" in association with the detection frame 420 displayed on the display 105, for example, as illustrated in FIG. 14A, and displays "NG" in association with the detection frame 420, for example, as illustrated in FIG. 14B.
  • The functional units illustrated in FIG. 3 may be implemented by a hardware circuit such as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA), instead of programs as software.
  • FIG. 3 conceptually illustrates functions of the image processing apparatus 3 as functional units. The functional configuration of the image processing apparatus 3 is not limited to the one illustrated in FIG. 3: multiple independent functional units illustrated in FIG. 3 may be combined into an integrated functional unit, and functions held by a single functional unit may be separated and held by individual functional units.
  • FIG. 4 is a flowchart illustrating an example of an initial master image generation process of an image processing apparatus according to an embodiment.
  • In step S101, the video receiving unit 301 acquires image data from each of the cameras 2. The image data is temporarily stored in the external storage device 104 as a target image. For example, the video receiving unit 301 acquires image data of a real-time image from the cameras 2. Alternatively, the video receiving unit 301 reads a moving image file including a series of captured images, selects an optimum still image from the moving image file while moving forward or scrolling through it, and acquires the image data of that image for setting of a reference image.
  • In step S102, the detection frame setting unit 302 generates or sets a detection frame based on the image data acquired in step S101. The detection frame information corresponding to the detection frame is stored in the external storage device 104.
  • In step S103, the master setting unit 303 generates an initial master image for each detection frame and ends the process. The initial master image and the image information corresponding to the initial master image are stored in the external storage device 104 for each detection frame. A sketch of this flow follows.
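In outline, steps S101 to S103 might look like the following Python/OpenCV sketch; the rectangle coordinates and video source handling are illustrative assumptions.

```python
import cv2

def generate_initial_master(video_source, frame_rect):
    """S101: acquire a target image; S102: apply a detection frame;
    S103: crop the initial master image from that frame."""
    cap = cv2.VideoCapture(video_source)  # a camera or a moving image file
    ok, target = cap.read()               # S101: one frame as the target image
    cap.release()
    if not ok:
        raise RuntimeError("no image acquired")
    x, y, w, h = frame_rect               # S102: detection frame position and size
    initial_master = target[y:y + h, x:x + w].copy()  # S103: initial master image
    return target, initial_master
```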
  • FIG. 5 is a flowchart illustrating an example of an automatic master addition process of an image processing apparatus according to an embodiment.
  • In step S201, the video receiving unit 301 acquires image data from each of the cameras 2 as a camera image. The image data is temporarily stored in the external storage device 104 as a target image.
  • In step S202, the automatic master adding unit 304 performs template matching between the target image and an initial master image and calculates a collation value based on the template matching.
  • In step S203, the automatic master adding unit 304 compares the collation value calculated in step S202 with an upper limit threshold and a lower limit threshold. When the collation value is equal to or less than the upper limit threshold and equal to or greater than the lower limit threshold (YES in step S203), the process proceeds to step S204. By contrast, when the collation value exceeds the upper limit threshold or is less than the lower limit threshold (NO in step S203), the automatic master adding unit 304 discards the calculated collation value and the process returns to step S201.
  • In step S204, the automatic master adding unit 304 extracts the target image having the collation value within the thresholds and adds it as a master candidate. The master candidate is retained as a master image candidate in the external storage device 104.
  • In step S205, the automatic master adding unit 304 determines whether or not it has repeated the process a certain number of times. If not, the process returns to step S201; otherwise, the process ends. The certain number of times corresponds to, e.g., a desired number of master image candidates. The whole loop is sketched below.
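Steps S201 to S205 amount to a loop along these lines; the collation function is any of the earlier sketches, and the thresholds and repeat count are assumed parameters.

```python
def collect_master_candidates(grab_frame, initial_master, collate,
                              lower, upper, repeats):
    """Extract target images whose collation value against the initial
    master falls inside the threshold band (steps S201 to S205)."""
    candidates = []
    for _ in range(repeats):                     # S205: repeat a certain number of times
        target = grab_frame()                    # S201: acquire a camera image
        value = collate(target, initial_master)  # S202: collation value
        if lower <= value <= upper:              # S203: within lower/upper thresholds
            candidates.append(target)            # S204: retain as a master candidate
        # otherwise the collation value is discarded and the loop continues
    return candidates
```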
  • FIG. 6 is a flowchart illustrating an example of a matching determination process of an image processing apparatus according to an embodiment.
  • In step S301, the video receiving unit 301 acquires image data from each of the cameras 2 as a camera image. The image data is temporarily stored in the external storage device 104 as a target image.
  • In step S302, the determining unit 305 determines whether or not a determination trigger is turned on. When the determination trigger is turned on (YES in step S302), the process proceeds to step S303. When the determination trigger is not turned on (NO in step S302), the process returns to step S301. The determination trigger is set to perform determination at a selected time, as in a sampling inspection, for example, instead of a total inspection.
  • In step S303, the automatic master adding unit 304 calculates a collation value based on template matching between a target image and the Nth master image of a plurality of master images. N is a natural-number variable indexing the plurality of master images arranged in order; N = 1 indicates the master image having the highest frequency of use.
  • In step S304, the determining unit 305 determines whether or not the collation value is equal to or greater than a threshold (e.g., the lower limit threshold described above). If so, the process proceeds to step S305; if not, the process proceeds to step S306.
  • In step S305, the determining unit 305 displays, e.g., "OK" in association with a detection frame displayed on the display 105 and ends the process.
  • In step S307, the determining unit 305 determines whether or not the variable N is greater than the number of master images, i.e., the number of master images registered and stored in the external storage device 104. When the variable N is not greater than the number of master images (NO in step S307), the process returns to step S303. When the variable N is greater than the number of master images (YES in step S307), the process proceeds to step S308.
  • In step S308, the determining unit 305 displays, e.g., "NG" in association with the detection frame displayed on the display 105 and ends the process. A sketch of the determination loop follows.
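Putting steps S303 to S308 together, a simplified sketch, assuming the master list is already sorted by use history as described above (step S306 is taken here to advance to the next master, which the flow implies but does not spell out):

```python
def judge(target, masters, collate, threshold):
    """OK on the first master whose collation value reaches the threshold,
    NG once every registered master has been tried (steps S303 to S308)."""
    for n, master in enumerate(masters, start=1):  # N = 1: most-used master
        value = collate(target, master.image)      # S303: template matching
        if value >= threshold:                     # S304: threshold check
            master.hits += 1                       # update the use history 405
            return "OK"                            # S305
        # S306/S307: advance N; continue while N does not exceed the master count
    return "NG"                                    # S308
```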
  • As described above, the image processing apparatus 3 of the present embodiment retains an initial master image to be compared and collated with an object to be inspected included in a target image, with, e.g., the setting data retaining unit 306 serving as a retaining unit. The image processing apparatus 3 calculates a collation value based on template matching between the target image and the initial master image with, e.g., the automatic master adding unit 304 serving as a collation value calculating unit. The image processing apparatus 3 presents to a user, as master image candidates, the target images having a collation value equal to or greater than the first threshold and equal to or less than the second threshold, with, e.g., the display 105 serving as a presentation device. The image processing apparatus 3 causes the user to select whether or not to register each master image candidate as a master image with, e.g., the mouse 108 serving as an operation device, and registers, as a master image, each candidate selected to be registered, with, e.g., the automatic master adding unit 304 serving as a registering unit.
  • In a typical manual registration, by contrast, a user needs to specify a detection frame for each of a plurality of master images by dragging a mouse, for example. The configuration described above facilitates the registration of master images, thus simplifying the setting operation in the registration of master images.
  • As described above, the determining unit 305 stores the use history 405 of each master image used for the determination in the external storage device 104, acquires the plurality of registered master images in descending order of the use history 405, and determines whether the object is normal or abnormal.
  • In general, template matching takes a long time. By comparing and collating the target image with the master images in descending order of the use history 405, the determining unit 305 increases the probability of an early determination as to whether or not the object is normal, thus accelerating the processing. As a result, the image processing apparatus 3 of the present embodiment shortens the period of time taken to determine whether or not the object is normal.
  • In the embodiment described above, the automatic master adding unit 304 calculates a collation value based on template matching between a target image and an initial master image. Alternatively, the automatic master adding unit 304 may calculate a collation value based on the difference in the number of pixels between the target image and the initial master image, or the collation value may be obtained from results of machine learning. A sketch of the pixel-difference variant follows.
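One plausible reading of the pixel-difference variant is the fraction of pixels that differ between the two images; the sketch below assumes same-size grayscale images and a hypothetical per-pixel tolerance.

```python
import cv2
import numpy as np

def collation_value_pixel_diff(target: np.ndarray, master: np.ndarray,
                               tolerance: int = 10) -> float:
    """Share of pixels that agree within a tolerance; 1.0 for identical images."""
    diff = cv2.absdiff(target, master)
    differing = np.count_nonzero(diff > tolerance)
    return 1.0 - differing / diff.size
```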
  • In the embodiment described above, the determining unit 305 determines whether or not the object is normal based on template matching against master images. Alternatively, the determining unit 305 may determine whether or not the object is normal in another way, as in the grease application inspection described below.
  • FIGS. 16A to 16C are illustrations of grease application inspection. As illustrated in FIG. 16A, in a grease application inspection, the state of grease applied by a grease application cylinder (or dispenser) 600 to an object 500 to be inspected is checked based on image data captured by the camera 2. FIG. 16B illustrates a target image including the object 500 before the grease is applied; FIG. 16C illustrates the object 500 after the grease is applied.
  • Typically, a normal object to be inspected is the same for each inspection. For example, in a case in which a screw attached in a product is the object to be inspected, the same screw is attached in each product. Therefore, the image of the screw can serve as a master image. By contrast, in a case in which the object to be inspected is a viscous semi-liquid such as the grease 501 illustrated in FIG. 16C, the shape of the object is not the same for each inspection.
  • FIG. 17 is a diagram illustrating image examples of the grease 501. As illustrated in FIG. 17, since the shape of the grease 501 is not the same for each inspection, if images of all the shapes of the grease 501 were registered as master images, an enormous number of images would be registered.
  • FIG. 18 is an illustration of a case in which a background image is registered as a master image. In this case, the setting data retaining unit 306 may retain, as a reference background image, a background image corresponding to an initial reference image. That is, the automatic master adding unit 304 may register the background image without grease as a master image, instead of registering an image of the grease.
  • The automatic master adding unit 304 calculates a collation value indicating the degree of matching between a target image and the reference background image, based on template matching between the two, for example. The determining unit 305 then determines that the object to be inspected is normal in a case in which the collation value is equal to or less than a given threshold.
  • FIG. 19 is a diagram illustrating an example of an inspection screen. As illustrated in FIG. 19, when the grease 501 is applied and the background image no longer matches, the determining unit 305 determines OK. By contrast, when the background image remains, the determining unit 305 determines NG. Thus, the number of master images is kept small. A sketch of this inverted test follows.
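The background-master variant inverts the usual test: a low match against the registered background means the grease is present. A minimal sketch under that reading, reusing any of the collation functions above:

```python
def judge_grease(target, reference_background, collate, threshold):
    """OK when the applied grease hides the background (low collation value);
    NG when the background image still matches (no grease applied)."""
    value = collate(target, reference_background)
    return "OK" if value <= threshold else "NG"
```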
  • The present invention can be implemented in any convenient form, for example using dedicated hardware, or a mixture of dedicated hardware and software. The present invention may be implemented as computer software implemented by one or more networked processing apparatuses. The processing apparatuses include any suitably programmed apparatuses such as a general-purpose computer, a personal digital assistant, a Wireless Application Protocol (WAP) or third-generation (3G)-compliant mobile telephone, and so on. Since the present invention can be implemented as software, each and every aspect of the present invention thus encompasses computer software implementable on a programmable device.
  • The computer software can be provided to the programmable device using any conventional carrier medium (carrier means). The carrier medium includes a transient carrier medium such as an electrical, optical, microwave, acoustic, or radio frequency signal carrying the computer code. An example of such a transient medium is a Transmission Control Protocol/Internet Protocol (TCP/IP) signal carrying computer code over an IP network, such as the Internet. The carrier medium also includes a storage medium for storing processor-readable code, such as a floppy disk, a hard disk, a compact disc read-only memory (CD-ROM), a magnetic tape device, or a solid state memory device.
  • Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

An image processing apparatus includes: a determining unit that determines, based on comparison and collation between a target image including an object to be inspected and a reference image registered in advance, whether the object is normal; an image acquiring unit that repeatedly acquires the target image; a retaining unit that retains an initial reference image; a collation value calculating unit that calculates a collation value indicating a degree of matching between the target image and the initial reference image; an extracting unit that extracts, as a candidate for the reference image, the target image having the calculated collation value not less than a first threshold and not greater than a second threshold; a presentation device that presents the candidate; an operation device that allows selection as to whether to register the candidate as the reference image; and a registering unit that registers the selected candidate as the reference image.

Description

[DESCRIPTION]
IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD
[Technical Field]
[0001]
Embodiments of the present disclosure relate to an image processing apparatus and an image processing method.
[Background Art]
[0002]
Conventionally, there have been many assembly processes in which people or machines assemble parts to produce new parts or products. In an inspection process for checking whether a finished product is assembled correctly, a system that inspects the finished product by collating a camera image with a master image (i.e., template matching) has been introduced at many sites to prevent defective products from slipping through and to reduce the number of processes.
[0003]
There are also inspection systems that employ a multi-master system in which a plurality of master images is used for collation, to inspect target parts varying in position or orientation and therefore varying in image appearance.
[0004]
PTL 1 discloses an inspection system provided with a reverse lookup unit that classifies inspection items for each type of image template and displays the inspection items in a list on a screen of a display in association with the image template, for the purpose of facilitating the correction of parameters for appearance inspection by image processing.
[0005]
PTL 2 proposes a technique of storing a specific image of an object at a stop position as a reference image and comparing an acquired image with the reference image to determine whether or not the process is normally performed.
[0006]
PTL 3 proposes a technique of setting, in a template, an unexecuted area that is not collated with an image or a second area subjected to collation different from collation performed on a first area in the template and executing template matching based on a collation process excluding the unexecuted area or a collation process with the first and second areas.
[0007]
PTL 4 proposes a technique, employed by an article collation unit, of performing a first collation based on an image obtained by photographing a collation target article and overall image information of each article stored in an article information storage unit, and comparing, when a plurality of article candidates are extracted, attribute information included in partial feature information of the extracted plurality of article candidates with part of the image of the collation target article corresponding to positional information included in the partial feature information, to perform a second collation.
[0008]
PTL 5 proposes a technique of searching a plurality of naturally occurring images for a similar image similar to a recognition target image, extracting a plurality of keywords that can be candidates for a result of recognition of the recognition target image from information associated with the searched similar image, and analyzing the extracted plurality of keywords, to identify a likely keyword as the result of recognition of the recognition target image and output the identified keyword as the result of recognition of the recognition target image.
[Citation List]
[Patent Literature]
[0009]
[PTL 1]
Japanese Unexamined Patent Application Publication No. 2009-217368
[PTL 2]
Japanese Unexamined Patent Application Publication No. 2018-010368
[PTL 3]
Japanese Patent No. 5568277
[PTL 4]
Japanese Patent No. 5869988
[PTL 5]
Japanese Patent No. 5200015
[Summary of Invention]
[Technical Problem]
[0010]
In the conventional systems described above, template matching is performed with a plurality of template images (i.e., master images); however, it takes time and effort to register the plurality of master images for an inspection target having large variation. That is, as the variation in appearance of the inspection target increases, setting operations such as master image registration and threshold adjustment take more time and effort.
[0011]
In light of the above-described problems, it is a general object of the present invention to provide an image processing apparatus and an image processing method that simplify the setting operation in reference image registration.
[Solution to Problem]
[0012]
An image processing apparatus includes a determining unit, an image acquiring unit, a retaining unit, a collation value calculating unit, an extracting unit, a presentation device, an operation device, and a registering unit. The determining unit is configured to determine, based on comparison and collation between a target image including an object to be inspected and a reference image registered in advance, whether or not the object is normal. The image acquiring unit is configured to repeatedly acquire the target image. The retaining unit is configured to retain an initial reference image to be compared and collated with the object included in the target image. The collation value calculating unit is configured to calculate a collation value indicating a degree of matching between the target image and the initial reference image. The extracting unit is configured to extract, as a candidate for the reference image, the target image having the calculated collation value equal to or greater than a first threshold and equal to or less than a second threshold. The presentation device is configured to present the candidate. The operation device is configured to allow selection as to whether or not to register the candidate as the reference image. The registering unit is configured to register, as the reference image, the candidate selected to be registered.
[Advantageous Effects of Invention]
[0013]
According to the present invention, the setting operation in master image registration is simplified.
[Brief Description of Drawings]
[0014]
The accompanying drawings are intended to depict example embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
[FIG. 1]
FIG. 1 is a diagram illustrating an example of an overall configuration of an image processing system to which an image processing apparatus according to an embodiment is applied.
[FIG. 2]
FIG. 2 is a block diagram illustrating an example of a hardware configuration of an image processing apparatus according to an embodiment.
[FIG. 3]
FIG. 3 is a block diagram illustrating a functional configuration of an image processing apparatus according to an embodiment.
[FIG. 4]
FIG. 4 is a flowchart illustrating an example of an initial master image generation process of an image processing apparatus according to an embodiment.
[FIG. 5]
FIG. 5 is a flowchart illustrating an example of an automatic master addition process of an image processing apparatus according to an embodiment.
[FIG. 6]
FIG. 6 is a flowchart illustrating an example of a matching determination process of an image processing apparatus according to an embodiment.
[FIG. 7]
FIG. 7 is a diagram illustrating an example of an inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 8]
FIG. 8 is a diagram illustrating an example of a master candidate distribution screen displayed on an image processing apparatus according to an embodiment.
[FIG. 9]
FIG. 9 is a diagram illustrating an example of an order of hit numbers of an image processing apparatus according to an embodiment.
[FIG. 10]
FIG. 10 is a diagram illustrating an example of a unit assembly inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 11]
FIG. 11 is a diagram illustrating an example of an electronic substrate assembly inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 12]
FIG. 12 is a diagram illustrating an example of a press part inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 13]
FIG. 13 is a diagram illustrating an example of a packing number inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 14]
FIGS. 14A and 14B are diagrams illustrating examples of a box breakage inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 15]
FIG. 15A is a diagram illustrating an example of a master image relating to label printing inspection; whereas FIG. 15B is a diagram illustrating an example of a label printing inspection screen displayed on an image processing apparatus according to an embodiment.
[FIG. 16]
FIGS. 16A to 16C are illustrations of grease applying inspection.
[FIG. 17]
FIG. 17 is a diagram illustrating image examples of grease.
[FIG. 18]
FIG. 18 is an illustration of a case in which a background image is registered as a master image.
[FIG. 19]
FIG. 19 is a diagram illustrating an example of an inspection screen.
[Description of Embodiments]
[0015]
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, embodiments of an image processing apparatus and an image processing method of the present invention are described in detail below. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
[0016]
FIG. 1 is a diagram illustrating an example of an overall configuration of an image processing system to which an image processing apparatus according to an embodiment is applied. As illustrated in FIG. 1, an image processing system 1 to which the image processing apparatus according to the present embodiment is applied includes cameras 2a to 2d, an image processing apparatus 3, and a hub 4.
[0017]
The cameras 2a to 2d are video cameras that image (or record) a subject by converting light from the subject into an electric signal and generate a moving image (e.g., 10 or 25 frames per second (FPS)) constructed of a plurality of frames (i.e., image data). For example, the cameras 2a to 2d image a part or a semi-product to be inspected in a production facility or a production line for producing a product and generate image data. Note that the cameras 2a to 2d are simply referred to as "cameras 2" or "camera 2" when they are referred to collectively or without distinction. In FIG. 1, the image processing system 1 includes the four cameras 2. However, the number of the cameras 2 is not limited to four; the image processing system 1 may include another number of cameras 2.
[0018]
The image processing apparatus 3 is, e.g., a personal computer (PC) or a workstation serving as an image processing apparatus that executes image processing based on video data imaged by each of the cameras 2.
[0019]
The hub 4 is a line concentrator compliant with the Ethernet (registered trademark) standard that connects the cameras 2a to 2d to the image processing apparatus 3. For example, in a case in which the hub 4 complies with the Ethernet standard, data communication is performed between the cameras 2a to 2d and the image processing apparatus 3 according to a protocol such as the Transmission Control Protocol/Internet Protocol (TCP/IP). In this case, each of the cameras 2a to 2d and the image processing apparatus 3 has a media access control (MAC) address for communication by the TCP/IP, and an IP address such as a private IP address is assigned to each of them. FIG. 1 illustrates an example in which the hub 4 relays the communication by the TCP/IP. Alternatively, for example, the image processing apparatus 3 may be provided with a video graphics array (VGA) terminal or a universal serial bus (USB) port, and the cameras 2 may be concentrated to the hub 4 via VGA cables or USB cables and thus connected to the image processing apparatus 3.
[0020]
In the example illustrated in FIG. 1, each of the cameras 2 is connected to the image processing apparatus 3 via the hub 4. Alternatively, the cameras 2 may communicate with the image processing apparatus 3 via a network such as a local area network (LAN), a dedicated line, or the Internet.
[0021]
Although the number of cameras 2 is not limited as described above, the following will describe, as an example, a configuration in which the four cameras 2 are connected to the image processing apparatus 3 as illustrated in FIG. 1.
[0022]
FIG. 2 is a block diagram illustrating an example of a hardware configuration of an image processing apparatus according to an embodiment. As illustrated in FIG. 2, the image processing apparatus 3 includes a central processing unit (CPU) 101, a read only memory (ROM) 102, a random access memory (RAM) 103, an external storage device 104, a display 105, a network interface (I/F) 106, a keyboard 107, a mouse 108, a digital versatile disc (DVD) drive 109, an external device I/F 111, and a speaker 112.
[0023]
The CPU 101 is a device that controls the entire operation of the image processing apparatus 3. The ROM 102 is a nonvolatile storage device that stores programs such as firmware and a Basic Input/Output System (BIOS) for the image processing apparatus 3. The RAM 103 is a volatile storage device that is used as a work area for the CPU 101.
[0024]
The external storage device 104 is a storage device such as a hard disk drive (HDD) or a solid state drive (SSD) that stores various kinds of data such as setting information, reference image data, and image data received from the cameras 2.
[0025]
The display 105 is a display device that displays a cursor, a menu, a window, various kinds of information such as characters and images, and a screen of an application for executing image processing with the image processing apparatus 3. The display 105 is, e.g., a cathode-ray tube (CRT) display, a liquid-crystal display (LCD), or an organic electroluminescence (OEL) display. Note that the display 105 is connected to the main body of the image processing apparatus 3 via, e.g., a VGA cable or a high definition multimedia interface (HDMI; registered trademark) cable. Alternatively, the display 105 may be connected to the main body of the image processing apparatus 3 via an Ethernet cable. The display 105 displays an inspection screen 400 as illustrated in FIG. 7, for example. The inspection screen 400 includes a multicamera display area 401, a setting area 402, and a determination result display area 403. The multicamera display area 401 is constructed of sub-display areas 401A, 401B, 401C, and 401D. A plurality of images is displayed in the multicamera display area 401. Various setting buttons are arranged in the setting area 402. A determination result of normal (or okay (OK)) or abnormal (or no good (NG)) is displayed in the determination result display area 403.
[0026]
The network I/F 106 is an interface that links the image processing apparatus 3 with the hub 4 for data communication. The network I/F 106 is, e.g., a network interface card (NIC) that allows communication by the TCP/IP. Specifically, the image processing apparatus 3 receives video data from the cameras 2 via the network I/F 106.
[0027]
The keyboard 107 is an input device that is used to select characters, numbers, and various instructions, move a cursor, and set setting information, for example. The mouse 108 is an input device that is used to select and execute various instructions, select a processing target, move a cursor, and set setting information, for example.
[0028]
The DVD drive 109 is a device that controls writing, reading, and removal of data to and from a DVD 110, which is an example of a removable storage medium.
[0029]
The external device I/F 111 is an interface that links the image processing apparatus 3 with an external device for data communication, for example. The external device I/F 111 is, e.g., a NIC or a USB interface card that enables communication by the TCP/IP. Specifically, the image processing apparatus 3 performs data communication with an external device via the external device I/F 111.
[0030]
The speaker 112 is an audio output device that outputs, by voice, a result of the determination described later, for example.
[0031]
The CPU 101, the ROM 102, the RAM 103, the external storage device 104, the display 105, the network I/F 106, the keyboard 107, the mouse 108, the DVD drive 109, the external device I/F 111, and the speaker 112 described above are communicably connected to each other via a bus 113 such as an address bus or a data bus. Note that, in a case in which the display 105 is connected to the main body of the image processing apparatus 3 via an Ethernet cable, the display 105 is connected to the network I/F 106. In this case, data communication is performed according to a protocol such as the TCP/IP.
[0032]
FIG. 3 is a block diagram illustrating a functional configuration of an image processing apparatus according to an embodiment. Note that FIG. 3 does not illustrate the hub 4 to simplify the description.
[0033]
As illustrated in FIG. 3, the image processing apparatus 3 includes a video receiving unit 301, a detection frame setting unit 302, a master setting unit 303, an automatic master adding unit 304, a determining unit 305, a setting data retaining unit 306, and a determination data output unit 307.
[0034]
The video receiving unit 301 is a functional unit that acquires image data from each of the cameras 2 via the hub 4. The video receiving unit 301 is an example of an image acquiring unit. The video receiving unit 301 has a function of repeatedly acquiring a target image including an object to be inspected and storing the target image as image data in, e.g., the external storage device 104. The video receiving unit 301 is implemented by the network I/F 106 illustrated in FIG. 2.
[0035]
The detection frame setting unit 302 is a functional unit that sets a detection frame 420 in an image corresponding to image data. As illustrated in FIGS. 10 to 15B, the detection frame 420 is an image area including an object to be inspected. The detection frame 420 is extracted from the target image including the object in order to recognize the object; only the part of the image defined by the detection frame 420 is subjected to detection, instead of the whole image. The detection frame setting unit 302 has a function of, in a case in which the target image includes a plurality of objects to be inspected, setting the detection frame 420 for each of the plurality of objects. The detection frame setting unit 302 also has a function of causing the external storage device 104 to retain detection frame information corresponding to the set detection frame 420. The detection frame information includes, e.g., the vertical and lateral sizes and position of the detection frame 420 in the target image.
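By way of a non-limiting illustration, the cropping performed with a detection frame could be sketched as follows in Python with OpenCV. The tuple layout (x, y, width, height), the function name, and the file name are assumptions made for this example, not part of the disclosed apparatus.

```python
import cv2

def crop_detection_frame(target_image, frame):
    """Return only the image area defined by a detection frame.

    `frame` is assumed to be an (x, y, width, height) tuple, mirroring the
    position and the vertical and lateral sizes retained as detection
    frame information.
    """
    x, y, w, h = frame
    return target_image[y:y + h, x:x + w]

# Only the part inside the frame is processed, instead of the whole image.
target = cv2.imread("target.png")  # hypothetical target image
roi = crop_detection_frame(target, (120, 80, 64, 48))
```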
[0036]
The master setting unit 303 is a functional unit that sets a master image for each of the detection frames 420. The master image is a reference image to be compared and collated with the object included in the target image. In the present embodiment, a plurality of master images is used for the comparison and collation to inspect inspection target parts varying in position or orientation and therefore varying in image appearance. The master setting unit 303 has a function of causing the external storage device 104 to retain master image information corresponding to the set master image.
[0037]
The automatic master adding unit 304 is a functional unit that automatically adds a master image based on an initial master image (serving as an initial reference image). The initial master image serves as a reference for a plurality of master images. The automatic master adding unit 304 has a function of calculating a collation value indicating a degree of matching between a target image and the initial master image, based on template matching between the target image and the initial master image. The automatic master adding unit 304 is an example of a collation value calculating unit. The collation value is a numerical value that represents, e.g., the image similarity, indicating how similar two images are when the two images are compared and collated. One method of calculating the image similarity is, for example, to compare histograms of the two images.
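As a non-limiting sketch of how such a collation value might be computed, the following Python/OpenCV fragment shows normalized cross-correlation template matching, with a histogram correlation as the alternative similarity measure mentioned above; the function names and the grayscale/histogram parameters are illustrative assumptions.

```python
import cv2

def collation_value_template(target_roi, master):
    # Normalized cross-correlation; 1.0 indicates a perfect match.
    result = cv2.matchTemplate(target_roi, master, cv2.TM_CCOEFF_NORMED)
    return float(result.max())

def collation_value_histogram(target_roi, master):
    # Alternative: compare grayscale histograms of the two images.
    gray_a = cv2.cvtColor(target_roi, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(master, cv2.COLOR_BGR2GRAY)
    hist_a = cv2.calcHist([gray_a], [0], None, [256], [0, 256])
    hist_b = cv2.calcHist([gray_b], [0], None, [256], [0, 256])
    return cv2.compareHist(hist_a, hist_b, cv2.HISTCMP_CORREL)
```

[0038]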
The automatic master adding unit 304 also has a function of extracting, as a candidate for the master image (hereinafter simply referred to as a master image candidate), a target image having a calculated collation value equal to or greater than a first threshold and equal to or less than a second threshold. The automatic master adding unit 304 is an example of an extracting unit. The first threshold is used to set a lower limit of the collation value. The second threshold is used to set an upper limit of the collation value. The second threshold is set higher than the first threshold. The first threshold and the second threshold are retained in, e.g., the external storage device 104. The automatic master adding unit 304 causes the external storage device 104 to temporarily retain the extracted master image candidate.
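The threshold band can then be applied as a filter; a minimal sketch, assuming the collation_value_template helper from the previous fragment and illustrative threshold values:

```python
FIRST_THRESHOLD = 0.70   # lower limit of the collation value (assumed value)
SECOND_THRESHOLD = 0.95  # upper limit, set higher than the first threshold

def extract_master_candidate(target_roi, initial_master):
    """Return the image as a master image candidate, or None to discard it."""
    value = collation_value_template(target_roi, initial_master)
    # Only images that match well, but not perfectly, become candidates.
    if FIRST_THRESHOLD <= value <= SECOND_THRESHOLD:
        return target_roi  # temporarily retained as a master image candidate
    return None
```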
[0039]
The automatic master adding unit 304 also has a function of allowing a user to select a master image candidate to be registered as a master image from among a plurality of master image candidates. Specifically, the automatic master adding unit 304 reads out the plurality of master image candidates from, e.g., the external storage device 104 and displays the plurality of master image candidates in a list on the display 105. The display 105 is an example of a presentation device. Such a process in which the image processing apparatus 3 automatically presents the master image candidates is referred to as “semi-automatic setting.” As illustrated in FIG. 8, in response to an operation of the mouse 108 or the keyboard 107, the automatic master adding unit 304 sorts the master image candidates listed on a sorting screen 410 displayed on the display 105 into two groups: the master image candidates to be registered as master images and the master image candidates not to be registered as master images. In the example illustrated in FIG. 8, numbers assigned to the respective master image candidates are vertically arranged in the center of the sorting screen 410. The user drags a desired number with a pointer 411 of the mouse 108, moves the number to one of a “use” area 412 and a “not use” area 413 arranged laterally on the sorting screen 410 in FIG. 8, and drops the number, thus registering or not registering the master image candidate corresponding to the number as a master image. Here, the “use” area 412 is an area to which master image candidates to be registered as master images are allocated.
FIG. 8 illustrates an example in which master image candidates No. 5 and No. 4 are selected as master images. On the other hand, the “not use” area 413 is an area to which master image candidates not to be registered as master images are allocated. FIG. 8 illustrates an example in which master image candidate No. 8 is not registered as a master image. The mouse 108 and the keyboard 107 are examples of an operation device. A master image candidate selected for registration is registered as a new master image and used for comparison and collation with a target image. By contrast, a master image candidate not selected for registration is discarded together with its image data. The automatic master adding unit 304 is an example of a registering unit.
[0040]
The determining unit 305 is a functional unit that determines whether an object to be inspected is normal or abnormal, based on template matching between a target image and a master image. The determining unit 305 is an example of a determining unit.
[0041]
In addition, each time the determining unit 305 determines that the object is normal, the determining unit 305 causes the external storage device 104 to retain a use history of the master image used for the determination. The external storage device 104 is an example of a storage device. The use history of the master image is, for example, a use history 405 as illustrated in FIG. 9. As illustrated in FIG. 9, the use history 405 is linked with each master image (e.g., M3, M5, and M1) retained in the external storage device 104 and is thus retained in the external storage device 104. The determining unit 305 has a function of acquiring a plurality of registered master images in descending order according to the use history 405 and determining whether the object is normal or abnormal. Specifically, for example, the determining unit 305 sorts the plurality of master images retained in the external storage device 104 in descending order according to the use history 405 and determines whether the object is normal or abnormal in order from the master image with the largest number of uses in the use history 405.
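For instance, if each registered master image carried a use-history counter, the descending ordering could be sketched as below; the MasterImage structure is an assumption made for illustration.

```python
from dataclasses import dataclass
from typing import Any, List

@dataclass
class MasterImage:
    name: str       # e.g., "M1", "M3", "M5"
    image: Any      # pixel data of the master image
    use_count: int  # use history linked with the master image

def masters_in_descending_use_order(masters: List[MasterImage]) -> List[MasterImage]:
    # Collate against frequently matching masters first to raise the
    # probability of an early "normal" determination.
    return sorted(masters, key=lambda m: m.use_count, reverse=True)
```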
[0042]
The setting data retaining unit 306 is a functional unit that retains, e.g., the detection frame information, the master image information, and use history information of a master image. The setting data retaining unit 306 is an example of a retaining unit. The setting data retaining unit 306 is implemented by the external storage device 104 illustrated in FIG. 2.
[0043]
The determination data output unit 307 is a functional unit that outputs determination result data. The determination data output unit 307 controls display of the display 105. When the determining unit 305 determines that the object is normal, the determination data output unit 307 displays “OK” in association with the detection frame 420 displayed on the display 105, for example, as illustrated in FIG. 14A. By contrast, when the determining unit 305 determines that the object is abnormal, the determination data output unit 307 displays “NG” in association with the detection frame 420, for example, as illustrated in FIG. 14B.
[0044]
Note that some or all of the functional units illustrated in FIG. 3 may be implemented by a hardware circuit such as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA), instead of programs as software.
[0045]
Note that FIG. 3 conceptually illustrates functions of the image processing apparatus 3 as functional units. The functional configuration of the image processing apparatus 3 is not limited to the functional configuration illustrated in FIG. 3. For example, multiple independent functional units illustrated in FIG. 3 may construct an integrated functional unit. By contrast, functions held by a single functional unit illustrated in FIG. 3 may be separated from each other and held by individual functional units. In other words, a single functional unit illustrated in FIG. 3 may be divided into multiple functional units according to function.
[0046]
FIG. 4 is a flowchart illustrating an example of an initial master image generation process of an image processing apparatus according to an embodiment.
[0047]
In step S101, the video receiving unit 301 acquires image data from each of the cameras 2.
The image data is temporarily stored in the external storage device 104 as a target image. Specifically, the video receiving unit 301 acquires image data of a real-time image from the cameras 2. Alternatively, the video receiving unit 301 reads a moving image file, selects an optimum image from the moving image file while advancing or scrolling through it, and acquires image data of the optimum image. For example, in a case in which it is difficult to capture a real-time image of an inspection target at an optimum place in an automated facility inspection, the video receiving unit 301 reads a moving image file including a series of captured images and selects a still image from the moving image file. Thus, the video receiving unit 301 acquires image data from the moving image file for setting of a reference image.
[0048]
In step S102, the detection frame setting unit 302 generates or sets a detection frame based on the image data acquired in step S101. The detection frame information corresponding to the detection frame is stored in the external storage device 104.
[0049]
In step S103, the master setting unit 303 generates an initial master image for each detection frame and ends the process. The initial master image and the image information corresponding to the initial master image are stored in the external storage device 104 for each detection frame.
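Taken together, steps S101 to S103 might be sketched as follows; the grab_frame callable, the frame tuple, and the crop_detection_frame helper from the earlier fragment are illustrative assumptions.

```python
def generate_initial_master(grab_frame, frame):
    """Sketch of FIG. 4: acquire an image (S101), apply the detection
    frame (S102), and crop the initial master image (S103)."""
    target = grab_frame()                       # S101: acquire image data
    return crop_detection_frame(target, frame)  # S102/S103
```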
[0050]
FIG. 5 is a flowchart illustrating an example of an automatic master addition process of an image processing apparatus according to an embodiment.
[0051]
In step S201, the video receiving unit 301 acquires, as a camera image, image data from each of the cameras 2. The image data is temporarily stored in the external storage device 104 as a target image.
[0052]
In step S202, the automatic master adding unit 304 performs template matching between the target image and an initial master image. The automatic master adding unit 304 calculates a collation value based on the template matching between the target image and the initial master image.
[0053]
In step S203, the automatic master adding unit 304 compares the collation value calculated in step S202 with an upper limit threshold and a lower limit threshold. As a result of the comparison, when the collation value is equal to or less than the upper limit threshold and equal to or greater than the lower limit threshold (YES in step S203), the process proceeds to step S204. By contrast, when the collation value exceeds the upper limit threshold or is less than the lower limit threshold (NO in step S203), the automatic master adding unit 304 discards the calculated collation value, and the process returns to step S201.
[0054]
In step S204, the automatic master adding unit 304 extracts a target image having a collation value equal to or less than the upper limit threshold and equal to or greater than the lower limit threshold and adds the target image as a master candidate. The master candidate is retained as a master image candidate in the external storage device 104.
[0055]
In step S205, the automatic master adding unit 304 determines whether or not the automatic master adding unit 304 has repeated the process a certain number of times. When the automatic master adding unit 304 has not repeated the process the certain number of times (NO in step S205), the process returns to step S201. By contrast, when the automatic master adding unit 304 has repeated the process the certain number of times (YES in step S205), the process ends. The certain number of times corresponds to, e.g., a desired number of master image candidates.
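Combining steps S201 to S205, the automatic master addition loop might be sketched as follows, reusing the assumed helpers above; the frame-grabbing callable and the desired candidate count are illustrative.

```python
def automatic_master_addition(grab_frame, initial_master, desired_candidates=10):
    """Sketch of FIG. 5: repeat S201 to S205 until enough candidates exist."""
    candidates = []
    while len(candidates) < desired_candidates:           # S205
        target_roi = grab_frame()                         # S201: camera image
        value = collation_value_template(target_roi, initial_master)  # S202
        if FIRST_THRESHOLD <= value <= SECOND_THRESHOLD:  # S203
            candidates.append(target_roi)                 # S204: add candidate
        # Otherwise the collation value is discarded and the loop repeats.
    return candidates
```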
[0056]
FIG. 6 is a flowchart illustrating an example of a matching determination process of an image processing apparatus according to an embodiment.
[0057]
In step S301, the video receiving unit 301 acquires, as a camera image, image data from each of the cameras 2. The image data is temporarily stored in the external storage device 104 as a target image.
[0058]
In step S302, the determining unit 305 determines whether or not a determination trigger is turned on. When the determination trigger is turned on (YES in step S302), the process proceeds to step S303. By contrast, when the determination trigger is not turned on (NO in step S302), the process returns to step S301. The determination trigger is set to perform determination at a selected time as in a sampling inspection, for example, instead of a total inspection.
[0059]
In step S303, the automatic master adding unit 304 calculates a collation value based on template matching between a target image and an Nth master image of a plurality of master images. Note that N is a natural-number variable. For example, N may be a randomly selected number in a case in which the plurality of master images is arranged in arbitrary order. In a case in which the plurality of master images is arranged in descending order according to frequency of use, N = 1 indicates the master image having the highest frequency of use.
[0060]
In step S304, the determining unit 305 determines whether or not the collation value is equal to or greater than a threshold. The threshold is, e.g., the lower limit threshold described above. As a result of the determination, when the collation value is equal to or greater than the threshold (YES in step S304), the process proceeds to step S305. By contrast, when the collation value is less than the threshold (NO in step S304), the process proceeds to step S306.
[0061]
In step S305, the determining unit 305 displays, e.g., “OK” in association with a detection frame displayed on the display 105 and ends the process.
[0062]
In step S306, the determining unit 305 increments the variable N, setting N = N + 1.
[0063]
In step S307, the determining unit 305 determines whether or not the variable N is greater than the number of master images. When the variable N is equal to or less than the number of master images (NO in step S307), the process returns to step S303. By contrast, when the variable N is greater than the number of master images (YES in step S307), the process proceeds to step S308. Here, the number of master images is the number of master images registered as master images and stored in the external storage device 104.
[0064]
In step S308, the determining unit 305 displays, e.g., “NG” in association with the detection frame displayed on the display 105 and ends the process.
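Steps S303 to S308 amount to iterating over the registered master images until one matches; a compact sketch, assuming the masters are already sorted in descending order of use history and reusing the assumed helpers above:

```python
def determine(target_roi, masters, lower_threshold=0.70):
    """Sketch of FIG. 6: return 'OK' if any master matches, else 'NG'."""
    for master in masters:                    # N = 1, 2, ... (S303, S306, S307)
        value = collation_value_template(target_roi, master.image)
        if value >= lower_threshold:          # S304
            master.use_count += 1             # retain the use history
            return "OK"                       # S305
    return "NG"                               # S308: no master matched
```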
[0065]
As described above, the image processing apparatus 3 of the present embodiment retains an initial master image to be compared and collated with an object to be inspected included in a target image with, e.g., the setting data retaining unit 306 serving as a retaining unit. The image processing apparatus 3 calculates a collation value based on template matching between the target image and the initial master image with, e.g., the automatic master adding unit 304 serving as a collation value calculating unit. The image processing apparatus 3 presents, as a master image candidate, the target image having a collation value equal to or greater than the first threshold and equal to or less than the second threshold to a user with, e.g., the display 105 serving as a presentation device. The image processing apparatus 3 causes the user to select whether or not to register each master image candidate as a master image with, e.g., the mouse 108 serving as an operation device. The image processing apparatus 3 registers, as a master image, the master image candidate selected to be registered with, e.g., the automatic master adding unit 304 serving as a registering unit.
[0066]
Typically, a user needs to specify a detection frame for each of a plurality of master images by dragging a mouse, for example. By contrast, according to the configuration described above, once a user registers an initial master image, the user simply selects whether or not to register each master image candidate. Thus, the configuration described above facilitates the registration of master images, simplifying the setting operation involved in the registration.
[0067]
In addition, in the image processing apparatus 3 of the present embodiment, each time the determining unit 305 determines that the object is normal, the determining unit 305 stores the use history 405 of the master image used for the determination in the external storage device 104. The determining unit 305 acquires a plurality of registered master images in descending order according to the use history 405 and determines whether the object is normal or abnormal. As the number of registered master images increases, the template matching takes longer. Therefore, the determining unit 305 compares and collates the target image with the master images in descending order according to the use history 405, increasing the probability of an early determination as to whether or not the object is normal and accelerating the processing. As a result, the image processing apparatus 3 of the present embodiment shortens the period of time taken to determine whether or not the object is normal.
[0068]
In the present embodiment, the automatic master adding unit 304 calculates a collation value based on template matching between a target image and an initial master image.
Alternatively, the automatic master adding unit 304 may calculate a collation value based on the difference in the number of pixels between the target image and the initial master image. As yet another alternative, the collation value may be obtained from results of machine learning.
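A pixel-difference collation value could, for example, be sketched as the fraction of pixels that differ by more than a small tolerance; the tolerance and the normalization are assumptions made for this example, and a 3-channel (BGR) image is assumed.

```python
import cv2
import numpy as np

def collation_value_pixel_difference(target_roi, master, tolerance=10):
    # Absolute per-pixel difference, reduced over the color channels.
    diff = cv2.absdiff(target_roi, master).max(axis=-1)
    differing = np.count_nonzero(diff > tolerance)
    total = target_roi.shape[0] * target_roi.shape[1]
    # 1.0 means no differing pixels; 0.0 means every pixel differs.
    return 1.0 - differing / total
```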
[0069]
In the present embodiment, based on comparison and collation between a target image including an object to be inspected and a reference image registered in advance, the determining unit 305 determines whether or not the object is normal. Alternatively, the determining unit 305 may determine whether or not the object is normal in another way.
FIGS. 16A to 16C are illustrations of a grease application inspection. As illustrated in FIG. 16A, in a grease application inspection, the state of grease applied by a grease application cylinder (or dispenser) 600 to an object 500 to be inspected is examined based on image data captured by the camera 2.
[0070]
FIG. 16B illustrates a target image including the object 500 before the grease is applied. FIG. 16C illustrates the object 500 after the grease is applied. A normal object to be inspected is the same for each inspection. For example, in a case in which a screw attached in a product is the object to be inspected, the same screw is attached in each product. Therefore, the image of the screw serves as a master image. By contrast, in a case in which the object to be inspected is a viscous semi-liquid such as the grease 501 illustrated in FIG. 16C, the shape of the object is not the same for each inspection.
[0071]
FIG. 17 is a diagram illustrating image examples of the grease 501. As illustrated in FIG. 17, since the shape of the grease 501 is not the same for each inspection, if images of all the shapes of the grease 501 were registered as master images, an enormous number of images would be registered as master images.
[0072]
FIG. 18 is an illustration of a case in which a background image is registered as a master image. To prevent such an unfavorable situation in which an enormous number of images are registered as master images, as illustrated in FIG. 18, the setting data retaining unit 306 may retain, as a reference background image, a background image corresponding to an initial reference image. The automatic master adding unit 304 may register the background image without grease as a master image, instead of registering an image of the grease as a master image. In this case, the automatic master adding unit 304 calculates a collation value indicating a degree of matching between a target image and the reference background image, based on template matching between the target image and the reference background image, for example. The determining unit 305 determines that an object to be inspected is normal in a case in which the collation value indicating the degree of matching between the target image and the reference background image is equal to or less than a given threshold.
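A minimal sketch of this inverted determination, reusing the assumed collation helper from the earlier fragment: the object is judged normal when the reference background no longer matches, i.e., when the grease hides it. The threshold value is illustrative.

```python
def determine_grease_applied(target_roi, reference_background, threshold=0.60):
    # Inverted logic: a LOW match with the bare background means the grease
    # now covers it, so the object to be inspected is judged normal (OK).
    value = collation_value_template(target_roi, reference_background)
    return "OK" if value <= threshold else "NG"
```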
[0073]
FIG. 19 is a diagram illustrating an example of an inspection screen. As illustrated in FIG. 19, when the grease is applied and the background image is hidden by the grease, the determining unit 305 determines OK. By contrast, when the background image remains visible, the determining unit 305 determines NG. Thus, the number of master images is limited.
[0074]
The above-described embodiments are illustrative and do not limit the present invention.
Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
[0075]
The present invention can be implemented in any convenient form, for example using dedicated hardware, or a mixture of dedicated hardware and software. The present invention may be implemented as computer software implemented by one or more networked processing apparatuses. The processing apparatuses include any suitably programmed apparatuses such as a general purpose computer, a personal digital assistant, a Wireless Application Protocol (WAP) or third-generation (3G)-compliant mobile telephone, and so on. Since the present invention can be implemented as software, each and every aspect of the present invention thus encompasses computer software implementable on a programmable device. The computer software can be provided to the programmable device using any conventional carrier medium (carrier means). The carrier medium includes a transient carrier medium such as an electrical, optical, microwave, acoustic or radio frequency signal carrying the computer code. An example of such a transient medium is a Transmission Control Protocol/Internet Protocol (TCP/IP) signal carrying computer code over an IP network, such as the Internet. The carrier medium also includes a storage medium for storing processor readable code such as a floppy disk, a hard disk, a compact disc read-only memory (CD-ROM), a magnetic tape device, or a solid state memory device.
[0076]
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
[0077]
This patent application is based on and claims priority to Japanese Patent Application Nos. 2020-051878, filed on March 23, 2020, and 2021-032051, filed on March 1, 2021, in the Japan Patent Office, the entire disclosure of each of which is hereby incorporated by reference herein.
[Reference Signs List]
[0078]
1 image processing system
2a to 2d cameras
3 image processing apparatus
4 hub
301 video receiving unit
302 detection frame setting unit
303 master setting unit
304 automatic master adding unit
305 determining unit
306 setting data retaining unit
307 determination data output unit


[CLAIMS]
[Claim 1]
An image processing apparatus comprising:
a determining unit configured to determine, based on comparison and collation between a target image including an object to be inspected and a reference image registered in advance, whether or not the object is normal;
an image acquiring unit configured to repeatedly acquire the target image;
a retaining unit configured to retain an initial reference image to be compared and collated with the object included in the target image;
a collation value calculating unit configured to calculate a collation value indicating a degree of matching between the target image and the initial reference image;
an extracting unit configured to extract, as a candidate for the reference image, the target image having the calculated collation value equal to or greater than a first threshold and equal to or less than a second threshold;
a presentation device configured to present the candidate;
an operation device configured to allow selection as to whether or not to register the candidate as the reference image; and
a registering unit configured to register, as the reference image, the candidate selected to be registered.
[Claim 2]
The image processing apparatus according to claim 1, wherein the collation value calculating unit is configured to calculate the collation value based on template matching between the target image and the initial reference image.
[Claim 3]
The image processing apparatus according to claim 1 or 2, further comprising
a storage device configured to store, in response to determination that the object is normal, a use history of the reference image used for the determination,
wherein the determining unit is configured to acquire a plurality of reference images, including the reference image, registered by the registering unit in descending order according to the use history, to determine whether or not the object is normal.
[Claim 4]
The image processing apparatus according to any one of claims 1 to 3,
wherein the retaining unit is configured to retain a background image corresponding to the initial reference image as a reference background image,
wherein the collation value calculating unit is configured to calculate another collation value indicating a degree of matching between the target image and the reference background image, and
wherein the determining unit is configured to determine that the object is normal in a case in which said another collation value is equal to or less than a threshold.
[Claim 5]
An image processing method comprising:
determining, based on comparison and collation between a target image including an object to be inspected and a reference image registered in advance, whether or not the object is normal;
repeatedly acquiring the target image;
retaining, in a storage device, an initial reference image to be compared and collated with the object included in the target image;
calculating a collation value indicating a degree of matching between the target image and the initial reference image;
extracting, as a candidate for the reference image, the target image having the calculated collation value equal to or greater than a first threshold and equal to or less than a second threshold;
presenting, with a presentation device, the candidate;
allowing selection, with an operation device, as to whether or not to register the candidate as the reference image; and
registering, as the reference image, the candidate selected to be registered in the storage device.