WO2012050074A1 - 画像処理装置および画像処理方法 - Google Patents
画像処理装置および画像処理方法 Download PDFInfo
- Publication number
- WO2012050074A1 (PCT/JP2011/073301)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- processing
- processing target
- image processing
- image
- setting
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/0008—Industrial image inspection checking presence/absence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30128—Food products
Definitions
- the present invention relates to an image processing apparatus and an image processing method for executing image processing for each of a plurality of processing target areas defined for an input image.
- Generally, an input image is obtained by imaging a measurement object (hereinafter also referred to as “work”), and an image processing apparatus that executes image processing on a predetermined processing target area in the input image is in widespread use.
- A typical example of such image processing is matching processing (hereinafter also referred to as “pattern matching processing”) based on a previously registered pattern (hereinafter also referred to as “model”).
- With pattern matching processing, defects such as scratches and dust appearing on the workpiece can be detected, and areas on the workpiece similar to the model can be found.
- Processing for performing inspection, identification, and the like on the workpiece using such image processing results is collectively referred to as “measurement processing”.
- Patent Document 1 (JP 2009-111886 A) discloses an example of pattern matching processing.
- With the image processing apparatus disclosed in Patent Document 1, it is possible to perform processing such as searching an input image for an area that matches a pre-registered model.
- In some applications, an input image is acquired by including an entire set of a plurality of workpieces in one imaging range, and measurement processing is executed on each workpiece included in the acquired input image.
- Patent Document 2 discloses a method for searching for a plurality of workpieces in one search area.
- Patent Document 3 discloses a shape inspection method for performing an accurate defect inspection even when noise exists between image patterns representing a repetitive pattern.
- Patent Document 1: JP 2009-111886 A; Patent Document 2: JP 07-078257 A; Patent Document 3: JP 2009-300431 A
- However, the method of Patent Document 2 evaluates each workpiece individually, so the process of evaluating a plurality of workpieces as a whole becomes complicated.
- In the method of Patent Document 3, an inspection region having a repetitive pattern is divided automatically, but the automatic division takes time and may fail. If it fails, the measurement process is interrupted even when the position and number of the arranged products are known, which can hinder productivity. Moreover, when a product (work) that should be included in a package is absent, the absence must be detected, yet an empty position is not subject to automatic division and therefore cannot be detected. Furthermore, the method disclosed in Patent Document 3 does not evaluate the plurality of workpieces as a whole.
- An object of the present invention is to provide an image processing apparatus and an image processing method capable of performing appropriate measurement processing on an input image in which a plurality of objects to be subjected to image processing are regularly arranged.
- an image processing apparatus that executes image processing for each of a plurality of processing target areas defined for an input image.
- The image processing apparatus receives a setting related to common image processing executed for each of the plurality of processing target areas, receives a setting of a reference area for defining the plurality of processing target areas for the input image, and receives a setting for regularly defining the plurality of processing target areas with reference to the reference area. It then executes image processing on each of the plurality of processing target areas according to the setting related to the common image processing, and outputs an overall processing result reflecting the result of each image processing for the plurality of processing target areas.
- the image processing includes processing for determining whether or not a preset condition is satisfied.
- The image processing apparatus further accepts the setting of a determination condition regarding the number of processing target areas having a specific determination result among the plurality of processing target areas, and outputs, as the overall processing result, whether or not the determination results in the plurality of processing target areas satisfy the determination condition.
- the determination result in each of the plurality of processing target areas is output by changing the display mode on the input image.
- The image processing apparatus further accepts a setting for individually enabling or disabling each of the plurality of processing target areas as an execution target of image processing. Image processing is skipped for processing target areas that have been disabled as execution targets.
- the image processing apparatus further displays an input image and a plurality of processing target areas set for the input image.
- A selected processing target area is specified from among the plurality of displayed processing target areas, and it is determined whether that processing target area is enabled or disabled as an image processing execution target.
- the image processing apparatus defines a plurality of processing target areas on the input image so that adjacent processing target areas satisfy the accepted setting.
- The plurality of processing target areas are redefined on the input image when a new setting of the reference area is received and/or when a new setting for regularly defining the plurality of processing target areas is received.
- a plurality of processing target areas are defined in a matrix with respect to a rectangular reference area.
- the plurality of processing target areas are defined in a staggered pattern.
- the plurality of processing target areas are defined so as not to overlap each other and to be inscribed in a reference area set to an arbitrary shape.
- a plurality of processing target areas are defined radially around a point in the reference area.
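The matrix and staggered layouts described above can be sketched as follows. This is an illustrative Python sketch only; the names `Rect`, `matrix_areas`, and `staggered_areas` are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int  # left edge (pixels)
    y: int  # top edge (pixels)
    w: int  # width
    h: int  # height

def matrix_areas(reference: Rect, rows: int, cols: int,
                 gap_x: int = 0, gap_y: int = 0) -> list[Rect]:
    """Regularly define rows x cols processing target areas inside a
    rectangular reference area, separated by optional gaps."""
    cell_w = (reference.w - gap_x * (cols - 1)) // cols
    cell_h = (reference.h - gap_y * (rows - 1)) // rows
    areas = []
    for r in range(rows):
        for c in range(cols):
            areas.append(Rect(reference.x + c * (cell_w + gap_x),
                              reference.y + r * (cell_h + gap_y),
                              cell_w, cell_h))
    return areas

def staggered_areas(reference: Rect, rows: int, cols: int) -> list[Rect]:
    """Staggered layout: every other row is shifted by half a cell
    and holds one fewer area."""
    cell_w = reference.w // cols
    cell_h = reference.h // rows
    areas = []
    for r in range(rows):
        offset = cell_w // 2 if r % 2 else 0
        n = cols - 1 if r % 2 else cols
        for c in range(n):
            areas.append(Rect(reference.x + offset + c * cell_w,
                              reference.y + r * cell_h,
                              cell_w, cell_h))
    return areas
```

For example, a 4 × 6 PTP sheet would be covered by `matrix_areas(reference, 4, 6)`, one area per tablet pocket.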
- the image processing includes matching processing using a single model registered in advance.
- an image processing method for executing image processing for each of a plurality of processing target areas defined for an input image.
- The image processing method includes: a step of accepting a setting related to common image processing executed for each of the plurality of processing target areas; a step of accepting a setting of a reference area for defining the plurality of processing target areas for the input image; a step of accepting a setting for regularly defining the plurality of processing target areas with reference to the reference area; a step of executing image processing on each of the plurality of processing target areas according to the setting related to the common image processing; and a step of outputting an overall processing result reflecting the results of the respective image processing for the plurality of processing target areas.
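The steps of the method above can be sketched in Python as follows; `measure` stands in for whatever common image processing (e.g. pattern matching) has been configured, and all names are illustrative assumptions rather than part of the disclosure.

```python
from typing import Callable

# A processing target area as (x, y, width, height); the measure
# function returns True ("OK") or False ("NG") for one area.
Area = tuple[int, int, int, int]

def run_measurement(image, areas: list[Area],
                    measure: Callable[[object, Area], bool],
                    min_ok: int) -> dict:
    """Execute the common image processing on every processing target
    area and return an overall result reflecting each area's result."""
    per_area = [measure(image, a) for a in areas]  # per-area image processing
    ok_count = sum(per_area)
    return {
        "per_area": per_area,                # individual OK/NG results
        "ok_count": ok_count,
        "overall_ok": ok_count >= min_ok,    # overall judgment condition
    }
```

The same `measure` callable is applied to every area, which mirrors the "common image processing" setting: conditions are configured once and executed independently per area.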
- FIG. 1 is a schematic diagram showing the overall configuration of a visual sensor system including an image processing apparatus according to an embodiment of the present invention. FIG. 2 is a schematic diagram showing an example of a workpiece handled by that system.
- FIG. 3 is a schematic configuration diagram of the image processing apparatus according to the embodiment of the present invention. FIG. 4 is a flowchart showing the procedure of the overall processing executed by the image processing apparatus. The subsequent figures show examples of user interface screens, such as those for the model registration processing and the area setting processing, provided by the image processing apparatus.
- A. Overview: In the image processing apparatus according to the present embodiment, a plurality of processing target areas are set for the input image.
- The image processing apparatus executes image processing (measurement processing) on each of the set processing target areas, and outputs an overall processing result reflecting the result of the image processing for each processing target area.
- the image processing apparatus regularly defines a plurality of processing target areas based on the reference area in response to the setting of the reference area. In this way, it is possible to simultaneously set conditions related to image processing for a plurality of workpieces, and for example, it is possible to perform image processing independently for each processing target area associated with each of the plurality of workpieces. Therefore, the condition setting can be simplified and the measurement process can be appropriately executed.
- FIG. 1 is a schematic diagram showing an overall configuration of a visual sensor system 1 including an image processing apparatus 100 according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating an example of a workpiece targeted by the visual sensor system 1 including the image processing apparatus 100 according to the embodiment of the present invention.
- the visual sensor system 1 is incorporated in a production line or the like, and performs a measurement process on the work set 2.
- the visual sensor system 1 according to the present embodiment is adapted to measurement processing for a set of workpieces in which a plurality of workpieces are regularly arranged.
- the measurement process executed by the image processing apparatus 100 typically includes a search process and a labeling process in many cases.
- the search process is a process in which a characteristic part of a work is registered in advance as an image pattern (model), and a part of the input image that is most similar to the pre-registered model is searched from the input image. At this time, the position / tilt / rotation angle of the portion most similar to the model, a correlation value indicating how similar to the model, and the like are calculated.
- In the labeling process, parts that match a pre-registered model or display attribute (color, etc.) are searched for in the input image, and a label (number) is assigned to each part found. Using these numbers, the area or the position of the center of gravity of a designated part can be calculated in response to the designation of its number.
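A minimal sketch of the search process described above, using normalized cross-correlation over NumPy arrays; this is an illustrative implementation only (it omits tilt/rotation, and the function name is hypothetical).

```python
import numpy as np

def search_best_match(image: np.ndarray, model: np.ndarray):
    """Slide the registered model over the input image and return the
    (row, col) of the best match plus its normalized correlation value
    (1.0 = identical, near 0 = unrelated)."""
    ih, iw = image.shape
    mh, mw = model.shape
    m = model - model.mean()
    m_norm = np.sqrt((m * m).sum())
    best_pos, best_corr = (0, 0), -1.0
    for r in range(ih - mh + 1):
        for c in range(iw - mw + 1):
            w = image[r:r + mh, c:c + mw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * m_norm
            corr = (wz * m).sum() / denom if denom > 0 else 0.0
            if corr > best_corr:
                best_corr, best_pos = corr, (r, c)
    return best_pos, best_corr
```

The returned correlation value plays the role described in the text: it quantifies how similar the best-matching portion is to the model, and can later be compared against a threshold.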
- FIG. 1 shows an example of an inspection line of a press-through package (hereinafter also referred to as “PTP”) in which tablets and the like are packaged.
- each tablet packaged in a PTP which is an example of the work set 2 corresponds to a work.
- FIG. 1 shows a state in which each PTP is missing one tablet, although 4 × 6 tablets should be packaged when correct.
- In such an application, imaging is performed so that an image corresponding to at least one PTP is included in one input image, measurement processing is executed on the input image, and a missing tablet is detected as shown in FIG. 1.
- FIG. 2 shows another application example, in which the measurement object is a set of beverage-filled bottles packed into each case 3. This applies, for example, to a line that inspects whether the specified number of beer bottles is packed in each case 3 before shipment.
- FIG. 2 shows a state in which one bottle is missing in the case 3 on the right side of the page.
- the visual sensor system 1 according to the present embodiment detects such a shortage according to a logic as described later.
- As described above, the image processing apparatus 100 executes image processing (measurement processing) for each of a plurality of processing target areas (that is, objects) defined for an input image, and outputs an overall processing result reflecting the result of each image processing (measurement processing) for the plurality of processing target areas (objects).
- the work set 2 is conveyed by a conveyance mechanism 6 such as a belt conveyor, and the conveyed work set 2 is imaged by the imaging device 8 at a predetermined timing.
- The imaging device 8 includes, in addition to an optical system such as a lens, an imaging element partitioned into a plurality of pixels, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the image (input image) obtained by imaging by the imaging device 8 is transmitted to the image processing device 100.
- the image processing apparatus 100 performs a pattern matching process on the input image received from the imaging apparatus 8 and displays the result on the connected display 102 or outputs the result to an external apparatus.
- the arrival of the work set 2 within the field of view of the imaging device 8 is detected by the photoelectric sensors 4 disposed at both ends of the transport mechanism 6.
- The photoelectric sensor 4 includes a light receiving unit 4a and a light projecting unit 4b arranged on the same optical axis; the arrival of the work set 2 is detected when the light receiving unit 4a detects that the light emitted from the light projecting unit 4b has been blocked by the work set 2.
- the trigger signal of the photoelectric sensor 4 is output to a PLC (Programmable Logic Controller) 5.
- the PLC 5 receives the trigger signal from the photoelectric sensor 4 and the like, and controls the transport mechanism 6 itself.
- the image processing apparatus 100 has a measurement mode for executing various image processes on the work set 2 and a setting mode for performing a model registration process to be described later. These modes can be switched by the user operating the mouse 104 or the like.
- The image processing apparatus 100 is typically a computer having a general-purpose architecture, and provides various functions, as described later, by executing a pre-installed program (instruction code). Such a program is typically distributed in a state stored in the memory card 106 or the like.
- The program according to the present embodiment may be a program module provided as part of the OS (Operating System) that calls necessary modules in a predetermined arrangement at predetermined timings to execute processing. In that case, the program itself does not include those modules, and the processing is executed in cooperation with the OS.
- the program according to the present embodiment may be a form that does not include some of such modules.
- the program according to the present embodiment may be provided by being incorporated in a part of another program. Even in that case, the program itself does not include the modules included in the other programs to be combined as described above, and the processing is executed in cooperation with the other programs. That is, the program according to the present embodiment may be in a form incorporated in such another program. A part or all of the functions provided by executing the program may be implemented as a dedicated hardware circuit.
- FIG. 3 is a schematic configuration diagram of the image processing apparatus 100 according to the embodiment of the present invention.
- The image processing apparatus 100 includes a CPU (Central Processing Unit) 110 serving as an arithmetic processing unit, a main memory 112 and a hard disk 114 serving as storage units, a camera interface 116, an input interface 118, a display controller 120, a PLC interface 122, a communication interface 124, and a data reader/writer 126. These units are connected to each other via a bus 128 so that data communication is possible.
- the CPU 110 performs various operations by developing programs (codes) stored in the hard disk 114 in the main memory 112 and executing them in a predetermined order.
- The main memory 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds, in addition to the program read from the hard disk 114, the image data acquired by the imaging device 8, work data, and information about models.
- The hard disk 114 may also store various setting values. Instead of, or in addition to, the hard disk 114, a semiconductor storage device such as a flash memory may be employed.
- the camera interface 116 mediates data transmission between the CPU 110 and the imaging device 8. That is, the camera interface 116 is connected to the imaging device 8 for imaging the work set 2 and generating image data. More specifically, the camera interface 116 can be connected to one or more imaging devices 8 and includes an image buffer 116 a for temporarily storing image data from the imaging devices 8. Then, when a predetermined number of frames of image data are accumulated in the image buffer 116a, the camera interface 116 transfers the accumulated data to the main memory 112. In addition, the camera interface 116 gives an imaging command to the imaging device 8 in accordance with an internal command generated by the CPU 110.
- the input interface 118 mediates data transmission between the CPU 110 and an input unit such as a mouse 104, a keyboard, and a touch panel. That is, the input interface 118 accepts an operation command given by the user operating the input unit.
- the display controller 120 is connected to the display 102, which is a typical example of a display device, and notifies the user of the result of image processing in the CPU 110. In other words, the display controller 120 is connected to the display 102 and controls display on the display 102.
- the PLC interface 122 mediates data transmission between the CPU 110 and the PLC 5. More specifically, the PLC interface 122 transmits information related to the state of the production line controlled by the PLC 5, information related to the workpiece, and the like to the CPU 110.
- the communication interface 124 mediates data transmission between the CPU 110 and a console (or personal computer or server device).
- the communication interface 124 typically includes Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
- A program downloaded from a distribution server or the like may be installed in the image processing apparatus 100 via the communication interface 124.
- The data reader/writer 126 mediates data transmission between the CPU 110 and the memory card 106, which is a recording medium. That is, the memory card 106 is distributed with a program to be executed by the image processing apparatus 100 stored on it, and the data reader/writer 126 reads that program from the memory card 106. Further, the data reader/writer 126 writes image data acquired by the imaging device 8 and/or processing results of the image processing apparatus 100 to the memory card 106 in response to an internal command from the CPU 110.
- The memory card 106 may be a general-purpose semiconductor storage device such as CF (Compact Flash) or SD (Secure Digital), a magnetic storage medium such as a flexible disk, or an optical storage medium such as a CD-ROM (Compact Disk Read Only Memory).
- the image processing apparatus 100 may be connected to another output device such as a printer as necessary.
- The image processing apparatus 100 has an “operation mode”, in which it actually acquires an input image of each work set 2 and executes measurement processing on the acquired input image, and a “setting mode”, in which various settings for realizing the operation desired by the user are made. The “setting mode” and “operation mode” are switched as appropriate according to user operations.
- FIG. 4 is a flowchart showing a procedure of overall processing executed by the image processing apparatus 100 according to the embodiment of the present invention. Each step shown in FIG. 4 is provided by the CPU 110 of the image processing apparatus 100 executing a program (instruction code) prepared in advance. Note that FIG. 4 includes processing procedures for both “setting mode” and “operation mode”, and the initial mode is “setting mode”.
- CPU 110 accepts model registration (step S11).
- Pattern matching (search) processing is executed using the model set in this model registration processing.
- In step S11, CPU 110 accepts the settings related to the common image processing executed for each of the plurality of processing target areas.
- the CPU 110 accepts the setting of a reference area for defining a plurality of process target areas for the input image (step S12). Further, CPU 110 accepts a setting (matrix setting) for regularly defining a plurality of processing target areas for the input image with reference to the reference area set in step S12 (step S13). At this point, a plurality of processing target areas are regularly defined according to the set value set in step S13 with respect to the input image with reference to the reference area set in step S12.
- When a new setting of the reference area is accepted in step S12, or when a new setting for regularly defining the plurality of processing target areas is accepted in step S13, the CPU 110 redefines the plurality of processing target areas on the input image. That is, when the user changes the reference area or the setting for regularly defining the plurality of processing target areas, the CPU 110 updates the defined processing target areas accordingly.
- the CPU 110 receives the measurement parameter (step S14).
- These measurement parameters include conditions for evaluating the result of the measurement processing executed for each processing target area, and conditions for outputting the overall processing result reflecting the results of the respective measurement processing for the plurality of processing target areas.
- The former conditions include a threshold value for the correlation value obtained when pattern matching processing is executed on each processing target area. That is, if the correlation value obtained as a result of the pattern matching processing is equal to or greater than the predetermined threshold value, the processing target area is determined to be “OK”; if the correlation value is less than the threshold value, it is determined to be “NG”.
- the measurement processing (image processing) executed on each processing target region includes processing for determining whether or not a condition preset as a part of the measurement parameter is satisfied.
- The latter conditions include the setting of a determination condition for the number of processing target areas having a specific determination result among the plurality of processing target areas.
- For example, if the number of processing target areas determined to be “OK” is equal to or greater than a predetermined threshold value, the input image as a whole is determined to be “OK”; if that number is less than the threshold value, the input image as a whole is determined to be “NG”.
- conditions for evaluating the entire input image are set as the overall processing result based on the determination results in each of the plurality of processing target areas.
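The two-level judgment described above (a per-area correlation threshold, then a count threshold over the areas) can be sketched as follows; this is an illustrative Python sketch, and the function name `judge` and its parameters are hypothetical.

```python
def judge(correlations: list[float], area_threshold: float,
          min_ok_areas: int) -> tuple[list[str], str]:
    """Former condition: an area is "OK" if its correlation value is at
    or above area_threshold.  Latter condition: the input image as a
    whole is "OK" if enough areas were judged "OK"."""
    per_area = ["OK" if c >= area_threshold else "NG" for c in correlations]
    overall = "OK" if per_area.count("OK") >= min_ok_areas else "NG"
    return per_area, overall
```

For a 4 × 6 PTP sheet, `min_ok_areas` would typically be 24, so that a single missing tablet makes the overall result “NG”.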
- Next, the CPU 110 accepts output parameters (step S15). The output parameters include conditions for outputting the results of the measurement processing (image processing) executed in the operation mode.
- step S16 the CPU 110 determines whether switching to the “operation mode” is instructed. If switching to the “operation mode” is not instructed (NO in step S16), the processes in and after step S11 are repeated. On the other hand, if switching to the “operation mode” is instructed (YES in step S16), the processes in and after step S21 are executed.
- steps S11 to S15 are described in series for convenience, but these processes may be executed in parallel or the execution order may be changed as appropriate.
- the CPU 110 waits for an input image acquisition timing (step S21).
- When the sensor output of the photoelectric sensor 4 (the light receiving unit 4a and the light projecting unit 4b shown in FIG. 1) indicates that the work set 2 has reached the field of view of the imaging device 8, and this detection is transmitted from the PLC 5, the CPU 110 determines that it is the acquisition timing of the input image.
- When the acquisition timing arrives, CPU 110 acquires the input image (step S22). More specifically, the CPU 110 causes the imaging device 8 to perform imaging by giving it an imaging command. Alternatively, when the imaging device 8 captures images continuously (at a predetermined frame period), the image data output from the imaging device 8 at that timing is stored as the input image. If it is determined that the acquisition timing has not yet arrived (NO in step S21), the process of step S21 is repeated.
- the CPU 110 regularly defines a plurality of processing target areas for the input image acquired in step S22 (step S23). At this time, the CPU 110 divides the image data indicating the input image in association with each processing target area. A subset of image data corresponding to each processing target area obtained by this division is a target of pattern matching processing.
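The division of the input image into per-area subsets can be sketched as a simple slicing operation; this is illustrative only, and the function name `split_into_areas` is hypothetical.

```python
import numpy as np

def split_into_areas(image: np.ndarray, areas):
    """Divide the input image data into one subset per processing
    target area; each subset is what pattern matching later runs on.
    Areas are (x, y, w, h) tuples in pixel coordinates."""
    return [image[y:y + h, x:x + w] for (x, y, w, h) in areas]
```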
- At this time, the CPU 110 defines the plurality of processing target areas on the input image so that adjacent processing target areas satisfy the setting (matrix setting) made in step S13 for regularly defining the plurality of processing target areas.
- Next, the CPU 110 executes image processing (pattern matching processing) on each of the plurality of processing target areas in accordance with the setting related to the common image processing (the pre-registered model) made in step S11 (step S24). Subsequently, the CPU 110 determines whether or not the execution result of the image processing in step S24 satisfies the condition (measurement parameters) set in advance in step S14 (step S25).
- CPU 110 repeats the processes of steps S24 and S25 by the number of processing target areas defined for the input image.
- Thereafter, the CPU 110 outputs the overall processing result reflecting the results of the respective image processing for the plurality of processing target areas (step S26). At this time, the CPU 110 outputs, as the overall processing result, whether or not the determination results in the plurality of processing target areas satisfy the determination condition (measurement parameters) set in advance in step S14. The processing for this input image is then completed.
- step S27 the CPU 110 determines whether or not an instruction to switch to the “setting mode” has been given. If switching to the “setting mode” is not instructed (NO in step S27), the processes in and after step S21 are repeated. On the other hand, if switching to the “setting mode” is instructed (YES in step S27), the processing from step S11 onward is executed.
- FIGS. 5 to 11 and FIG. 13 show examples of user interface screens provided by the image processing apparatus 100 according to the present embodiment.
- the user interface screens shown in FIGS. 5 to 11 are provided in the setting mode, and the user interface screen shown in FIG. 13 is provided in the operation mode.
- the input image acquired by the imaging device 8 is displayed, and various parameters necessary for the measurement processing according to the present embodiment can be set.
- the user interface screens shown in FIGS. 5 to 11 can be switched to each other by selecting a tab.
- an input image acquired by the imaging device 8 is displayed, and a result of a measurement process performed on the input image is displayed.
- FIG. 5 is a diagram showing an example of a user interface screen 201 related to the model registration process provided by the image processing apparatus 100 according to the embodiment of the present invention.
- the user interface screen 201 accepts settings related to common image processing executed for each of a plurality of processing target areas.
- a model registration tab 210 is displayed in a selectable manner.
- a user interface screen 201 shown in FIG. 5 is provided when the model registration tab 210 is selected.
- the user interface screen 201 includes a model parameter setting area 220, a model registration image area 228, an image display area 250, an entire display area 252, and a display control icon group 254.
- an input image generated by imaging with the imaging device 8 is displayed.
- a reference work set (reference model) is set in the field of view of the imaging device 8.
- An input image obtained by imaging the work set is displayed in the image display area 250, and the range that the user sets on this input image by operating the mouse is registered as a model.
- 3 rows ⁇ 3 columns of processing target areas (9 places) are set.
- detection target works OKW are arranged in seven of these places, a work NGW that should not be detected is arranged in one place, and no work is arranged in the remaining one place. Therefore, it is inherently necessary to determine “OK” for the seven workpieces OKW and to determine “NG” for the remaining processing target areas, or to skip the measurement process for them.
- FIG. 5 shows an example of registering a circular (including both a perfect circle and an ellipse) range as a model.
- the model area 262 is set by the user moving the cursor to the center position to be registered as a model (cursor position CRS1) and then dragging it to the outer peripheral position of the model (cursor position CRS2).
- the image in the model area 262 is registered as a model.
- the center position 260 of the model area 262 is also displayed.
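- A minimal sketch of deriving the circular model area 262 from the two cursor positions (center at CRS1, outer periphery at CRS2); the helper name and data layout are assumptions, not part of the patent:

```python
import math

def circle_from_drag(center, outer):
    """Derive a circular model area from a drag gesture: the drag start
    gives the center and the drag end a point on the outer periphery
    (hypothetical helper sketching the behavior described for FIG. 5)."""
    cx, cy = center
    ox, oy = outer
    radius = math.hypot(ox - cx, oy - cy)
    return {"center": (cx, cy), "radius": radius}

# Drag from (100, 100) to (130, 140): a 3-4-5 triangle gives radius 50.
area = circle_from_drag((100, 100), (130, 140))
```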
- the shape registered as a model can be arbitrarily set by the user. That is, when the user selects the edit button 226, a pop-up screen (not shown) for selecting a model shape is displayed, and a rectangle or a polygon can be selected on the pop-up screen.
- a plurality of models can be registered for the same input image.
- the registered models are displayed in a list in the registered image area 227 with characters indicating their shapes. In the example shown in FIG. 5, a circular model is registered, and this is displayed together with the characters “ellipse”.
- the display range / display magnification of the image displayed in the image display area 250 is changed according to the selected button.
- the entire display area 252 displays the entire image that can be displayed in the image display area 250.
- a model parameter setting area 220 for inputting settings related to the pattern matching process is displayed.
- settings such as the search mode, stability, and accuracy can be entered in the model parameter setting area 220.
- in the “correlation” search mode, search processing (pattern matching processing) is executed based on a correlation value between the model and the image in the processing target region.
- in the “shape” search mode, search processing (pattern matching processing) is executed based on a value indicating the shape of the model and of the image in the processing target region (for example, an edge code indicating the vector amount of the edge).
- the search process is executed by rotating the model within the rotation range set by the user in the numerical value input box in the rotation parameter setting area 222. Further, an angular interval (step angle) for rotating the model is also set. By appropriately setting the rotation range and the step angle according to the set model and the target processing target region, it is possible to improve the processing speed while maintaining the search accuracy.
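- The rotation range and step angle trade-off described above can be sketched as follows; the scoring function is a stand-in for the actual correlation computation, and all names are hypothetical:

```python
def rotated_search(score_at, rotation_range=(-15, 15), step_angle=5):
    """Sketch of the rotation-aware search: evaluate the match score at
    each candidate angle within the user-set rotation range, stepping by
    the step angle, and keep the best. A coarser step angle means fewer
    evaluations (faster search) at the cost of angular resolution."""
    lo, hi = rotation_range
    best_angle, best_score = None, float("-inf")
    angle = lo
    while angle <= hi:
        s = score_at(angle)  # e.g. correlation value of the model rotated by `angle`
        if s > best_score:
            best_angle, best_score = angle, s
        angle += step_angle
    return best_angle, best_score

# Hypothetical score function peaking at 10 degrees:
angle, score = rotated_search(lambda a: 1.0 - abs(a - 10) / 90.0)
```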
- the user can set the stability and accuracy related to the search process by operating the slides 224 and 225, respectively.
- increasing the stability value reduces the probability of false detection but lengthens the time required for the search process.
- likewise, increasing the accuracy value improves the accuracy of the detected coordinate position but lengthens the time required for the search process. Therefore, the user sets these parameters in consideration of the inspection time allowed for each work set.
- a “registered image display” button for displaying a registered model, a “model re-registration” button for re-registering a registered model, and a “delete” button for deleting a registered model are displayed in a selectable manner.
- FIG. 6 is a diagram showing an example of a user interface screen 202 related to the area setting process provided by the image processing apparatus 100 according to the embodiment of the present invention.
- the user interface screen 202 accepts the setting of a reference area for defining a plurality of process target areas for the input image.
- the user interface screen 202 shown in FIG. 6 is provided when the area setting tab 212 is selected.
- the size of one processing target area is set. That is, the model area 262 set in the model registration process shown in FIG. 5 is displayed superimposed on the input image, and the user operates the mouse 104 or the like to set a unit region 264 indicating one processing target area.
- the user interface screen 202 shown in FIG. 6 shows a state where the user has set a rectangular unit area 264 as an example.
- the user moves the cursor to the upper left position of the range to be the unit area 264 (cursor position CRS3), and subsequently drags it to the lower right position of the range to be the unit area 264 (cursor position CRS4).
- the unit area 264 is set.
- a reference area for defining a plurality of process target areas is set for the input image.
- the process for setting the reference area will be described with reference to FIG.
- the shape of the unit area 264 can be arbitrarily set by the user. That is, when the user selects the edit button 232, a pop-up screen (not shown) for selecting the shape of the unit region 264 is displayed, and a rectangle, a polygon, or the like can be selected on the pop-up screen.
- the set unit area 264 is displayed as a list in the registered image area 230 with characters indicating its shape. In the example shown in FIG. 6, a rectangular model is registered, and this is displayed together with the characters “rectangular”.
- a check box 234 “automatically update matrix settings” is displayed.
- the check box 234 is a button for enabling / disabling a process for linking the setting of the unit area 264 and the setting of a plurality of processing target areas for the input image.
- FIGS. 7 to 9 are diagrams showing an example of the user interface screen 203 related to the matrix setting process provided by the image processing apparatus 100 according to the embodiment of the present invention.
- the user interface screen 203 accepts the setting of a reference area for defining a plurality of process target areas for the input image.
- the user interface screen 203 shown in FIG. 7 is provided when the matrix setting tab 214 is selected.
- a reference area is set using a unit area 264 indicating one processing target area set on the user interface screen 202 shown in FIG.
- the unit area 264 set in the area setting process shown in FIG. 6 is displayed over the input image, and the user operates the mouse 104 or the like.
- the unit areas 264 are respectively arranged at two or more positions on the input image.
- a reference area is set based on a plurality of unit areas 264 arranged on the input image.
- a range inscribing the two unit areas (copy) 266 arranged by moving the unit area 264 on the user interface screen is set as the reference area.
- the user moves the unit area 264 to the upper left to place the unit area (copy) 266_1 (moves from the cursor position CRS5 to the cursor position CRS6), and then moves the unit area 264 to the lower right.
- the unit area (copy) 266_2 is arranged (moved from the cursor position CRS7 to the cursor position CRS8).
- a rectangular range having apexes at the upper left coordinate point of the unit area (copy) 266_1 and the lower right coordinate point of the unit area (copy) 266_2 is set as the reference area.
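- The derivation of the rectangular reference area from the two unit-area copies can be sketched as follows (hypothetical helper; the corner rule follows the description above):

```python
def reference_area(copy1_top_left, unit_size, copy2_top_left):
    """Compute the rectangular reference area whose upper-left corner is
    the upper-left of unit area (copy) 266_1 and whose lower-right corner
    is the lower-right of unit area (copy) 266_2 (names hypothetical)."""
    w, h = unit_size
    x1, y1 = copy1_top_left
    x2, y2 = copy2_top_left
    return (x1, y1, x2 + w, y2 + h)  # (left, top, right, bottom)

# Copy 1 at (10, 10), copy 2 at (170, 130), unit size 40 x 30:
ref = reference_area((10, 10), (40, 30), (170, 130))
```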
- typically, the user moves the unit area 264 so that it matches the work at the top position (upper left) of the work set 2 appearing in the input image, and then moves it so that it matches the work at the final position (lower right).
- alternatively, when the matrix setting tab 214 is selected and the user interface screen 203 is displayed, the unit areas (copy) 266_1 and 266_2 may be created based on the setting of the unit area 264 and displayed at default positions so as to be selectable.
- the user may set an arbitrary shape as the reference region.
- the user interface screen 203 accepts the setting of the reference area for defining a plurality of process target areas for the input image.
- the user can arbitrarily change the shapes of the unit areas (copy) 266_1 and 266_2. That is, when the user selects the edit button 232, a pop-up screen (not shown) for selecting the shape of the unit region 264 is displayed, and the size, shape, and the like can be changed on the pop-up screen.
- unit areas (copy) set on the input image are displayed in a list with characters indicating their shapes. In the example shown in FIG. 7, since two unit areas (copy) are set, this is displayed together with two “rectangular” characters.
- the user interface screen 203 accepts settings for regularly defining a plurality of processing target areas.
- a plurality of processing target regions are defined in a matrix (matrix shape) with respect to a rectangular reference region. Therefore, the user interface screen 203 accepts parameters necessary for arranging the processing target areas in a matrix in this way.
- the user interface screen 203 includes a matrix setting area 240.
- the matrix setting area 240 includes numerical value input boxes 241 and 242 for setting the number in the row direction (number of rows) and the number in the column direction (number of columns) of the processing target regions arranged in the reference region, respectively.
- a plurality of processing target areas are set with respect to the reference area. Note that the examples shown in FIGS. 7 to 9 show a state where processing target areas of 3 rows × 3 columns are defined.
- the image processing apparatus 100 defines a plurality of processing target areas on the input image so that adjacent processing target areas satisfy the settings received in the numerical value input boxes 241 and 242 of the matrix setting area 240. That is, a user interface screen 203 as shown in FIG. 8 is displayed.
- a plurality of processing target areas 267_1 to 267_9 (collectively referred to as “processing target area 267”) are arranged in a matrix with respect to the input image.
- the size of each of the processing target areas 267_1 to 267_9 is the same as the size of the unit area 264 set in FIG.
- the plurality of processing target regions can be arranged in a matrix without overlapping each other.
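- A sketch of arranging rows × columns processing target regions of the unit size inside the reference area without overlap; the even-spacing rule is an assumption, since the patent does not specify the exact spacing computation:

```python
def matrix_regions(ref, rows, cols, unit_size):
    """Place rows x cols non-overlapping processing target regions of the
    given unit size inside the reference rectangle, distributing any
    leftover space evenly between adjacent regions (assumed rule)."""
    left, top, right, bottom = ref
    w, h = unit_size
    gap_x = (right - left - cols * w) / (cols - 1) if cols > 1 else 0
    gap_y = (bottom - top - rows * h) / (rows - 1) if rows > 1 else 0
    return [(left + c * (w + gap_x), top + r * (h + gap_y), w, h)
            for r in range(rows) for c in range(cols)]

# 3 x 3 regions of 100 x 100 exactly filling a 300 x 300 reference area:
regions = matrix_regions((0, 0, 300, 300), 3, 3, (100, 100))
```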
- the reference area looks as if it was divided (see FIG. 8); therefore, the name “number of divisions” is used on the user interface screen 203 shown in FIGS. 7 to 9.
- a case where a plurality of processing target areas are arranged so as to overlap each other is also allowed. Even in this case, if attention is paid to a common portion (for example, the center point) of each processing target area, the plurality of processing target areas are still arranged in a matrix.
- the matrix setting area 240 further includes numerical value input boxes 243 and 244 for adjusting the size of the reference area, and numerical value input boxes 245 and 246 for adjusting the overall position of the plurality of processing target areas set with reference to the reference area.
- the size of the reference area is changed when the user inputs desired numbers in the numerical value input boxes 243 and 244, respectively.
- the amount of change in the width of the reference area is input to the numerical value input box 243
- the amount of change in the height of the reference area is input to the numerical value input box 244.
- the numerical values input to the numerical value input boxes 243 and 244 are preferably relative values (relative to the currently set reference area). In this way, by changing the size of the reference area, the arrangement form of the processing target areas 267_1 to 267_9 (that is, the interval between the adjacent processing target areas 267, the position of the processing target area 267, etc.) is updated.
- the arrangement position of the reference area is changed in the same manner. That is, the amount of movement of the reference area in the X direction (left and right direction on the paper) is entered in the numerical value input box 245, and the amount of movement in the Y direction (up and down direction on the paper) is entered in the numerical value input box 246. The numerical values entered in the numerical value input boxes 245 and 246 are preferably relative values (relative to the currently set reference area). By changing the arrangement position of the reference region in this way, the overall positional relationship of the processing target regions 267_1 to 267_9 with respect to the input image is updated.
- when the image processing apparatus 100 accepts a new setting of the reference region, or a new setting for regularly defining a plurality of processing target regions, it redefines the plurality of processing target areas on the input image.
- as described above, the user sets the start position and the final position of the reference area and then specifies the numbers in the row direction (up and down direction on the paper surface) and the column direction (left and right direction on the paper surface), whereby the plurality of processing target areas are regularly defined. Additionally, the user can adjust the size (width and height) of the reference area and its position (X and Y directions).
- the process target areas can be arranged regularly more easily. That is, if only the processing target regions located at the upper left and lower right (or upper right and lower left) are set among the plurality of processing target regions to be set for the input image, the remaining processing target regions are automatically set. Therefore, the processing target area can be set very easily and in a short time.
- the intended regularity may not always be satisfied. For example, as shown in FIG. 8, there is no workpiece to be detected in the second row of the leftmost column.
- therefore, for each of the plurality of defined processing target areas, it is possible to individually set whether the area is enabled or disabled as an execution target of the measurement processing (image processing).
- a pull-down menu 279 as shown in FIG. 9 is displayed.
- “validation” or “invalidation” can be selected in this pull-down menu 279.
- when “validation” is selected, the corresponding processing target area 267 becomes a target of the measurement processing (image processing); when “invalidation” is selected, the corresponding processing target area 267 is excluded from the measurement processing (image processing).
- the user interface screen 203 accepts the selection of a processing target area from among the plurality of processing target areas, and its enable/disable setting, in response to an input from an input device such as the mouse (or a touch panel) associated with the display position on the display 102.
- the display form may be varied depending on the state of validation / invalidation so that it can be grasped at a glance whether each processing target area is validated or invalidated.
- the processing target area set to be invalid may be displayed in gray (grayed out).
- FIG. 10 is a diagram showing an example of a user interface screen 204 related to the measurement parameter setting process provided by the image processing apparatus 100 according to the embodiment of the present invention.
- the user interface screen 204 accepts conditions for evaluating the results of the measurement processing (image processing) executed for each processing target area 267, and conditions for generating an overall processing result reflecting the evaluation results of the respective image processing for the plurality of processing target areas.
- the user interface screen 204 shown in FIG. 10 is provided when the measurement parameter tab 216 is selected.
- the user interface screen 204 includes a measurement condition area and an extraction condition area. These areas accept conditions for evaluating the results of measurement processing (image processing) executed on each processing target area 267.
- More specifically, the measurement condition area displays a sub-pixel processing check box 271 for setting whether or not the pattern matching processing is executed in units of sub-pixels, and a numerical value input box 272 for setting the candidate point level used when executing the sub-pixel processing. When the sub-pixel processing check box 271 is enabled, sub-pixel processing is executed on candidate points (in pixel units) having a high degree of coincidence with the pre-registered model. The value (correlation value) entered in the numerical value input box 272 is used as the condition (threshold value) for extracting the candidate points on which this sub-pixel processing is executed.
- the extraction condition area accepts conditions (threshold values) for determining which areas among those matching the pre-registered model are “OK”. More specifically, the extraction condition area displays a numerical value input box 274 for setting a threshold for the correlation value used to determine “OK” and a numerical value input box 275 for setting a threshold range for the rotation angle used to determine “OK”.
- a correlation value is calculated as a value indicating the degree of coincidence with a model registered in advance, and the model image is rotated over a predetermined range so that the degree of coincidence is maximized. Therefore, the result of the pattern matching process includes a correlation value and a rotation angle. Therefore, the correlation value obtained as a result of the pattern matching process is greater than or equal to the value set in the numerical value input box 274, and the rotation angle obtained as a result of the pattern matching process is set in the numerical value input box 275. If it is within the range, the corresponding processing target area is determined to be “OK”.
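- The per-region “OK” decision described above can be sketched as follows; the default threshold and angle range are illustrative values only, standing in for boxes 274 and 275:

```python
def region_ok(correlation, angle, corr_threshold=0.80, angle_range=(-10, 10)):
    """Per-region "OK" test sketched from the extraction conditions: the
    correlation value must be at or above the threshold (box 274) and the
    rotation angle inside the allowed range (box 275)."""
    lo, hi = angle_range
    return correlation >= corr_threshold and lo <= angle <= hi

ok = region_ok(0.92, 3)   # high correlation, small rotation
ng = region_ok(0.92, 25)  # rotated beyond the allowed range
```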
- the user interface screen 204 includes a measurement parameter area and a determination condition area. These areas accept conditions for generating an overall processing result reflecting the evaluation results of the respective image processing for a plurality of processing target areas.
- in the measurement parameter area, a radio button 273 for selecting the measurement mode is displayed.
- when “number of OK areas” is selected as the measurement mode, if the number of processing target areas determined to be “OK” satisfies the determination condition, the overall processing result is determined to be “OK”. That is, a result that the target work set 2 is “OK” is output.
- This “number of OK areas” measurement mode is suitable for processing for checking whether or not the work set 2 includes a specified number of works.
- when “number of NG areas” is selected as the measurement mode, if the number of processing target areas determined to be “NG” satisfies the determination condition, the overall processing result is determined to be “OK”.
- This “number of NG areas” measurement mode is suitable for processing for checking whether or not the number of defective products included in the work set 2 is equal to or less than a predetermined number.
- the determination condition area accepts the setting of a determination condition regarding the number of processing target areas that satisfy a preset condition among the plurality of processing target areas. More specifically, the determination condition area displays a numerical value input box 276 for setting a determination condition on the number of processing target areas having the specific determination result (that is, “OK” or “NG”) designated by the measurement mode set with the radio button 273.
- as a result of the measurement processing for the input image (work set 2), if the number of processing target areas determined to be “OK” is in the range of 0 to 9, “OK” is output as the overall processing result; otherwise, “NG” is output as the overall processing result.
- the user interface screen 204 is provided with a measurement button 277 for preliminarily executing the measurement process.
- FIG. 10 shows an example of a state in which this measurement process has been preliminarily executed. That is, among the processing target regions 267_1 to 267_9, cross (+) marks 269_1, 269_2, 269_3, 269_5, 269_6, 269_7, and 269_8 indicating the respective detected coordinate positions are displayed in the processing target regions for which the pattern matching processing succeeded. Further, area marks 268_1, 268_2, 268_3, 268_5, 268_6, 268_7, and 268_8, indicating the outer shape of the area that matches the model image as a result of the pattern matching process, are displayed together with the cross marks.
- in the processing target area 267_4, which is excluded from the measurement process, the cross mark and the area mark are not displayed. Further, since the work NGW that should not be detected is arranged in the processing target area 267_9, the cross mark and the area mark are likewise not displayed there.
- the user interface screen 204 includes a display setting area 278.
- in this display setting area 278, radio buttons for selecting the information displayed on the input image are displayed. That is, when the “correlation value” radio button is selected, the correlation value calculated by the pattern matching process is displayed in association with the corresponding processing target area; when the “angle” radio button is selected, the angle calculated by the pattern matching process is displayed in association with the corresponding processing target area.
- the judgment result in each of the plurality of processing target areas is displayed by changing the display mode on the input image.
- the outer frame is displayed in “green” for the processing target regions determined to be “OK” (in the example of FIG. 10, processing target regions 267_1, 267_2, 267_3, 267_5, 267_6, 267_7, and 267_8), and in “red” for the processing target region determined to be “NG” (the processing target region 267_9 in the example of FIG. 10).
- the outer frame of the processing target area (processing target area 267_4) that is not the target of the measurement process is displayed in “gray”.
- the user interface screen 204 outputs whether or not the determination results in each of the plurality of processing target areas satisfy the determination condition as the overall processing result. That is, the entire processing result reflecting the results of the respective image processing for the plurality of processing target areas is output. In addition, by changing the display mode on the input image, the determination result in each of the plurality of processing target areas is output.
- FIG. 11 is a diagram showing an example of a user interface screen 205 relating to output parameter setting processing provided by the image processing apparatus 100 according to the embodiment of the present invention.
- the user interface screen 205 accepts settings related to the output method of the results of measurement processing executed for a plurality of processing target areas respectively defined in the input image.
- the user interface screen 205 shown in FIG. 11 is provided when the output parameter tab 218 is selected.
- the user interface screen 205 includes an output coordinate area 281, a calibration area 282, and a comprehensive determination reflection area 283.
- a radio button for setting whether to output a value before correction of positional deviation or a value after correction of positional deviation is displayed as measurement coordinates.
- This positional deviation correction includes preprocessing applied to the input image acquired by the imaging device 8. That is, preprocessing such as enlargement, reduction, or rotation can be applied to the input image in advance in order to correct the optical characteristics of the imaging device 8 and the like, and it is selected here whether the result obtained by the pattern matching process is output in the coordinate system before or after this preprocessing is performed.
- radio buttons for setting whether to output a value before calibration processing or a value after calibration processing are displayed as measurement coordinates.
- the calibration process is a process of correcting an input image obtained by imaging a reference in advance in order to correct an error caused by the installation environment of the imaging device 8. In the calibration area 282, it is selected whether the coordinate value before the calibration process is output or the coordinate value after the calibration process is output.
- a radio button for setting whether or not the determination result for each processing target area is included in the total determination result is displayed.
- FIG. 12 is a schematic diagram showing an outline of processing executed in the “operation mode” of the image processing apparatus 100 according to the embodiment of the present invention.
- pattern matching processing is executed for each of a plurality of processing target areas according to the following procedure.
- a plurality of processing target areas are set according to the specified rule.
- the processes (2) and (3) are executed for all the processing target areas.
- the overall processing result is output based on the number of processing target areas determined as “OK” or the number of processing target areas determined as “NG”. . That is, when the measurement mode is “number of OK areas”, the number of process target areas determined as “OK” is calculated, and if the calculated number is within the range set as the determination condition, “OK” is output as the processing result, and “NG” is output otherwise. On the other hand, when the measurement mode is “number of NG areas”, the number of process target areas determined as “NG” is calculated, and if the calculated number is within the range set as the determination condition, “OK” is output as the processing result, and “NG” is output otherwise.
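- The overall judgment for both measurement modes described above can be sketched as follows (argument names are hypothetical):

```python
def overall_result(region_results, mode, count_range):
    """Overall judgment as described for the operation mode: count the
    regions matching the selected measurement mode ("OK areas" or
    "NG areas") and output "OK" when the count falls within the range set
    as the determination condition."""
    lo, hi = count_range
    if mode == "OK areas":
        n = sum(1 for r in region_results if r)
    else:  # "NG areas" mode counts the failing regions instead
        n = sum(1 for r in region_results if not r)
    return "OK" if lo <= n <= hi else "NG"

# Seven regions passed, one failed; the condition requires 7 to 9 OK regions:
verdict = overall_result([True] * 7 + [False], "OK areas", (7, 9))
```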
- FIG. 13 is a diagram showing an example of a user interface screen 301 provided in the “operation mode” by the image processing apparatus 100 according to the present embodiment.
- the user interface screen 301 shown in FIG. 13 notifies the user of the pattern matching processing result (“OK” or “NG”) in each processing target area by varying the corresponding display mode (the color of the outer frame that defines each processing target area).
- characters “OK” or “NG” indicating the entire processing result are displayed on the upper left.
- in this way, the result of the pattern matching process executed for each processing target area and the overall processing result that summarizes those per-area results are displayed on the same screen.
- the condition setting necessary for the measurement processing needs to be performed only once.
- a setting for defining a plurality of processing target areas requires only the specification of a reference area (entire range) and a rule (division method) for setting the processing target areas. Therefore, the setting process required before the measurement process can be started is simplified.
- compared to the case where each processing target area is manually set for the input image, the process of automatically dividing the reference area shortens the processing time and avoids time loss such as erroneous setting of a processing target region.
- the same pattern matching process (search process, labeling process, etc.) is executed in parallel on all the processing target areas, and further, these process results are obtained. Since the evaluation is performed comprehensively, a work set including a plurality of works can be reliably inspected.
- the reference area is automatically set by defining two unit areas (copy) 266 on the user interface screen.
- however, it may be more user-friendly to allow the user to set the reference area to an arbitrary shape.
- an example of a user interface that allows the user to set an arbitrary shape as the reference region is shown.
- FIG. 14 is a diagram showing an example of a user interface screen 203A related to setting of a processing target area provided by the image processing apparatus according to the first modification of the embodiment of the present invention.
- the reference region 280 can be set to an arbitrary shape by the user operating a mouse or the like on the input image displayed in the image display area 250.
- a rectangular reference region 280 is set by the user performing a drag operation from the cursor position CRS9 to the cursor position CRS10.
- a plurality of process target areas are regularly defined by the same process as described above.
- FIG. 15 is a schematic diagram illustrating an example of a workpiece targeted by the image processing apparatus according to the second modification of the embodiment of the present invention.
- FIG. 15 shows an example of a lighting device in which a plurality of LEDs are mounted in each row.
- the positions where the LEDs are mounted may be shifted little by little between adjacent columns.
- a configuration is employed in which the LED mounting positions in the odd-numbered columns are different from the LED mounting positions in the even-numbered columns.
- FIG. 16 is a diagram showing an example of a user interface screen 203B related to setting of a processing target area provided by the image processing apparatus according to the second modification of the embodiment of the present invention.
- the user interface screen 203B shown in FIG. 16 is different from the user interface screen 203 shown in FIGS. 7 to 9 in that a matrix setting area 240B including more setting items is provided.
- compared with the matrix setting area 240 shown in FIGS. 7 to 9, the matrix setting area 240B further includes radio buttons 247 for selecting the objects (rows or columns) to be shifted in order to arrange the processing target areas in a zigzag manner.
- the interval at which positions are shifted is set for the target selected with the radio buttons 247.
- as shown in FIG. 16, when “1” is set as the “positional deviation interval”, a relative positional deviation is set at every other arrangement, that is, between the odd-numbered columns and the even-numbered columns.
- in addition, the displacement amounts (X direction and Y direction) of the positional deviation are set.
- a plurality of processing target areas can be defined based on the reference area.
- adjacent processing target areas are defined on the input image so as to satisfy these setting parameters.
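- The zigzag (staggered) arrangement can be sketched as follows; the parameter names mirror the “positional deviation interval” and deviation amount settings but are otherwise hypothetical:

```python
def zigzag_regions(rows, cols, unit_size, shift_interval=1, shift=(0, 20)):
    """Staggered grid sketch: every `shift_interval`-th column group is
    offset by `shift`, mimicking the LED layout where odd and even
    columns are mounted at slightly different positions."""
    w, h = unit_size
    out = []
    for r in range(rows):
        for c in range(cols):
            x, y = c * w, r * h
            # Odd column groups receive the relative positional deviation.
            if (c // shift_interval) % 2 == 1:
                x, y = x + shift[0], y + shift[1]
            out.append((x, y))
    return out

# 2 rows x 4 columns; every other column shifted down by 20:
pts = zigzag_regions(2, 4, (50, 50))
```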
- FIG. 17 is a diagram illustrating an example of a user interface screen 203C relating to setting of a processing target area provided by the image processing apparatus according to the third modification of the embodiment of the present invention.
- the user interface screen 203C shown in FIG. 17 shows an example in which a circular reference region 296 is set for the input image displayed in the image display area 250.
- the reference region 296 is not limited to this and can be set to an arbitrary shape.
- The image processing apparatus according to the present modification defines the plurality of processing target regions so that they are inscribed in the reference region set to an arbitrary shape and do not overlap each other.
- Such processing target area setting is suited, for example, to cases in which as many workpieces as possible are packed into containers of various shapes.
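A minimal sketch of this inscribing behavior follows, under the assumption that the arbitrarily shaped reference region is given as a point-containment predicate; the helper name `inscribe_areas` and the corner-based containment test are illustrative, not the patent's actual algorithm.

```python
def inscribe_areas(inside, x0, y0, x1, y1, cell_w, cell_h):
    """Tile the bounding box [x0, x1] x [y0, y1] with cell_w x cell_h
    rectangles and keep only those whose four corners all satisfy `inside`,
    so the retained processing target areas are inscribed in the reference
    region and, by construction of the grid, never overlap each other."""
    areas = []
    y = y0
    while y + cell_h <= y1:
        x = x0
        while x + cell_w <= x1:
            corners = [(x, y), (x + cell_w, y),
                       (x, y + cell_h), (x + cell_w, y + cell_h)]
            if all(inside(cx, cy) for cx, cy in corners):
                areas.append((x, y, cell_w, cell_h))
            x += cell_w
        y += cell_h
    return areas

# Example reference region: a circle of radius 50 centred at (50, 50),
# like the circular reference region 296 of FIG. 17.
circle = lambda cx, cy: (cx - 50) ** 2 + (cy - 50) ** 2 <= 50 ** 2
```

Cells near the centre survive the corner test while cells touching the bounding box's corners fall outside the circle and are discarded.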
- FIG. 18 is a diagram showing an example of a user interface screen 203D related to setting of a processing target area provided by the image processing apparatus according to the fourth modification of the embodiment of the present invention.
- A concentric or circular reference area is set for the input image displayed in the image display area 250.
- The reference area is divided in the radial direction, and a number of processing target areas according to a predetermined rule is set for each of the circles or concentric rings obtained by the division.
- The user interface screen 203D shown in FIG. 18 differs from the user interface screen 203 shown in FIGS. 7 to 9 in that a matrix setting area 240D including more setting items is provided.
- The matrix setting area 240D further includes, in addition to the matrix setting area 240 shown in FIGS. 7 to 9, a numerical value input box 294 for setting the number of radial divisions used to define the processing target areas radially.
- FIG. 18 shows an example in which the reference area is divided into three because “3” is set in the numerical value input box 294.
- the individual setting area 290 includes numerical value input boxes 291, 292, and 293 for setting the number of processing target areas assigned to each of the divided concentric circles (or circles).
- a processing target area is set for each divided area.
- A group number, that is, an identification number of a group, is set for each of the divisions in the radial direction.
- the number of divisions in the circumferential direction is set for each group.
- The number of divisions entered in the numerical value input box 292 is set as the circumferential division number for the group whose number matches the value set in the numerical value input box 291.
- an angle for starting region setting is set for each group.
- The start angle entered in the numerical value input box 293 is set as the start angle for the group whose number matches the value set in the numerical value input box 291. Accordingly, as many sets of values as the number of radial divisions set in the numerical value input box 294 are entered for the circumferential division number (numerical value input box 292) and the start angle (numerical value input box 293).
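The radial layout just described, with a radial division count and a per-group circumferential division count and start angle echoing numerical value input boxes 291 to 294, might be sketched as follows. This is an assumed reconstruction for illustration, not the patent's implementation.

```python
def radial_areas(cx, cy, radius, groups):
    """Split a circle centred at (cx, cy) into len(groups) equal-width
    concentric rings, innermost first. Each entry of `groups` is
    (divisions, start_angle_deg): the circumferential division count and
    start angle for that ring's group. Returns annular sectors as
    (r_inner, r_outer, theta_start_deg, theta_end_deg) tuples."""
    ring = radius / len(groups)
    sectors = []
    for i, (divisions, start_deg) in enumerate(groups):
        r_in, r_out = i * ring, (i + 1) * ring
        step = 360.0 / divisions
        for k in range(divisions):
            t0 = (start_deg + k * step) % 360.0
            sectors.append((r_in, r_out, t0, t0 + step))
    return sectors
```

With three groups, as in the example where “3” is set in box 294, `radial_areas(0, 0, 30, [(4, 0.0), (8, 0.0), (12, 90.0)])` produces 4 + 8 + 12 = 24 processing target sectors.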
- 1 visual sensor system, 2 work set, 3 case, 4 photoelectric sensor, 4a light receiving unit, 4b light projecting unit, 6 transport mechanism, 8 imaging device, 100 image processing device, 102 display, 104 mouse, 106 memory card, 112 main memory, 114 hard disk, 116 camera interface, 116a image buffer, 118 input interface, 120 display controller, 122 interface, 124 communication interface, 126 data reader/writer, 128 bus.
Abstract
Description
Alternatively, and more preferably, the plurality of processing target areas are defined so as to be inscribed in a reference area set to an arbitrary shape and not to overlap each other.
In the image processing apparatus according to the present embodiment, a plurality of processing target areas are set for an input image. The image processing apparatus executes image processing (measurement processing) on each of the set processing target areas, and outputs an overall processing result that reflects the result of the image processing for each processing target area.
FIG. 1 is a schematic diagram showing the overall configuration of a visual sensor system 1 including an image processing apparatus 100 according to an embodiment of the present invention. FIG. 2 is a schematic diagram showing an example of workpieces targeted by the visual sensor system 1 including the image processing apparatus 100 according to the embodiment of the present invention.
First, an overview of the overall processing executed in the image processing apparatus 100 according to the present embodiment will be described. The image processing apparatus 100 according to the present embodiment has an “operation mode” in which an input image of each work set 2 is actually acquired and measurement processing is executed on the acquired input image, and a “setting mode” for making the various settings needed to realize the operation the user desires in the operation mode. The setting mode and the operation mode are switched as appropriate in response to user operations.
FIGS. 5 to 11 and FIG. 13 show examples of user interface screens provided by the image processing apparatus 100 according to the present embodiment. The user interface screens shown in FIGS. 5 to 11 are provided in the setting mode, and the user interface screen shown in FIG. 13 is provided in the operation mode.
First, the model registration processing shown in step S11 of FIG. 4 will be described.
Next, the area setting processing shown in step S12 of FIG. 4 will be described.
Next, the matrix setting processing shown in step S13 of FIG. 4 will be described.
In this way, the user interface screen 203 accepts the setting of a reference area for defining a plurality of processing target areas on the input image.
Next, the measurement parameter setting processing shown in step S14 of FIG. 4 will be described.
Next, the output parameter setting processing shown in step S15 of FIG. 4 will be described.
Next, the processing in the “operation mode” shown in steps S21 to S26 of FIG. 4 will be described.
(5) Depending on the measurement mode that has been set, the overall processing result is output based on either the number of processing target areas judged “OK” or the number judged “NG”. That is, when the measurement mode is “number of OK areas”, the number of processing target areas judged “OK” is calculated; if this number is within the range set as the judgment condition, “OK” is output as the overall processing result, and otherwise “NG” is output. On the other hand, when the measurement mode is “number of NG areas”, the number of processing target areas judged “NG” is calculated; if this number is within the range set as the judgment condition, “OK” is output as the overall processing result, and otherwise “NG” is output.
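The overall-result decision in step (5) can be sketched as a small function. The names `mode`, `lo`, and `hi` are illustrative assumptions standing in for the measurement mode setting and the judgment-condition range; they are not identifiers from the patent.

```python
def overall_result(region_results, mode, lo, hi):
    """region_results: per-area judgments, each "OK" or "NG".
    mode: "OK_count" to count OK areas, "NG_count" to count NG areas.
    [lo, hi] is the range set as the judgment condition.
    Returns the overall processing result, "OK" or "NG"."""
    target = "OK" if mode == "OK_count" else "NG"
    n = sum(1 for r in region_results if r == target)
    # The overall result is "OK" exactly when the count falls in range.
    return "OK" if lo <= n <= hi else "NG"
```

For instance, with three areas judged `["OK", "OK", "NG"]` and the condition that 2 to 3 areas be OK, the overall result is “OK”.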
According to the image processing apparatus of the present embodiment, even when there are many workpieces to be measured, the condition settings required for measurement processing need only be made once. In particular, to define a plurality of processing target areas, it suffices to specify a reference area (overall range) and a rule (division method) for setting the processing target areas. The setting work required before measurement processing can start is therefore simplified.
(First Modification)
In the embodiment described above, as shown in FIG. 7, the reference area is set automatically by defining two unit areas (copies) 266 on the user interface screen. In contrast, when no workpiece to be detected exists at a corner (edge) of the work set 2, it may be more user-friendly to let the user set the reference area to an arbitrary shape. This modification shows an example of a user interface that allows the user to set an arbitrary shape as the reference area.
In the embodiment and the first modification described above, as an example of regularly defining a plurality of processing target areas, the processing target areas are arranged in a matrix. In contrast, the second modification describes an example in which the plurality of processing target areas are defined in a staggered (zigzag) arrangement.
In the embodiment and the first modification described above, a specified number of processing target areas are defined in the row and column directions for a rectangular reference area. In contrast, the third modification describes an example in which the maximum number of processing target areas is defined for a reference area arbitrarily set by the user. More specifically, in this modification, the plurality of processing target areas are regularly defined so that they are inscribed in the reference area set to an arbitrary shape and do not overlap each other.
In the embodiment described above, the processing target areas are defined in a matrix. In contrast, the fourth modification describes an example in which a plurality of processing target areas are defined radially around a point within the reference area.
FIG. 18 is a diagram showing an example of a user interface screen 203D related to setting of processing target areas provided by the image processing apparatus according to the fourth modification of the embodiment of the present invention. On the user interface screen 203D shown in FIG. 18, a concentric or circular reference area is set for the input image displayed in the image display area 250. The reference area is then divided in the radial direction, and a number of processing target areas according to a predetermined rule is set for each of the circles or concentric rings obtained by the division.
Claims (13)
- An image processing apparatus that executes image processing for each of a plurality of processing target areas defined on an input image, the apparatus comprising:
first setting input means for accepting a setting related to common image processing to be executed on each of the plurality of processing target areas;
second setting input means for accepting a setting of a reference area for defining the plurality of processing target areas on the input image;
third setting input means for accepting a setting for regularly defining the plurality of processing target areas with the reference area as a basis;
processing execution means for executing the image processing on each of the plurality of processing target areas in accordance with the setting related to the common image processing; and
output means for outputting an overall processing result that reflects the results of the image processing for the plurality of processing target areas. - The image processing apparatus according to claim 1, wherein the image processing includes processing for judging whether a preset condition is satisfied,
the image processing apparatus further comprising fourth setting input means for accepting a setting of a judgment condition regarding the number of processing target areas, among the plurality of processing target areas, that have a particular judgment result,
wherein the output means outputs, as the overall processing result, whether the judgment results for the plurality of processing target areas satisfy the judgment condition. - The image processing apparatus according to claim 2, wherein the output means outputs the judgment result for each of the plurality of processing target areas by varying a display mode on the input image.
- The image processing apparatus according to any one of claims 1 to 3, further comprising fifth setting input means for accepting, for each of the plurality of processing target areas, a setting for enabling or disabling the area as a target of execution of the image processing,
wherein the processing execution means skips the image processing for any processing target area, among the plurality of processing target areas, that has been disabled as a target of execution of the image processing. - The image processing apparatus according to claim 4, further comprising display means for displaying the input image and the plurality of processing target areas set for the input image,
wherein the fifth setting input means, in response to an input from an input device associated with a display position on the display means, identifies the processing target area selected from among the plurality of processing target areas and determines whether the selected processing target area should be enabled or disabled as a target of execution of the image processing. - The image processing apparatus according to any one of claims 1 to 3, further comprising area defining means for defining the plurality of processing target areas on the input image so that adjacent processing target areas satisfy the setting accepted by the third setting input means.
- The image processing apparatus according to claim 6, wherein the area defining means redefines the plurality of processing target areas on the input image in at least one of the case where a new setting of the reference area is accepted by the second setting input means and the case where a new setting for regularly defining the plurality of processing target areas is accepted by the third setting input means.
- The image processing apparatus according to claim 6, wherein the area defining means defines the plurality of processing target areas in a matrix with respect to the rectangular reference area.
- The image processing apparatus according to claim 6, wherein the area defining means defines the plurality of processing target areas in a staggered arrangement.
- The image processing apparatus according to claim 6, wherein the area defining means defines the plurality of processing target areas so that they are inscribed in the reference area set to an arbitrary shape and do not overlap each other.
- The image processing apparatus according to claim 6, wherein the area defining means defines the plurality of processing target areas radially around a point within the reference area.
- The image processing apparatus according to claim 1, wherein the image processing includes matching processing using a single pre-registered model.
- An image processing method for executing image processing for each of a plurality of processing target areas defined on an input image, the method comprising:
accepting a setting related to common image processing to be executed on each of the plurality of processing target areas;
accepting a setting of a reference area for defining the plurality of processing target areas on the input image;
accepting a setting for regularly defining the plurality of processing target areas with the reference area as a basis;
executing the image processing on each of the plurality of processing target areas in accordance with the setting related to the common image processing; and
outputting an overall processing result that reflects the results of the image processing for the plurality of processing target areas.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11832509.1A EP2629262B1 (en) | 2010-10-13 | 2011-10-11 | Image processor and image processing method |
KR20137009022A KR20130065712A (ko) | 2010-10-13 | 2011-10-11 | 화상 처리 장치 및 화상 처리 방법 |
CN201180047963.1A CN103140872B (zh) | 2010-10-13 | 2011-10-11 | 图像处理装置以及图像处理方法 |
KR1020157002475A KR101525759B1 (ko) | 2010-10-13 | 2011-10-11 | 화상 처리 장치 및 화상 처리 방법 |
US13/825,392 US20130177250A1 (en) | 2010-10-13 | 2011-10-11 | Image processing apparatus and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-230519 | 2010-10-13 | ||
JP2010230519A JP5728878B2 (ja) | 2010-10-13 | 2010-10-13 | 画像処理装置および画像処理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012050074A1 true WO2012050074A1 (ja) | 2012-04-19 |
Family
ID=45938303
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/073301 WO2012050074A1 (ja) | 2010-10-13 | 2011-10-11 | 画像処理装置および画像処理方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20130177250A1 (ja) |
EP (1) | EP2629262B1 (ja) |
JP (1) | JP5728878B2 (ja) |
KR (2) | KR101525759B1 (ja) |
CN (1) | CN103140872B (ja) |
WO (1) | WO2012050074A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014228412A (ja) * | 2013-05-23 | 2014-12-08 | 富士通周辺機株式会社 | ワークの検査装置及びワークの検査方法 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6427332B2 (ja) * | 2014-04-08 | 2018-11-21 | 株式会社ミツトヨ | 画像測定機 |
KR101712857B1 (ko) * | 2015-06-01 | 2017-03-07 | 주식회사 웰탑테크노스 | 병렬 처리 기반의 비파괴 검사를 위한 장치 및 이를 위한 방법 |
JP3207214U (ja) * | 2015-06-17 | 2016-11-04 | 済南達宝文汽車設備工程有限公司 | プレスライン末端の全部品の自動箱詰め装置 |
JP6333871B2 (ja) * | 2016-02-25 | 2018-05-30 | ファナック株式会社 | 入力画像から検出した対象物を表示する画像処理装置 |
KR102584696B1 (ko) * | 2016-04-26 | 2023-10-06 | 삼성디스플레이 주식회사 | 표시 패널의 광학 검사 방법 |
DE102016215144A1 (de) * | 2016-08-15 | 2018-02-15 | Ifm Electronic Gmbh | Verfahren zur Vollständigkeitsprüfung |
CN109964247B (zh) * | 2016-11-01 | 2023-05-05 | 株式会社富士 | 图像处理用元件形状数据生成系统及图像处理用元件形状数据生成方法 |
CN108375586B (zh) * | 2018-02-08 | 2020-12-15 | 湘潭大学 | 具有多个检测模式的缺陷检测装置及其方法 |
JP7148858B2 (ja) * | 2018-06-07 | 2022-10-06 | オムロン株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
US11336831B2 (en) * | 2018-07-06 | 2022-05-17 | Canon Kabushiki Kaisha | Image processing device, control method, and program storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08101913A (ja) * | 1994-09-30 | 1996-04-16 | Omron Corp | 円形検査対象領域の座標指示方法及び装置並びにその装置を用いた検査装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6141009A (en) * | 1997-07-16 | 2000-10-31 | Cognex Corporation | Interface for model definition |
US6349144B1 (en) * | 1998-02-07 | 2002-02-19 | Biodiscovery, Inc. | Automated DNA array segmentation and analysis |
CA2416966C (en) * | 2003-01-22 | 2007-12-11 | Centre De Recherche Industrielle Du Quebec | Method and apparatus for testing the quality of reclaimable waste paper matter containing contaminants |
US8020601B2 (en) * | 2004-01-19 | 2011-09-20 | Krones Ag | Machine for equipping articles with labels |
JP5104291B2 (ja) * | 2007-12-26 | 2012-12-19 | 富士通株式会社 | 画像解析プログラム、画像解析装置、および画像解析方法 |
CN101424645B (zh) * | 2008-11-20 | 2011-04-20 | 上海交通大学 | 基于机器视觉的焊球表面缺陷检测装置与方法 |
CN201436584U (zh) * | 2009-07-23 | 2010-04-07 | 沈阳天嘉科技有限公司 | 滚针轴承缺针检测装置 |
DE102010033549A1 (de) * | 2010-08-05 | 2012-02-09 | Krones Aktiengesellschaft | Verfahren und Vorrichtung zur Bildung von Gruppen zu verpackender Artikel und profilierte Schubleiste zur Verwendung hierbei |
-
2010
- 2010-10-13 JP JP2010230519A patent/JP5728878B2/ja active Active
-
2011
- 2011-10-11 CN CN201180047963.1A patent/CN103140872B/zh active Active
- 2011-10-11 US US13/825,392 patent/US20130177250A1/en not_active Abandoned
- 2011-10-11 WO PCT/JP2011/073301 patent/WO2012050074A1/ja active Application Filing
- 2011-10-11 KR KR1020157002475A patent/KR101525759B1/ko active IP Right Grant
- 2011-10-11 EP EP11832509.1A patent/EP2629262B1/en active Active
- 2011-10-11 KR KR20137009022A patent/KR20130065712A/ko active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08101913A (ja) * | 1994-09-30 | 1996-04-16 | Omron Corp | 円形検査対象領域の座標指示方法及び装置並びにその装置を用いた検査装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014228412A (ja) * | 2013-05-23 | 2014-12-08 | 富士通周辺機株式会社 | ワークの検査装置及びワークの検査方法 |
Also Published As
Publication number | Publication date |
---|---|
JP5728878B2 (ja) | 2015-06-03 |
KR20130065712A (ko) | 2013-06-19 |
EP2629262B1 (en) | 2024-03-20 |
EP2629262A1 (en) | 2013-08-21 |
CN103140872A (zh) | 2013-06-05 |
KR101525759B1 (ko) | 2015-06-09 |
CN103140872B (zh) | 2016-08-31 |
KR20150020723A (ko) | 2015-02-26 |
JP2012084000A (ja) | 2012-04-26 |
US20130177250A1 (en) | 2013-07-11 |
EP2629262A4 (en) | 2015-06-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5728878B2 (ja) | 画像処理装置および画像処理方法 | |
EP1560017B1 (en) | Glass bottle inspection device | |
US9728168B2 (en) | Image processing apparatus | |
US8698815B2 (en) | Image processing apparatus | |
JP5152231B2 (ja) | 画像処理方法および画像処理装置 | |
JP5371099B2 (ja) | 目視検査装置と目視検査方法 | |
JP2011163766A (ja) | 画像処理方法および画像処理システム | |
JP2012151250A (ja) | 基板検査システム | |
JP5335614B2 (ja) | 欠陥画素アドレス検出方法並びに検出装置 | |
EP2784746A1 (en) | Setting up an area sensing imaging system to capture single line images | |
CN102854195B (zh) | 彩色滤光片上缺陷坐标的检测方法 | |
US11321811B2 (en) | Imaging apparatus and driving method of the same | |
US9140541B2 (en) | Image measuring apparatus and image measuring method | |
JP2008287506A (ja) | 画像処理装置および画像処理方法 | |
JP5141470B2 (ja) | 画像合成方法および画像処理システム | |
JP2010133744A (ja) | 欠陥検出方法およびその方法を用いた視覚検査装置 | |
JP6395895B1 (ja) | 映像検品認識装置 | |
US20110221884A1 (en) | Image processing apparatus, image processing program, visual sensor system and image processing method | |
JP2011112379A (ja) | 画像処理装置および画像処理プログラム | |
JPH06258226A (ja) | 錠剤の外観検査方法 | |
JP2009168883A (ja) | 画像表示装置 | |
US10852244B2 (en) | Image processing apparatus, image processing method, and recording medium | |
JP4074718B2 (ja) | 検査装置とその検査方法 | |
US20220343643A1 (en) | Image processing apparatus and method | |
JPH0663734B2 (ja) | 表示図柄のずれ検査装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180047963.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11832509 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13825392 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20137009022 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011832509 Country of ref document: EP |