JP5759133B2 - Shape-based matching parameter adjustment device and component mounting device

Info

Publication number
JP5759133B2
Authority
JP
Japan
Prior art keywords
parameter
edge
image
search
means
Prior art date
Legal status
Active
Application number
JP2010214489A
Other languages
Japanese (ja)
Other versions
JP2012069003A (en)
Inventor
山田 和範
Original Assignee
Juki株式会社
Priority date
Filing date
Publication date
Application filed by Juki株式会社
Priority to JP2010214489A
Publication of JP2012069003A
Application granted
Publication of JP5759133B2

Description

The present invention relates to a shape-based matching parameter adjustment device used for shape-based matching, which finds an object in a search target image by comparing shape information such as edge gradients, to a method for adjusting those parameters, and to a component mounting apparatus that positions components to be mounted by shape-based matching.

As a method for finding an object in a search target image using template data created in advance, there is shape-based matching, which compares shape information such as edge gradients. In shape-based matching, the parameter settings for edge determination are very important: small differences in these settings subtly change the shape of the detected edge lines, which affects characteristics such as position detection accuracy, processing time, and robustness of the matching process and of any position detection built on it.
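To make that sensitivity concrete, the following sketch (not from the patent; the function name, file name, and the use of OpenCV's Sobel operator as the gradient filter are all illustrative assumptions) shows how a minimum edge strength threshold and a filter size decide which edge points survive:

```python
import numpy as np
import cv2

def detect_edge_points(img, min_edge_strength=32, ksize=3):
    """Return coordinates and gradient angles of pixels whose gradient
    magnitude exceeds min_edge_strength; ksize stands in for the
    edge scale (filter size) parameter."""
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=ksize)
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=ksize)
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > min_edge_strength)
    return np.column_stack([xs, ys]), np.arctan2(gy[ys, xs], gx[ys, xs])

img = cv2.imread("component.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
# Raising the threshold thins out weak (often unstable) edges; the detected
# edge line shifts slightly, which is why these settings affect accuracy,
# processing time, and robustness.
for thr in (16, 32, 64):
    pts, _ = detect_edge_points(img, min_edge_strength=thr)
    print("min_edge_strength", thr, "->", len(pts), "edge points")
```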

However, the required characteristics and adjustment levels differ for each object, and such subtle parameter adjustment has conventionally been left to the user, who must tune the parameters to the usage environment. Conventional devices provide numerical input fields for edge detection parameters such as the minimum density value, minimum edge strength, and filter size, and let the user mark edges as valid or invalid by enclosing them in a window, so that adjustment proceeds interactively.
There are also apparatuses that create template data automatically, for example by evaluating the processing performance of position detection and adopting the template candidate with the highest evaluation value as the actual template (see, for example, Patent Document 1).

Japanese Patent No. 4470503

However, when the user performs such fine parameter adjustment, the parameters cannot be set properly, and performance cannot be maximized, without knowledge of the position detection algorithm and its behavior. There is also a demand to fine-tune shape-based matching to the accuracy and processing speed each application requires, which has been difficult to realize with conventional devices.

In addition, when a component mounting apparatus positions a mounted component by position detection based on shape-based matching, a switch of component lots may change the material or color of the component, and manufacturing errors may cause subtle size fluctuations, so that the component comes to differ from the template data captured at teaching time. This can slightly degrade positioning accuracy and lengthen the tact time. Degradation at a level that does not trigger an error is hard to notice without careful observation, so it is difficult to judge when the parameters should be readjusted.

Therefore, an object of the present invention is to provide a shape-based matching parameter adjustment device, a shape-based matching parameter adjustment method, and a component mounting apparatus that create template data optimized for characteristics such as accuracy, processing time, and robustness of the shape-based matching process.

In order to solve the above problem, the shape-based matching parameter adjustment device according to claim 1 adjusts parameters used for shape-based matching processing and comprises: specifying means for specifying the accuracy and tact time required of the shape-based matching process; evaluation image acquisition means for acquiring, as evaluation images for the parameters, images of an object given arbitrary variations with respect to a reference posture; and parameter adjustment means for adjusting the parameters based on the acquired evaluation images. The parameter adjustment means comprises parameter setting means for automatically setting the parameters so as to satisfy the required accuracy specified by the user through the specifying means, and parameter updating means which, when shape-based matching performed on the evaluation images with the parameters set by the parameter setting means does not satisfy the required tact time specified by the user, evaluates the accuracy of the shape-based matching process while gradually changing the parameters in a direction that shortens the tact time, and updates the parameters to the limit values at which the required accuracy can still be maintained. The device further comprises imaging means for imaging the object, and template image acquisition means for imaging the object in the reference posture with the imaging means and acquiring that image as a template image. The evaluation image acquisition means captures, with the imaging means, a plurality of images of the object in postures obtained by adding arbitrary variations to the reference posture, and acquires these as the evaluation images. The parameter setting means comprises: extraction means for extracting, from the plurality of evaluation images captured under the same imaging conditions, the two evaluation images with the largest relative edge variation; unstable edge extraction means for extracting unstable edges whose edge coincidence between the two extracted evaluation images is lower than a predetermined determination threshold; and parameter gradual change means which evaluates the accuracy of the shape-based matching process while gradually changing the parameters in a direction that excludes the edge portions corresponding to the extracted unstable edges from the shape-based matching process, and determines as the parameters the values at which the accuracy satisfies the required accuracy.
In this way, portions where the edge detection result becomes unstable depending on the imaging environment, such as low-contrast edges, can be excluded from the shape-based matching process, so optimum parameters that satisfy the required accuracy can be set.
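The claim amounts to a two-phase control flow: first set the parameters to satisfy the required accuracy, then shorten the tact time step by step and keep the last parameter value whose accuracy still holds. A minimal sketch of that loop follows; the evaluate stub, the parameter name, and every number here are invented for illustration:

```python
def adjust_parameters(required_accuracy, required_tact, evaluate):
    """evaluate(params) -> (accuracy_error, tact_time); lower is better.
    Phase 1 (parameter setting means) is assumed already done and encoded
    in the initial params; phase 2 (parameter updating means) trades edge
    points for speed until the accuracy limit is reached."""
    params = {"max_edge_points": 1024}          # assumed initial value
    accuracy, tact = evaluate(params)
    while tact > required_tact:
        trial = dict(params, max_edge_points=params["max_edge_points"] - 128)
        if trial["max_edge_points"] <= 0:
            break
        accuracy, tact = evaluate(trial)
        if accuracy > required_accuracy:        # limit crossed:
            break                               # keep the previous params
        params = trial
    return params

# toy model: fewer edge points -> faster but less accurate
demo = lambda p: (0.01 + 100.0 / p["max_edge_points"],
                  0.05 + p["max_edge_points"] / 10000.0)
print(adjust_parameters(required_accuracy=0.2, required_tact=0.08, evaluate=demo))
```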

Furthermore, in the shape-based matching parameter adjustment device according to claim 2, in the invention according to claim 1, the parameters consist of template parameters, used when creating from the template image the template data composed of the coordinates of the object's edge points and their edge gradient vectors, and search parameters, used when detecting the object in a search target image based on the template data; and the parameter gradual change means gradually changes the template parameters in a direction that excludes the edge portions corresponding to the unstable edges extracted by the unstable edge extraction means from the template data.
In this way, by increasing the minimum edge strength or decreasing the edge scale value among the template parameters, unstable edges can be prevented from being extracted when template data is created from the template image. The unstable edges can thus be reliably excluded from the shape-based matching process.

In the shape-based matching parameter adjustment device according to claim 3, in the invention according to claim 1 or 2, the parameters likewise consist of template parameters used when creating the template data, composed of the coordinates of the object's edge points and their edge gradient vectors, from the template image, and search parameters used when detecting the object in the search target image based on the template data; and the parameter gradual change means gradually changes the search parameters in a direction that makes the edge portions corresponding to the unstable edges extracted by the unstable edge extraction means difficult to detect in the search target image.
In this way, by adjusting the minimum edge strength or the edge scale value among the search parameters, unstable edges can be kept from being extracted when edges are detected in the search target image. As a result, unstable edges can be reliably excluded from the shape-based matching process.

Furthermore, in the shape-based matching parameter adjustment device according to claim 4, in the invention according to any one of claims 1 to 3, the accuracy of the shape-based matching process is evaluated by the repeatability obtained by statistically processing the results of shape-based matching on the plurality of evaluation images.
Since the evaluation is based on repeatability, it is highly robust.
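Repeatability here is simply the spread of repeated positioning results, commonly reported as 3σ; a small sketch with invented results:

```python
import numpy as np

# Positioning results (x, y) from shape-based matching on several
# evaluation images; the numbers are made up for illustration.
results = np.array([[120.02, 80.01], [119.98, 79.97], [120.05, 80.03],
                    [119.96, 80.00], [120.01, 79.99]])
print("mean position:", results.mean(axis=0))
print("repeatability (3 sigma):", 3.0 * results.std(axis=0, ddof=1))
```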

In the shape-based matching parameter adjustment device according to claim 5, in the invention according to any one of claims 1 to 4, the shape-based matching process performs edge detection on the search target image, performs a coarse search on the edge detection result, and performs a fine search based on the result of the coarse search; and the parameter updating means evaluates the accuracy of the shape-based matching process while gradually changing the parameters used in the edge detection, coarse search, and fine search in directions that shorten their respective tact times, and updates the parameters to the limit values at which the required accuracy can still be maintained.

The component mounting apparatus according to claim 6 picks up an electronic component with a suction nozzle and mounts it at a predetermined position on a substrate, and comprises: the shape-based matching parameter adjustment device described above; storage means for storing the parameters used for the shape-based matching process; search target image acquisition means for capturing an image of the electronic component held by the suction nozzle and acquiring it as a search target image; positioning means for positioning the electronic component by performing shape-based matching on the acquired search target image using the parameters stored in the storage means; detection means for detecting a change of component lot; and determination means for determining, when the detection means detects a change of component lot, whether the parameters stored in the storage means need adjustment. When the determination means determines that adjustment is necessary, the shape-based matching parameter adjustment device adjusts the parameters.
In this way, a switch of component lots during production is detected and the necessity of parameter adjustment is judged, so the parameters can be readjusted at an appropriate timing. Shape-based matching (positioning) can therefore always run with optimum parameters, enabling proper component mounting.

According to the present invention, shape-based matching parameters optimized for characteristics such as accuracy, tact time, and robustness of the shape-based matching process can be set. Since the parameters are adjusted automatically according to the required accuracy and required tact time specified by the user, appropriate adjustment is possible even for users with no know-how about the shape-based matching algorithm.

In addition, according to the component mounting apparatus of the present invention, a change of component lot is detected and the parameters are readjusted, so readjustment happens at an appropriate timing. This prevents the slight accuracy degradation and tact time increase that would otherwise result from changes in component material or color at lot switches, or from deviations from the teaching-time template data caused by subtle size variations due to manufacturing errors.

FIG. 1 is a block diagram showing the configuration of the component mounting apparatus according to the present invention.
FIG. 2 is a diagram showing the task configuration in the image processing apparatus 12.
FIG. 3 is a flowchart showing the optimum parameter acquisition procedure of the first embodiment.
FIG. 4 is a flowchart showing the minimum edge strength setting procedure.
FIG. 5 is a flowchart showing the template creation image capturing procedure of the second embodiment.
FIG. 6 is a flowchart showing the optimum parameter acquisition procedure of the second embodiment.
FIG. 7 is a flowchart showing the accuracy parameter adjustment procedure.
FIG. 8 is a diagram explaining the method of extracting unstable edges.
FIG. 9 is a flowchart showing the tact time parameter adjustment procedure.
FIG. 10 is a diagram showing the relationship between the translation vector, expansion/contraction vector, and rotation vector at a feature point.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
(First embodiment)
(Constitution)
FIG. 1 is a block diagram of the shape-based matching parameter adjustment device according to the present invention applied to a component mounting apparatus.
In the figure, reference numeral 1 denotes a component mounting apparatus. The component mounting apparatus 1 includes a suction nozzle 3 that picks up an electronic component 2, an illumination device 4 that irradiates the electronic component 2 with light after the suction nozzle 3 has moved it to a predetermined imaging position, and a standard camera 5a and a high-resolution camera 5b that image the electronic component 2 at that position. The component mounting apparatus 1 further includes a machine control device 11 that controls the operation of the suction nozzle 3, the illumination device 4, and the cameras 5a and 5b, and an image processing device 12 that processes the images captured by the cameras 5a and 5b, performs position detection (search processing) of the object (electronic component 2) in the captured image, and positions the electronic component 2. In this embodiment, shape-based matching is used for the positioning process.

The machine control device 11 normally selects the camera 5a or 5b according to the electrode size of the electronic component 2 and, after the electronic component 2 has been picked up by the suction nozzle 3, moves the suction nozzle 3 so that the electronic component 2 is positioned at the imaging position of the selected camera. At the same time, the machine control device 11 moves the illumination device 4 so that the selected camera can capture the image, and turns the illumination on. The machine control device 11 then transmits to the image processing device 12 a command instructing execution of the positioning process, together with the channel information of the selected camera. When it receives the response (the positioning result) from the image processing device 12, the machine control device 11 moves the suction nozzle 3 to the component mounting position and mounts the electronic component 2 on the substrate. The mounting position is determined from the suction position deviation and suction angle deviation of the electronic component 2 obtained from the positioning result.

The machine control device 11 also has a function of notifying the image processing device 12 when the component lot is switched. At that time, the machine control device 11 transmits various command parameters related to adjustment of the template data used in the positioning process, including the required positioning accuracy and the required tact time. These parameters can be specified by the operator via the man-machine interface 6 connected to the machine control device 11 (specifying means).
The image processing device 12 controls the designated camera 5a or 5b in accordance with a command transmitted from the machine control device 11, takes an image of the electronic component 2, and performs various processes on the captured image.

Specifically, when the image processing device 12 receives a positioning command from the machine control device 11, it controls the camera 5a or 5b based on the camera channel information, images the electronic component 2 at the imaging position (search target image acquisition means), performs the positioning process on the captured image, and returns the result to the machine control device 11 (positioning means). When the image processing device 12 receives a production start command, it measures the accuracy and tact time of every positioning operation for a predetermined period from the start of production (for example, the time needed to mount a predetermined number of electronic components 2) and stores the repeatability and tact time in the work memory 24 described later. When the image processing device 12 receives a command notifying it of a component lot switch, it runs the process that adjusts the template parameters used to create the template data for positioning and the search parameters used in the search process (the optimum parameter acquisition process).

The image processing device 12 includes an A/D conversion unit 21, an image memory 22, a calculation unit 23, a work memory 24, a parameter storage unit 25, a control unit 26, a parallel calculation unit 27, an interface 28, and a D/A conversion unit 29.
The A/D conversion unit 21 digitizes the image data captured by the cameras 5a and 5b and stores it in the image memory 22 as multi-value image data. The calculation unit 23 performs the positioning process and the optimum parameter acquisition process based on the image data stored in the image memory 22. The work memory 24 stores processing data generated while the calculation unit 23 works. The parameter storage unit 25 stores the template parameters, the template data, and the search parameters. The control unit 26 controls the cameras 5a and 5b. The parallel calculation unit 27 performs speed-critical processing, such as filter operations, in parallel with the processing in the calculation unit 23. The interface 28 exchanges signals with the machine control device 11. The D/A conversion unit 29 converts the image data stored in the image memory 22 back to analog and displays the images captured by the cameras 5a and 5b on the monitor 7.

FIG. 2 is a diagram showing a task configuration in the image processing apparatus 12.
As shown in FIG. 2, the image processing apparatus 12 includes a command analysis task 12a, an image input task 12b, a recognition execution task 12c, and a template adjustment task 12d.
The command analysis task 12a receives commands from the machine control device 11 and returns responses, such as positioning results, to it. The command analysis task 12a executes commands that do not require parallel processing.

When an execution command for tact-critical processing such as component positioning is received from the machine control device 11, the command analysis task 12a issues execution requests to the image input task 12b and the recognition execution task 12c. Thus, while the recognition execution task 12c runs the positioning process, the image input task 12b can capture the next component image, so positioning proceeds in parallel with image input.
Further, when the command analysis task 12a receives a component lot switch notification from the machine control device 11, it issues an execution request to the template adjustment task 12d, and the optimum parameter acquisition process runs in the background.

Next, the overall flow of the optimum parameter acquisition process executed by the image processing device 12 will be described.
When the image processing device 12 receives a command notifying it of a component lot switch from the machine control device 11, the command analysis task 12a wakes up the template adjustment task 12d (detection means), measures the accuracy and tact time of the positioning process, and compares them with the values held in the work memory 24. Here, a search evaluation image is acquired by imaging the electronic component 2, and accuracy and tact time are measured by running the positioning process on it using the template data and search parameters stored in the parameter storage unit 25 at that point in time.

If the measured positioning accuracy or tact time has deteriorated beyond a certain allowable range relative to the values held in the work memory 24, the template data needs adjustment (the optimum parameter acquisition process), so the command analysis task 12a turns on a processing flag provided in the work memory 24 (determination means). The template adjustment task 12d then temporarily goes dormant.
When the image input task 12b detects that the processing flag is on, it interrupts the production imaging process, captures an image for template data adjustment (a template image), and stores it in the image memory 22.

When the image for template data adjustment has been captured by the image input task 12b, the command analysis task 12a wakes up the template adjustment task 12d again and starts the optimum parameter acquisition process. The priority of the template adjustment task 12d is set according to the degree of deterioration of accuracy and tact time: when the urgency (degree of deterioration) is low, the task runs in idle time so as not to reduce production efficiency; when the urgency is high, the template is readjusted immediately so as not to compromise production stability.
When the optimum parameter acquisition process is completed, the command analysis task 12a re-measures the accuracy and tact time of the positioning process, updates the values held in the work memory 24, and turns off the processing flag. The template adjustment task 12d then goes dormant.

Hereinafter, the optimum parameter acquisition process executed by the image processing apparatus 12 will be described in detail.
FIG. 3 is a flowchart showing an optimal parameter acquisition processing procedure.
The optimum parameter acquisition process starts when the image input task 12b detects that the processing flag is on. In the present embodiment, it is assumed that the optimum parameters are acquired (the template data adjusted) from a single template image.

  First, in step S1, the image processing device 12 controls the camera 5a or 5b designated by the machine control device 11, and takes an image of the target object (electronic component) 2 in the reference posture. The captured image is digitized by the A / D converter 21 and stored as template image data in the image memory 22.

In step S2, the image processing device 12 initializes the template parameters, stores them in the parameter storage unit 25, and proceeds to step S3. In this embodiment, the template parameters are the minimum density value (the minimum density regarded as belonging to the object), the minimum edge strength (the minimum strength regarded as an edge), the edge scale (edge gradient width, i.e., filter size), and the maximum number of edge points (the number of edge points registered in the template data). The initial values are, for example, minimum density value = 80, minimum edge strength = 32, edge scale = 2, and maximum number of edge points = 1024.
In step S3, the image processing device 12 obtains, from the template image data stored in the image memory 22, a threshold that separates the background region from the object region using the known discriminant analysis method (Otsu's method), and adopts it as the minimum density value.
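The discriminant analysis method is Otsu's method: choose the threshold that maximizes the between-class variance of the background and object pixels. A self-contained NumPy sketch (the synthetic image is only for demonstration):

```python
import numpy as np

def discriminant_analysis_threshold(img):
    """Otsu's discriminant analysis over a 256-bin histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class probability of background
    mu = np.cumsum(p * np.arange(256))      # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b, nan=0.0, posinf=0.0)
    return int(np.argmax(sigma_b))          # between-class variance maximum

rng = np.random.default_rng(0)
img = rng.normal(40, 10, (64, 64)).clip(0, 255).astype(np.uint8)   # background
img[16:48, 16:48] = rng.normal(160, 12, (32, 32)).clip(0, 255).astype(np.uint8)
print("minimum density value:", discriminant_analysis_threshold(img))
```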

Next, in step S4, the image processing device 12 binarizes the template image data using the minimum density value acquired in step S3. Dilation is then applied according to the edge detection filter size, and the minimum density value is adjusted so that the edge gradient interval falls on the object region side. The initial value stored in the parameter storage unit 25 is updated with the adjusted minimum density value.
Next, in step S5, the image processing device 12 filters the template image data stored in the image memory 22 and proceeds to step S6. To obtain the most accurate result, the filtering uses the largest edge scale value (2.5) among the preset edge scale candidate values.

In step S6, the image processing device 12 uses the binarized data obtained in step S4 to statistically process the filter output obtained in step S5 separately for the background region and the object region, obtaining the average, variance, minimum, and maximum of each density distribution. Using these values, the process shown in FIG. 4 is performed to obtain the minimum edge strength MinPwr.
As shown in FIG. 4, first, in step S6a, the background maximum edge strength a is set as the minimum edge strength MinPwr, and the process proceeds to step S6b, where it is determined whether the background maximum edge strength a is smaller than the object minimum edge strength b. If a < b, the process proceeds to step S6c, where the average (a + b) / 2 of the background maximum edge strength a and the object minimum edge strength b is set as the final minimum edge strength MinPwr, and the process ends.

If it is determined in step S6b that a ≥ b, the process proceeds to step S6d, where it is determined whether the background maximum edge strength a is smaller than the object average edge strength c. If a < c, the process proceeds to step S6e; if a ≥ c, the minimum edge strength MinPwr remains the background maximum edge strength a set in step S6a, and the process ends.

In step S6e, the object minimum edge strength b is recalculated from the following equation, and the process proceeds to step S6f.
b = c − d × k   (1)
Here, d is the object dispersion value and k is a correction coefficient.
In step S6f, it is determined whether the background maximum edge strength a is smaller than the recalculated object minimum edge strength b. If a < b, the process proceeds to step S6g, where the average (a + b) / 2 of the background maximum edge strength a and the object minimum edge strength b is set as the final minimum edge strength MinPwr, and the process ends.

On the other hand, if it is determined in step S6f that a ≥ b, the minimum edge strength MinPwr remains the background maximum edge strength a set in step S6a, and the process ends.
That is, this minimum edge strength acquisition process rests on the idea that if a threshold (the minimum edge strength MinPwr) is placed between the background maximum edge strength a and the object minimum edge strength b, the object edges can be detected while background responses are rejected. In this way, in this embodiment, the edge region of the template image is extracted and the optimum edge detection parameter (minimum edge strength) is set based on the distribution of edge characteristic values such as contrast, edge strength, and edge scale.
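Steps S6a to S6g reduce to the following branch logic; this is a direct transcription of the flowchart described above, with equation (1) written as b = c − d × k as reconstructed:

```python
def minimum_edge_strength(bg_max_a, obj_min_b, obj_mean_c, obj_var_d, k=1.0):
    """Steps S6a-S6g of FIG. 4: a/b/c/d are the background maximum,
    object minimum, object mean, and object dispersion of the filter
    output; k is the correction coefficient of equation (1)."""
    min_pwr = bg_max_a                        # S6a: provisional MinPwr
    if bg_max_a < obj_min_b:                  # S6b -> S6c
        return (bg_max_a + obj_min_b) / 2.0
    if bg_max_a >= obj_mean_c:                # S6d: no clean separation
        return min_pwr
    b = obj_mean_c - obj_var_d * k            # S6e: equation (1)
    if bg_max_a < b:                          # S6f -> S6g
        return (bg_max_a + b) / 2.0
    return min_pwr

print(minimum_edge_strength(bg_max_a=20, obj_min_b=35, obj_mean_c=60, obj_var_d=8))  # 27.5
print(minimum_edge_strength(bg_max_a=40, obj_min_b=35, obj_mean_c=60, obj_var_d=8))  # 46.0
```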
Returning to FIG. 3, in step S7, the image processing device 12 applies threshold processing with the minimum edge strength MinPwr set in step S6 to the filter output obtained in step S5, and detects edge points.

Next, in step S8, the image processing device 12 determines whether adjustment of the edge scale value is complete. Here, for the edge points detected in step S7, filter calculations are performed with all preset edge scale candidate values, and it is determined whether every filter result has been evaluated (edge evaluation). The edge scale candidate values are, for example, 1, 2, and 2.5. If not all edge evaluations have been completed, the process proceeds to step S9; if they have, it proceeds to step S12 described later.

In step S9, the image processing device 12 switches the edge scale value to the next smaller candidate, without performing the filter operation, and proceeds to step S10. On the first pass, the edge scale value is set to the maximum candidate (2.5), which becomes the edge scale reference value.
In step S10, the image processing device 12 performs the filter operation on the edge points detected in step S7 using the edge scale value set in step S9.

Next, in step S11, the image processing device 12 performs edge evaluation on the filter output obtained in step S10 and returns to step S8. Specifically, at each edge point, the filter output from step S10 is compared with the filter output at the edge scale reference value, and the edge scale with the larger output (the stronger filter response) is selected as the optimum edge scale for that edge point. The edge scale selected by the most edge points is then taken as the optimum edge scale value for the template parameter and becomes the new edge scale reference value, and the edge scale value stored in the parameter storage unit 25 is updated to it. Since the raw filter output grows with the filter size (edge scale), the outputs of the different filter sizes are normalized for the edge evaluation so that their magnitudes can be compared across sizes.
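A sketch of this per-point scale vote, using Sobel kernel sizes 3/5/7 as stand-ins for the edge scale candidates 1/2/2.5 and normalizing each response by the sum of the kernel's absolute weights (one plausible normalization; the patent does not specify the exact one), with a hypothetical image and edge points:

```python
import numpy as np
import cv2

def normalized_response(img, pts, ksize):
    """Gradient magnitude at the given (x, y) edge points, normalized so
    that responses of different kernel sizes are comparable."""
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=ksize)
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=ksize)
    kx, ky = cv2.getDerivKernels(1, 0, ksize)
    scale = np.abs(np.outer(ky, kx)).sum()
    return np.hypot(gx, gy)[pts[:, 1], pts[:, 0]] / scale

img = cv2.imread("component.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image
pts = np.array([[30, 40], [31, 40], [32, 41]])            # edge points from S7
sizes = (3, 5, 7)
responses = np.stack([normalized_response(img, pts, k) for k in sizes])
best_per_point = responses.argmax(axis=0)                 # vote per edge point
print("winning edge scale:", sizes[np.bincount(best_per_point).argmax()])
```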

With the above processing, acquisition of the optimum template parameter values for positioning accuracy is complete. In other words, at this point the edge detection accuracy parameters among the template parameters (minimum density value, minimum edge strength, edge scale value) give the best edge detection accuracy within the preset candidate ranges and satisfy the required accuracy specified by the operator.

The following process therefore acquires the optimum template parameter value related to tact time. The template parameter concerned (the maximum number of edge points) is changed in the direction that shortens the tact time, and the search process is run each time to evaluate accuracy and tact time; the parameter value at the shortest tact time that can still maintain the required accuracy is acquired as the optimum value.

In step S12, the image processing device 12 creates template data using the template parameter values stored in the parameter storage unit 25 at this point, and proceeds to step S13. Edges are detected in the template image using the template parameter values, a list of edge point coordinates and edge gradient vectors is created, and this becomes the template data. Thinning is applied so that the maximum number of edge points is not exceeded, determining which edge points are registered in the template data. The created template data is stored in the parameter storage unit 25.

In step S13, the image processing device 12 generates a search evaluation image and stores it in the image memory 22. Image data is generated by adding arbitrary variations (translation, rotation, scaling) to the template image data stored in the image memory 22, and this becomes the search evaluation image. The variation applied here serves as the true value of the positioning result and is used as the absolute accuracy evaluation value of the positioning process.
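Generating an evaluation image with a known, recoverable variation might look like the following; the file name and variation amounts are illustrative, and cv2.getRotationMatrix2D folds the rotation and scale into one affine matrix to which the translation is then added:

```python
import cv2

def make_evaluation_image(template, dx, dy, angle_deg, scale):
    """Warp the template by a known translation/rotation/scale; the applied
    amounts are the 'true values' against which the search result is scored."""
    h, w = template.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, scale)
    m[:, 2] += (dx, dy)                        # append the translation
    return cv2.warpAffine(template, m, (w, h))

template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
truth = dict(dx=3.5, dy=-2.0, angle_deg=1.2, scale=1.01)
eval_img = make_evaluation_image(template, **truth)
# accuracy of a search run = deviation of its result from `truth`
```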

Next, in step S14, the image processing apparatus 12 performs a search process on the search evaluation image generated in step S13, using the template data created in step S12.
Here, the search processing procedure will be briefly described.
The search process in this embodiment is a pattern matching process using a pyramid structure search, which detects the position of the object in the captured image based on the search parameters stored in the parameter storage unit 25. The search parameters are the minimum density value, the minimum edge strength, the edge scale value, and the pyramid start layer.

In the pyramid structure search, edges are first detected in the search target image using the search parameters, and the image is compressed at a compression rate corresponding to the pyramid start layer. Rough processing (a coarse search) is then performed on the compressed image to find the approximate coordinates of the object. Detailed processing (a fine search) is then performed around the detected coordinates using the template data to find a more accurate position; this fine search increases in accuracy step by step as the hierarchy descends. Because the coarse search first narrows down the approximate position and the detailed search then runs only on that range, the detailed search does not have to cover the whole image, and the total processing tact time is shortened.
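The coarse-to-fine flow can be sketched as follows. Normalized cross-correlation (cv2.matchTemplate) stands in for the device's edge-based matching score; only the pyramid control flow, not the scoring, reflects the patent:

```python
import cv2

def pyramid_search(image, template, start_layer=2):
    """Coarse search on the most compressed layer, then a local fine
    search on each finer layer around the propagated position."""
    imgs, tmps = [image], [template]
    for _ in range(start_layer):               # build the pyramid
        imgs.append(cv2.pyrDown(imgs[-1]))
        tmps.append(cv2.pyrDown(tmps[-1]))
    res = cv2.matchTemplate(imgs[-1], tmps[-1], cv2.TM_CCOEFF_NORMED)
    _, _, _, (x, y) = cv2.minMaxLoc(res)       # coarse position
    for level in range(start_layer - 1, -1, -1):
        x, y = 2 * x, 2 * y                    # propagate to finer layer
        th, tw = tmps[level].shape[:2]
        x0, y0 = max(x - 4, 0), max(y - 4, 0)  # small search window only
        roi = imgs[level][y0:y0 + th + 8, x0:x0 + tw + 8]
        res = cv2.matchTemplate(roi, tmps[level], cv2.TM_CCOEFF_NORMED)
        _, _, _, (rx, ry) = cv2.minMaxLoc(res)
        x, y = x0 + rx, y0 + ry
    return x, y
```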

That is, in step S14, the search process is performed with the search evaluation image generated in step S13 as the search target image. The search parameters used here are set from the template parameters acquired at this point according to the following rules.
(1) If a template parameter value is the default value, the default value is used.
(2) Otherwise, the search parameter is set looser than the template parameter (for example, one level lower among the candidate values of each parameter shown in Table 1).
(3) The pyramid start layer is set according to the external size of the object.

In step S15, the image processing device 12 obtains the tact time and the accuracy (the deviation between the positioning result and the true value, i.e., the variation applied in step S13) from the search result of step S14, and stores them in the work memory 24 as reference values.
Next, in step S16, it is determined whether to end the tact time adjustment process. The process is ended when the tact time held in the work memory 24 has reached the required tact time, when the tact time did not shorten from the previous value, or when the number of adjustments exceeds a predetermined count. If the tact time adjustment is to continue, the process proceeds to step S17; if it is to end, the process proceeds to step S23 described later.

In step S17, the image processing device 12 optimizes the maximum number of edge points. The tact time of the positioning process depends on the number of edge points registered in the template data, so the tact time can be shortened by reducing the number of edge points, at the cost of accuracy.
In this embodiment, the number of edge points registered in the template data is reduced step by step, and the positioning accuracy is checked each time, to find the number of edge points at the shortest tact time that can still maintain the required accuracy; that number is taken as the optimum number of edge points. The reduction is done by thinning, so that the remaining edge points stay evenly spaced along the edge line; the thinning count is increased in the order 1, 2, 3, ..., gradually reducing the number of edge points.

When template data is created from the template parameters, thinning is performed automatically so that the maximum number of edge points is not exceeded, so the number of edge points can be reduced (the thinning count increased) simply by lowering the maximum number of edge points. In step S17, therefore, the maximum number of edge points is set to a value smaller by a predetermined amount than the value stored in the parameter storage unit 25.
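One simple way to realize such thinning is to keep evenly spaced survivors along an ordered edge line; the patent increases a skip count 1, 2, 3, ... instead, but the effect, evenly distributed remaining points, is the same in spirit:

```python
import numpy as np

def thin_edge_points(edge_points, max_points):
    """Keep at most max_points, evenly spaced along the (ordered) edge line."""
    n = len(edge_points)
    if n <= max_points:
        return edge_points
    idx = np.linspace(0, n - 1, max_points).round().astype(int)
    return edge_points[idx]

contour = np.stack([np.arange(2048), np.zeros(2048, int)], axis=1)  # toy edge
for max_pts in (1024, 512, 256):    # smaller template -> shorter tact time
    print(max_pts, "->", len(thin_edge_points(contour, max_pts)), "points kept")
```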

In step S18, the image processing apparatus 12 recreates template data using the template parameter set in step S17, and proceeds to step S19.
In step S19, the image processing apparatus 12 performs a search process on the search evaluation image generated in step S13 using the template data recreated in step S18, and proceeds to step S20.

In step S20, the image processing device 12 obtains the tact time and accuracy from the search result of step S19, proceeds to step S21, and compares the positioning accuracy obtained in step S20 with the reference value held in the work memory 24. If the positioning accuracy has not deteriorated relative to the reference value (the deterioration is within the allowable range), the process proceeds to step S22, the tact time and accuracy reference values are updated, and the process returns to step S16.

On the other hand, if the positioning accuracy is worse than the reference value (the deterioration is outside the allowable range), the previous maximum number of edge points is judged to be the limit at which the required accuracy can be maintained. In this case, the process proceeds to step S23, and the search parameters are initialized: based on the optimum template parameter values obtained so far, the initial values of the search parameters are set using rules (1) to (3) described above.

Note that the standard camera 5a and the high-resolution camera 5b correspond to the imaging means, and the parameter storage unit 25 corresponds to the storage means. In FIG. 3, step S1 corresponds to the template image acquisition means, steps S3 to S5 to the edge region extraction means, step S6 to the edge detection parameter setting means, and step S13 to the evaluation image acquisition means. Steps S3 to S11 correspond to the parameter setting means, and steps S12 to S22 to the parameter updating means.

(Operation)
Next, the operation of the first embodiment will be described.
Now assume that optimum template data and optimum search parameters are stored in the parameter storage unit 25 of the image processing device 12. When the component mounting apparatus 1 mounts an electronic component 2 in this state, the machine control device 11 moves the suction nozzle 3 to the component supply position of an electronic component supply device (not shown) and picks up the electronic component 2. The machine control device 11 then moves the suction nozzle 3 to a predetermined imaging position; the camera 5a or 5b is selected according to the picked-up component, and the suction nozzle 3 is moved so that the electronic component 2 is positioned at the imaging position of the selected camera. When the suction nozzle 3 reaches the imaging position, the machine control device 11 controls the illumination device 4 to irradiate the electronic component 2 with illumination light and transmits to the image processing device 12 a command to execute the positioning process, together with the channel information of the selected camera.

The image processing device 12 receives this command via the interface 28. The command analysis task 12a wakes up the image input task 12b, and the control unit 26 controls the camera to image the electronic component 2 at the imaging position. The captured image of the electronic component 2 is digitized by the A/D conversion unit 21 and stored in the image memory 22; this image data becomes the search target image. At the same time, the image data stored in the image memory 22 is converted back to analog by the D/A conversion unit 29 and displayed on the monitor 7.

When the search target image is ready, the command analysis task 12a wakes up the recognition execution task 12c, and the positioning process for the electronic component 2 is executed by the calculation unit 23 and the parallel calculation unit 27. In this positioning process, a filter matching the edge scale stored as a search parameter in the parameter storage unit 25 is first applied to the search target image, and threshold processing with the minimum edge strength extracts the edge points in the search target image.

  Next, the edge image is compressed at a compression rate corresponding to the pyramid start layer which is a search parameter, and a rough search is performed using the template data stored in the parameter storage unit 25. In this coarse search, a rough position and orientation of an object are acquired from a search target image using known search techniques such as edge search, normalized correlation search, generalized Hough transform, and geometric hashing.

Subsequently, a fine search is performed on the search area obtained by the coarse search to detect the detailed position of the object. In this fine search, the position and orientation obtained in the coarse search serve as the starting position and orientation, and a template (pattern model) created from the template data is overlaid on the search target image; translation, rotation, and scale variation are applied repeatedly to find the position where the degree of coincidence with the object in the search target image is highest. From the resulting movement of the template, the detailed position of the object in the search target image is obtained. This yields the positioning result (the translation amount, rotation amount, and scale variation of the object in the search target image with respect to the template), from which the suction position deviation and suction angle deviation are known, making mounting positioning possible.
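A common coincidence measure for this kind of edge-gradient matching is the mean cosine of the gradient-direction difference at the template edge points (1.0 for a perfect overlay); the patent does not fix the exact score, so treat this as one illustrative formulation:

```python
import numpy as np

def coincidence(template_angles, image_angles):
    """Mean cosine of the gradient-direction difference; the fine search
    repeats translate/rotate/scale moves, keeping the pose with the
    highest value of this score."""
    return float(np.mean(np.cos(template_angles - image_angles)))

rng = np.random.default_rng(1)
t = rng.uniform(-np.pi, np.pi, 500)                     # template gradients
print(coincidence(t, t + rng.normal(0, 0.05, 500)))     # near-perfect overlay
print(coincidence(t, rng.uniform(-np.pi, np.pi, 500)))  # wrong pose
```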

When the positioning process is completed, the recognition execution task 12c goes to sleep, and the command analysis task 12a transmits the positioning result to the machine control device 11 through the interface 28 as the response.
When the machine control device 11 receives the positioning processing result from the image processing device 12, it adjusts the component mounting position according to the positioning processing result and mounts the electronic component 2 on the substrate. Thereby, the electronic component 2 can be appropriately mounted at a desired position.
Thereafter, when the part lot is switched, the machine control device 11 notifies the image processing device 12 of this. At this time, the machine control device 11 also transmits to the image processing device 12 the required accuracy of the positioning process designated by the operator and the required tact time.

When the image processing device 12 receives the component lot switch notification from the machine control device 11 via the interface 28, the command analysis task 12a wakes up the image input task 12b, and the control unit 26 controls the camera to image the electronic component 2 in the reference posture. The captured image is digitized by the A/D conversion unit 21 and stored in the image memory 22. Next, the command analysis task 12a wakes up the template adjustment task 12d, generates image data by adding arbitrary variations (translation, rotation, scaling) to the image data stored in the image memory 22, and stores it in the work memory 24. This image data becomes the search evaluation image used to judge the necessity of parameter adjustment by measuring the accuracy and tact time of the positioning process. The search process is then performed on the generated search evaluation image using the template data stored in the parameter storage unit 25, and a positioning result is obtained.

At this time, the accuracy of the positioning process is obtained by comparing the obtained positioning result with the true value (the variation used when generating the search evaluation image), and the tact time of the positioning process is measured at the same time. These values are compared with the values held in the work memory 24; if they have deteriorated beyond the allowable range, it is determined that the template data and search parameters stored in the parameter storage unit 25 should be adjusted, and the optimum parameter acquisition process is executed.

In the optimum parameter acquisition process, the minimum density value, the minimum edge strength, the edge scale, and the maximum number of edge points, the parameters for creating template data, are adjusted to optimum values. In the present embodiment, a template image, the image from which the template is created, is captured (step S1 in FIG. 3), and this template image is processed to set the template parameters affecting accuracy (minimum density value, minimum edge strength, edge scale) to the optimum values giving the best accuracy (steps S3 to S11).

Then, using the template data created from these template parameters, the search process (a search test) is run on the search evaluation image generated from the template image to determine the positioning accuracy, which is retained as the reference value (steps S12 to S15). Next, the template parameter related to the tact time (the maximum number of edge points) is changed step by step from its initial value in the direction that improves the tact time, the accuracy is evaluated after each adjustment, and the limit value that can still satisfy the reference value is acquired as the optimum parameter value (steps S16 to S22).

In conventional template parameter adjustment, by contrast, the operator had to adjust the parameters to the actual usage environment, for example by entering parameter values numerically or by marking edges valid or invalid with a window. Fine per-object parameter adjustment therefore required knowledge of the positioning algorithm and its behavior, and it was very difficult to achieve not only optimal high-accuracy, high-speed settings but also fine-grained adjustment matched to the application.

In the present embodiment, on the other hand, template parameters satisfying the required accuracy and required tact time are acquired automatically, so delicate parameter adjustment is performed appropriately. Because the operator can specify the required accuracy and the required tact time, adjustment can be matched to the application, for example trading some accuracy for higher speed.
In addition, the necessity of parameter adjustment is judged at each component lot switch, and the parameters are adjusted as needed, so when the component material, color, or size changes with the lot, the parameters are updated appropriately. Deterioration of accuracy or tact time is thus detected and repaired automatically.

(Effects)
As described above, in the first embodiment, the parameters for creating template data can be set so that the template data not only enables position detection but is also optimized for characteristics such as position detection accuracy, tact time, and robustness when used in position detection. Moreover, the parameters are adjusted automatically according to the required accuracy and required tact time specified by the operator, so appropriate adjustment is possible even for operators with no know-how about the shape-based matching algorithm.

At this time, a search evaluation image is generated internally by varying the template image, and parameter adjustment uses that generated image, so parameter adjustment requires capturing only a single template image. The invention can therefore be applied to systems with little memory for storing captured images and to systems in which real evaluation images cannot be prepared. Moreover, since the variation added to the template image serves as the true value of the positioning result and hence as an absolute accuracy evaluation value, accuracy evaluation of the shape-based matching process is easy.

Furthermore, when adjusting the parameters, they are first set so that the positioning accuracy satisfies the required accuracy. The edge region of the template image is extracted, the distribution of edge characteristic values such as contrast, edge strength, and edge scale is measured, and the optimum edge detection parameter (the threshold used in edge detection threshold processing) is set. Since the parameters can thus be set for the best edge detection accuracy, parameters satisfying the required accuracy can be set reliably.

Also, when adjusting the parameters, the maximum number of edge points is reduced step by step, the positioning accuracy is evaluated after each adjustment, and the number of edge points needed at the shortest tact time that can maintain the required accuracy is acquired. Because the maximum number of edge points is reduced step by step, the parameter changes monotonically in the direction that shortens the tact time; and because the accuracy is evaluated at every step while the optimum value is sought, finer parameter adjustment becomes possible.

In addition, since a change of component lot is detected during production and the necessity of parameter adjustment is judged, the parameters can be readjusted automatically and updated to the optimum values whenever a lot switch changes the material or color of the components, or subtle size fluctuations due to manufacturing errors create a difference from the teaching-time template data. This prevents the slight accuracy degradation and tact time increase that a lot switch would otherwise cause.
Such a component mounting apparatus can therefore always mount components with high accuracy.

(Second Embodiment)
Next, a second embodiment of the present invention will be described.
Whereas the first embodiment performed the optimum parameter acquisition process using a search evaluation image generated from the template image, the second embodiment performs it using a plurality of actually captured search evaluation images. When this is used to create a template for the first time before production, the search evaluation images are prepared to cover the range of posture variation and illumination variation that may occur during production.

(Constitution)
In addition to the required positioning accuracy and the required tact time, the machine control device 11 of this embodiment transmits to the image processing device 12, as command parameters for template data adjustment, the imaging conditions for the search evaluation images, such as the imaging posture (evaluation posture), the number of images to capture, and the illumination brightness. These imaging conditions can be specified by the operator via the man-machine interface 6.
FIG. 5 is a flowchart illustrating a template creation image capturing process procedure according to the second embodiment.

First, in step S31, the image processing device 12 controls the camera 5a or 5b designated by the machine control device 11, and takes an image of the target object (electronic component) 2 in the reference posture. The captured image is digitized by the A / D converter 21 and stored as template image data in the image memory 22.
In step S32, the image processing device 12 determines whether to end the search evaluation image capturing process. It is determined whether search evaluation images have been captured under all the imaging conditions specified by the machine control device 11; if not, the process proceeds to step S33, otherwise to step S34.

In step S <b> 33, the image processing apparatus 12 moves the suction nozzle 3 that sucks the object (electronic component 2) to the machine control apparatus 11 to one evaluation posture that is an imaging condition, and performs illumination in the evaluation posture. Send a command to specify the brightness of the. Upon receiving a response (search evaluation image capturing command) from the machine control device 11, the image processing device 12 controls the designated camera 5a or 5b to image the object (electronic component 2) in the evaluation posture. . At this time, the designated number of images are taken. When imaging of the search evaluation image in one evaluation posture is completed, the process proceeds to step S32. In this way, by repeating the processes of steps S32 and S33, the search evaluation image is captured under all imaging conditions.
In step S34, the image processing device 12 creates a list of the captured search evaluation images and stores it in the image memory 22 as attached data.

Next, the optimum parameter acquisition process in this embodiment will be described.
FIG. 6 is a flowchart illustrating an optimal parameter acquisition processing procedure according to the second embodiment.
First, in step S41, the image processing apparatus 12 performs initial setting of template parameters, and proceeds to step S42. Here, the template parameters are set to the default values of minimum density value = 80, minimum edge strength = 32, edge scale = 2, and number of edge points = 1024. The set template parameters are stored in the work memory 24.
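As a concrete picture of this initialization, the defaults can be held in a small parameter record, as in the minimal Python sketch below; the field names and the container itself are assumptions for illustration, and only the default values come from the text.

```python
# Sketch of the step S41 defaults; field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TemplateParams:
    min_density: int = 80        # minimum density value
    min_edge_strength: int = 32  # minimum edge strength
    edge_scale: int = 2          # edge scale
    max_edge_points: int = 1024  # number of edge points

params = TemplateParams()        # stored in the work memory 24
```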

In step S41, the template parameters may instead be initialized by performing the optimum parameter acquisition process shown in FIG. In that case, the template parameters start from useful values, so they can be converged to optimum values relatively easily in the subsequent processing.
In step S42, the image processing apparatus 12 creates template data using the template parameters stored in the work memory 24, and proceeds to step S43.

In step S43, the image processing apparatus 12 determines whether or not search processing has been performed for all search evaluation images captured in the processing illustrated in FIG. If all search processes have not been completed, the process proceeds to step S44. If all search processes have been completed, the process proceeds to step S46, which will be described later.
In step S44, using the template data created in step S42, the image processing device 12 performs search processing for each of a plurality of search evaluation images obtained by imaging the object in the same evaluation posture, and obtains a positioning processing result.

In step S45, the image processing device 12 acquires statistical processing data for the positioning processing results obtained in step S44, and the process returns to step S43. The positioning result data subjected to statistical processing are the center coordinates (x, y) and the farthest point coordinates (x, y) of the template, for each of which an average value and a standard deviation (3σ) are obtained. As for the tact time, the total processing tact time, edge detection tact time, coarse search tact time, and fine search tact time are measured, and the average, minimum, and maximum tact times are obtained for each. The positioning processing result itself (the translation amount, rotation amount, and scale variation amount of the object in the search evaluation image relative to the template) is also obtained here.
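The statistics gathered in this step can be sketched as follows; the function name and data layout are assumptions, but the quantities computed (mean and 3σ of the coordinates, min/avg/max of the tact times) follow the description above.

```python
import numpy as np

def positioning_stats(centers, farthest, tact_times):
    """Repeatability statistics for one evaluation posture.

    centers, farthest: (N, 2) arrays of detected (x, y) coordinates.
    tact_times: (N,) array of one measured tact time per search run.
    """
    stats = {}
    for name, xy in (("center", np.asarray(centers, float)),
                     ("farthest", np.asarray(farthest, float))):
        stats[name + "_mean"] = xy.mean(axis=0)
        stats[name + "_3sigma"] = 3.0 * xy.std(axis=0, ddof=1)
    t = np.asarray(tact_times, float)
    stats["tact_avg"], stats["tact_min"], stats["tact_max"] = t.mean(), t.min(), t.max()
    return stats
```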
In step S46, the image processing device 12 evaluates whether the standard deviation (3σ) of the template center coordinates and the standard deviation (3σ) of the farthest point coordinates obtained in step S45 are within the allowable range predetermined according to the required accuracy.

In step S47, if it was determined in step S46 that each standard deviation (3σ) is within the allowable range, the image processing device 12 judges that no adjustment of the parameters relating to the accuracy of the template data (accuracy parameters) is necessary, and the process proceeds to step S52 described later. If, on the other hand, it was determined to be outside the allowable range, the process proceeds to step S48 to adjust the accuracy parameters of the template data.

The accuracy parameters are adjusted by changing the minimum edge strength and edge scale of the template parameters and the minimum edge strength and edge scale of the search parameters. In the present embodiment, these parameters are changed one step at a time, either tightened or relaxed, in the direction that improves accuracy, and the accuracy is evaluated by performing a search process after each change. This process is repeated (retried) until the required accuracy is satisfied.
In step S48, the image processing apparatus 12 determines whether or not the number of retries for accuracy parameter adjustment exceeds a preset maximum number. If the maximum number of times has not been exceeded, the process proceeds to step S49, and accuracy parameter adjustment processing is performed.

FIG. 7 is a flowchart showing the accuracy parameter adjustment processing procedure executed in step S49.
First, in step S49a, the two images having the largest difference according to the statistical processing results are extracted from the plural search evaluation image data. This is done separately for each group of images captured under the same imaging conditions (imaging posture, illumination brightness, and so on).

Next, in step S49b, edge detection is performed for each of the two images extracted in step S49a, and the process proceeds to step S49c.
In step S49c, the edges of the two search evaluation images obtained in step S49b are first converted into the template image coordinate system based on the positioning processing results (translation amount, rotation amount, scale variation amount) obtained in step S45. Then, as shown in FIG. 8, the edges (a) and (b) of the two converted search evaluation images and the edge (c) of the template image are superimposed to extract non-matching portions (unstable edges). A portion where the degree of coincidence of the edges of the three superimposed images is lower than a predetermined determination threshold is extracted as an unstable edge. Finally, a coordinate value list of unstable edge points (unstable edge list) is created for the three data sets in total, namely the two search evaluation images and the template image. At this time, the edge detection parameter values (minimum edge strength, edge scale value) are associated with the edge points in the list.
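A minimal sketch of this overlay check is given below. The coordinate transform undoes the positioning result as described; the nearest-neighbor test standing in for the "degree of coincidence", and the tolerance value, are assumptions, since the text does not specify the metric.

```python
import numpy as np

def to_template_frame(pts, dx, dy, theta, scale):
    """Map edge points from a search evaluation image into the template
    coordinate system by undoing translation, rotation, and scaling."""
    c, s = np.cos(-theta), np.sin(-theta)
    p = (np.asarray(pts, float) - [dx, dy]) / scale
    return p @ np.array([[c, -s], [s, c]]).T

def unstable_edges(tmpl_pts, img_a_pts, img_b_pts, tol=1.5):
    """Collect edge points that fail to coincide across all three edge sets.

    A point counts as stable when each of the other two sets has an edge
    point within `tol` pixels; `tol` stands in for the determination
    threshold in the text and is an assumed value.
    """
    sets = [np.asarray(p, float) for p in (tmpl_pts, img_a_pts, img_b_pts)]
    unstable = []
    for i, pts in enumerate(sets):
        others = [sets[j] for j in range(3) if j != i]
        for p in pts:
            if any(np.linalg.norm(o - p, axis=1).min() > tol for o in others):
                unstable.append(p)
    return np.asarray(unstable)
```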

Next, in step S49d, it is determined whether or not an unstable edge (difference edge) has been extracted as a result of the image overlay processing in step S49c. If an unstable edge has been extracted, the process proceeds to step S49e. If an unstable edge has not been extracted, the accuracy parameter adjustment process ends.
In step S49e, the minimum edge strength, which is an accuracy parameter, is adjusted so that the unstable edges are eliminated. The minimum edge strength of the template parameters is increased in order to delete the difference edges from the template. The minimum edge strength of the search parameters is decreased so that more edge points are extracted, improving the degree of coincidence with the edges of the template image and thereby eliminating the unstable edges. The minimum edge strength is adjusted by selecting, from the candidate values shown in Table 1, a value one step stricter or one step more lenient than the edge detection parameter value stored in the unstable edge list.

Next, in step S49f, the edge scale value, which is an accuracy parameter, is adjusted so that the unstable edges are eliminated, and the accuracy parameter adjustment process ends. The edge scale value of the template parameters is increased in order to delete the difference edges from the template. The edge scale value of the search parameters is decreased so that more edge points are extracted, improving the degree of coincidence with the edges of the template image and thereby eliminating the unstable edges. The edge scale value is likewise adjusted by selecting, from the candidate values in Table 1 above, a value one step stricter or one step more lenient than the edge detection parameter value stored in the unstable edge list.
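The one-step movement through the candidate table can be sketched as below. Table 1 itself is not reproduced in this text, so the candidate values are placeholders; only the one-step-stricter/one-step-more-lenient logic reflects the description.

```python
# Hypothetical Table 1 row for the minimum edge strength (placeholder values).
EDGE_STRENGTH_CANDIDATES = [8, 16, 32, 64, 128]

def step_value(candidates, current, stricter):
    """Move one step along the candidate values, clamping at either end."""
    i = candidates.index(current)
    i = min(i + 1, len(candidates) - 1) if stricter else max(i - 1, 0)
    return candidates[i]

# Tighten the template parameter, relax the search parameter:
tmpl_strength = step_value(EDGE_STRENGTH_CANDIDATES, 32, stricter=True)    # -> 64
search_strength = step_value(EDGE_STRENGTH_CANDIDATES, 32, stricter=False) # -> 16
```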
Returning to FIG. 6, in step S50 the image processing device 12 increments the retry counter that counts accuracy parameter adjustment retries, and the process returns to step S42.

If it is determined in step S48 that the number of accuracy parameter adjustment retries has reached the preset maximum, the process proceeds to step S51: although the accuracy of the search process has not reached the required accuracy, the current parameters are set as the final parameters. Template data is then created based on those parameters, and the optimum parameter acquisition process ends.

In step S52, the image processing device 12 analyzes the statistical data obtained in step S45 and determines, for each of the edge detection tact time, coarse search tact time, and fine search tact time, whether the target tact time has been reached. The target tact time of each process is set by distributing the required tact time according to breakdown ratios chosen in consideration of the positioning algorithm. If all the tact times have reached their targets, it is determined that the optimum parameters have been successfully acquired, and the process proceeds to step S53. In step S53, the image processing device 12 sets the template parameters at that time as the final parameters, creates template data based on them, and ends the optimum parameter acquisition process.
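This distribution of the required tact time can be pictured as follows; the breakdown ratios are illustrative assumptions, since the actual ratios depend on the positioning algorithm and are not given in the text.

```python
# Hypothetical breakdown ratios over the three measured stages.
BREAKDOWN = {"edge_detection": 0.2, "coarse_search": 0.5, "fine_search": 0.3}

def target_tacts(required_tact_ms):
    """Split the required tact time into per-stage target tact times."""
    return {stage: required_tact_ms * r for stage, r in BREAKDOWN.items()}

print(target_tacts(50.0))  # {'edge_detection': 10.0, 'coarse_search': 25.0, 'fine_search': 15.0}
```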

On the other hand, if it is determined in step S52 that at least one of the three tact times has not reached the target tact time, the process proceeds to step S54 to perform tact time parameter adjustment processing.
FIG. 9 is a flowchart showing the tact time parameter adjustment processing procedure executed in step S54.

First, in step S54a, it is determined whether the edge detection tact time has reached its target. If not, it is determined that the edge detection tact time needs to be adjusted, and the process proceeds to step S54b. If the target has been reached, it is determined that adjustment of the edge detection tact time is unnecessary, and the process proceeds to step S54c described later.

In step S54b, the edge detection tact time is adjusted. The edge detection tact time depends on the filter size and the number of edge candidate points. Therefore, the edge scale value of the template parameters is decreased and the minimum edge strength of the template parameters is increased, adjusting the edge detection tact time in the direction that shortens it.
Each parameter is adjusted by selecting, from the candidate values in Table 1 described above, a value one step stricter or one step more lenient than the current value. The search parameters are adjusted so as not to become stricter than the template parameters.

Next, in step S54c, it is determined whether the coarse search tact time has reached its target. If not, it is determined that the coarse search tact time needs to be adjusted, and the process proceeds to step S54d. If the target has been reached, it is determined that adjustment of the coarse search tact time is unnecessary, and the process proceeds to step S54e described later.

In step S54d, the coarse search tact time is adjusted. Since the coarse search process searches for the object on the basis of feature points, the coarse search tact time depends on the number of feature points, that is, on the distance between feature points. Therefore, the distance between feature points set as a template parameter is increased to reduce the number of feature points, shortening the coarse search tact time. The distance between feature points is adjusted by adding a predetermined adjustment addition value to the current value, taking care that the distance does not exceed a preset upper limit value.
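A sketch of this clamped increment; the step size and upper limit are illustrative assumptions.

```python
FEATURE_DISTANCE_STEP = 2   # predetermined adjustment addition value (assumed)
FEATURE_DISTANCE_MAX = 32   # preset upper limit (assumed)

def widen_feature_distance(current):
    """Increase the feature-point spacing without exceeding the upper limit."""
    return min(current + FEATURE_DISTANCE_STEP, FEATURE_DISTANCE_MAX)
```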

Next, in step S54e, it is determined whether the fine search tact time has reached its target. If not, it is determined that the fine search tact time needs to be adjusted, and the process proceeds to step S54f. If the target has been reached, it is determined that adjustment of the fine search tact time is unnecessary, and the tact time parameter adjustment process ends.

In step S54f, the fine search tact time is adjusted, and then the tact time parameter adjustment process ends. The fine search tact time depends on the number of edge points registered in the template data. Therefore, the fine search tact time is shortened by reducing the maximum number of edge points of the template parameters. The maximum number of edge points is adjusted by subtracting a predetermined adjustment subtraction value from the current value. This amounts to thinning out edge points so that the remaining points stay evenly arranged along the edge line, increasing the number of thinned-out points in the order 1, 2, 3, ... to reduce the number of edge points step by step.
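One plausible reading of this thinning scheme is sketched below: keep every k-th point, increasing k until the count fits, so the surviving points remain evenly spaced along the edge line.

```python
def thin_edge_points(points, max_points):
    """Thin an ordered list of edge points down to at most max_points,
    keeping every k-th point so the remainder stays evenly spaced."""
    k = 1
    thinned = list(points)
    while len(thinned) > max_points:
        k += 1
        thinned = list(points)[::k]
    return thinned
```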
Incidentally, in the fine search process, as described above, the template is superimposed on the search target image, and the position at which the degree of coincidence with the object in the search target image is highest is detected by repeatedly translating, rotating, and scaling the template.

When a feature point lies in the vicinity of an edge in the search target image, it receives a force from the image energy (the second-order derivative of the image density data in the x and y directions) and is attracted to the edge. The force received from the image energy is the translation vector of that feature point. As shown in FIG. 10, when the translation vector g of a feature point is decomposed with respect to the straight line connecting the feature point and the center of gravity of the template (the centroid normal), the component along that line is the enlargement/reduction vector h and the component perpendicular to it is the rotation vector v. In the present embodiment, the translation vector g, the rotation vector v, and the enlargement/reduction vector h are each converted into movement amounts (translation amount, rotation angle, and scale variation amount) using predetermined correction coefficients, and the amount by which the template moves in each iteration of the loop processing is calculated from the following equations (2) to (4).
m = Γ · γg · Σg (2)
θ = Γ · γθ · Σv / ΣRd (3)
scl = Γ · γscl · Σh / ΣRd (4)

Here, m is the translation amount (unit: pixels), θ is the rotation angle (unit: rad), scl is the scale variation amount (unit: magnification), and Rd is the distance from the template center to a feature point. Γ is the overall correction coefficient (default value: 0.2), γg is the translation correction coefficient (default value: 1.0), γθ is the rotation correction coefficient (default value: 1.0), and γscl is the enlargement/reduction correction coefficient (default value: 0.007).
Then, the template is repeatedly moved until the energy sum obtained by adding the sizes of the translation vector g, the rotation vector v, and the enlargement / reduction vector h is equal to or less than a predetermined convergence determination threshold value.
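Equations (2) to (4) and the convergence test can be sketched as follows, with the default coefficients from the text; treating v and h as signed per-point magnitudes is an assumption made to keep the sketch one-dimensional.

```python
import numpy as np

# Default correction coefficients from the text.
GAMMA, G_TRANS, G_ROT, G_SCL = 0.2, 1.0, 1.0, 0.007

def move_amounts(g, v, h, rd):
    """g: (N, 2) translation vectors; v, h: (N,) signed magnitudes of the
    rotation and enlargement/reduction components; rd: (N,) center-to-
    feature-point distances."""
    m = GAMMA * G_TRANS * g.sum(axis=0)          # eq. (2): translation (pixels)
    theta = GAMMA * G_ROT * v.sum() / rd.sum()   # eq. (3): rotation angle (rad)
    scl = GAMMA * G_SCL * h.sum() / rd.sum()     # eq. (4): scale variation
    return m, theta, scl

def converged(g, v, h, threshold):
    """Energy sum of all three vector families against the convergence
    determination threshold."""
    energy = np.linalg.norm(g, axis=1).sum() + np.abs(v).sum() + np.abs(h).sum()
    return energy <= threshold
```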

The fine search tact time thus also depends on the number of loops of the fine search process and on the correction coefficients (Γ, γg, γθ, γscl) that convert the movement vectors into movement amounts. Therefore, the fine search tact time is shortened by reducing the number of loops and increasing the correction coefficients. The number of loops and the correction coefficients are adjusted by subtracting from or adding to the current values predetermined adjustment values, which are set according to the application environment, taking care that the number of loops and the correction coefficients do not go beyond their preset limit values.
Returning to FIG. 6, in step S55 the image processing device 12 creates template data using the template parameters set at this point. The created template data is stored in the parameter storage unit 25.

In step S56, the image processing apparatus 12 determines whether or not the search processing has been completed for all search evaluation images using the template data created in step S55. If it is determined that all search processes have not been completed, the process proceeds to step S57. If it is determined that all search processes have been completed, the process proceeds to step S60 described later.
In step S57, the image processing apparatus 12 performs a search process for each of the plurality of search evaluation images obtained by imaging the object in the same evaluation posture using the template data created in step S55, and obtains a positioning process result.

Next, in step S58, the image processing device 12 measures the total processing tact time, edge detection tact time, coarse search tact time, and fine search tact time of the search processing in step S57, and then proceeds to step S59.
In step S59, the image processing apparatus 12 acquires statistical processing data for the positioning process result obtained in step S57, and proceeds to step S56. Here, an average value and a standard deviation (3σ) are obtained for the center coordinates (x, y) and the farthest point coordinates (x, y) of the template. For the four tact times measured in step S58, the average tact time, the minimum tact time, and the maximum tact time are obtained.

In step S60, the image processing device 12 evaluates whether the standard deviation (3σ) of the template center coordinates and the standard deviation (3σ) of the farthest point coordinates obtained in step S59 are within the allowable range predetermined according to the required accuracy.
In step S61, if it was determined in step S60 that each standard deviation (3σ) is within the allowable range, the process returns to step S52 to continue the tact time parameter adjustment. If it was determined to be outside the allowable range, the process proceeds to step S62: on the ground that the accuracy of the search process fell below the required accuracy before the tact time reached its target, the parameters at that point are set as the final parameters. Template data is then created based on those parameters, and the optimum parameter acquisition process ends.

In FIG. 5, step S31 corresponds to the template image acquisition means, and steps S32 to S34 correspond to the evaluation image acquisition means. In FIG. 6, steps S42 to S50 correspond to the parameter setting means, and steps S52 to S61 correspond to the parameter updating means. In FIG. 7, step S49a corresponds to the extraction means, steps S49b and S49c correspond to the unstable edge extraction means, and steps S49d to S49f correspond to the parameter gradual change means.

(Operation)
Next, the operation of the second embodiment will be described.
When the part lot is switched, the machine control device 11 notifies the image processing device 12 of this. At this time, the machine control device 11 also transmits the required accuracy of the positioning process and the required tact time to the image processing device 12.
When the image processing device 12 receives the part lot change notification from the machine control device 11 via the interface 28, it measures the accuracy and tact time of the positioning process as in the first embodiment and compares them with the values held in the work memory 24. If the measured accuracy and tact time have deteriorated beyond the allowable range relative to those held values, the optimum parameter acquisition process is executed to adjust the template data and search parameters stored in the parameter storage unit 25.
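The deterioration check described here can be pictured as follows; the record layout and tolerance figures are illustrative assumptions.

```python
def needs_readjustment(measured, held, acc_allow=0.05, tact_allow=2.0):
    """Trigger the optimum parameter acquisition when accuracy (3-sigma,
    pixels) or tact time (ms) has drifted past its allowance."""
    worse_accuracy = measured["sigma3"] > held["sigma3"] + acc_allow
    worse_tact = measured["tact_ms"] > held["tact_ms"] + tact_allow
    return worse_accuracy or worse_tact
```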

In the optimum parameter acquisition process, the template parameters and search parameters are adjusted to optimum values. In the present embodiment, a template image is captured (step S31 in FIG. 5), and a plurality of search evaluation images are captured under the predetermined imaging conditions (steps S32 and S33). Then, using the template data in its initial state, search processing is performed on the captured search evaluation images, the repetition accuracy is measured, and it is determined whether the positioning accuracy satisfies the required accuracy (steps S43 to S47 in FIG. 6). If the required accuracy is not satisfied, the parameters relating to positioning accuracy (minimum edge strength, edge scale) are adjusted so as to satisfy it (step S49).

Specifically, the two search evaluation images having the largest relative edge variation are extracted from the plurality of search evaluation images (step S49a in FIG. 7), and unstable edges are extracted by superimposing these two images and the template image (steps S49b and S49c). The minimum edge strength and edge scale of the template parameters are then adjusted so that the template edge portions corresponding to the unstable edges are deleted, and the minimum edge strength and edge scale of the search parameters are adjusted so that the unstable edges can be searched stably in the search process (steps S49e and S49f).

When the adjustment of the accuracy-related parameters is completed, the tact-time-related parameters are adjusted next. It is determined whether the tact time satisfies the required tact time (step S52 in FIG. 6); if not, the parameters relating to the tact time of the positioning process (minimum edge strength, edge scale value, distance between feature points, maximum number of edge points, number of loops, and movement amount correction coefficients) are adjusted so as to satisfy it (step S54).

  Specifically, the minimum edge strength and the edge scale value are gradually changed in a direction that shortens the edge detection tact time, and the limit value that can maintain the required accuracy is set as the optimum parameter value. Also, the distance between feature points is gradually changed in a direction that shortens the coarse search tact time, and the limit value that can maintain the required accuracy is set as the optimum parameter value. Furthermore, the maximum number of edge points, the number of loops, and the movement amount correction coefficient are changed in a direction that shortens the fine search tact time, and the limit value that can maintain the required accuracy is set as the optimum parameter value. As described above, the parameter values are adjusted so that the edge detection, the coarse search, and the fine search are performed with the limit tact time that can maintain the required accuracy (steps S55 to S61).

(Effects)
As described above, in the second embodiment the search evaluation images are captured under a plurality of imaging conditions representing the actual variation range of the search target image, and parameter adjustment is performed based on those captured images, so template data with high robustness can be created. The template can therefore be deployed to other machines without readjustment.
Further, when performing parameter adjustment, the parameters are adjusted so that portions whose edge detection results are unstable depending on the imaging conditions, such as edges with weak contrast, are excluded from the search processing targets. Optimum parameters that satisfy the required accuracy can therefore be set.

At this time, since the minimum edge strength among the template parameters is increased or the edge scale value is decreased, unstable edges are prevented from being extracted when template data is created from the template image, so unstable edges can be reliably excluded from the search processing targets. Similarly, since the minimum edge strength among the search parameters is decreased or the edge scale value is increased, unstable edges can be kept from being extracted when edges are extracted from the search target image, so unstable edges can be reliably excluded from the search processing targets.

Furthermore, when performing parameter adjustment, the parameters relating to the tact time of the edge detection process, the coarse search process, and the fine search process can be adjusted in a direction in which the tact time of each process is shortened. Thereby, more optimal parameter adjustment is possible.
At this time, since the minimum edge strength is increased or the edge scale value is decreased, the edge detection tact time can be reliably shortened. Since the distance between feature points is increased, the coarse search tact time can be reliably shortened. And since the maximum number of edge points is reduced, the number of loops of the fine search process is reduced, and the correction coefficients used to convert the movement vectors into movement amounts are increased, the fine search tact time can be reliably shortened.

DESCRIPTION OF SYMBOLS 1 ... Component mounting apparatus, 2 ... Electronic component, 3 ... Suction nozzle, 4 ... Illumination device, 5a ... Standard camera, 5b ... High-resolution camera, 6 ... Man-machine interface, 7 ... Monitor, 11 ... Machine control device, 12 ... Image processing device, 21 ... A/D conversion unit, 22 ... Image memory, 23 ... Arithmetic unit, 24 ... Work memory, 25 ... Parameter storage unit, 26 ... Control unit, 27 ... D/A conversion unit

Claims (6)

  1. A shape-based matching parameter adjustment device that adjusts parameters used for shape-based matching processing, comprising:
    A specifying means for specifying the required accuracy and required tact time of the shape-based matching process;
    Evaluation image acquisition means for acquiring an image of a target object having an arbitrary variation with respect to a reference posture as an evaluation image of the parameter;
    Parameter adjusting means for adjusting the parameter based on the evaluation image acquired by the evaluation image acquiring means;
    The parameter adjusting means includes
    Parameter setting means for automatically setting the parameters so that the accuracy of the shape-based matching process satisfies the required accuracy specified by the user via the specifying means;
    Parameter updating means for, when the tact time resulting from performing the shape-based matching process on the evaluation image using the parameters set by the parameter setting means does not satisfy the required tact time specified by the user via the specifying means, evaluating the accuracy of the shape-based matching process while gradually changing the parameters set by the parameter setting means in a direction that shortens the tact time, and updating the parameters to the limit parameter values at which the accuracy can still maintain the required accuracy;
    Imaging means for imaging the object;
    Template image acquisition means for imaging the object in a reference posture with the imaging means and acquiring the captured image as a template image;
    The evaluation image acquisition means captures, with the imaging means, a plurality of images of the object in postures obtained by adding arbitrary variations to the reference posture, and acquires these as the evaluation images,
    The parameter setting means includes
    Extraction means for extracting two evaluation images having the largest relative edge variation amount from a plurality of evaluation images under the same imaging condition acquired by the evaluation image acquisition means;
    Unstable edge extraction means for extracting unstable edges whose degree of coincidence between the two evaluation images extracted by the extraction means is lower than a predetermined determination threshold;
    and parameter gradual change means for evaluating the accuracy of the shape-based matching process while gradually changing the parameters in a direction that excludes the edge portion corresponding to the unstable edge extracted by the unstable edge extraction means from the targets of the shape-based matching process, and determining parameter values satisfying the required accuracy as the parameters.
  2. The shape-based matching parameter adjustment apparatus according to claim 1, wherein the parameters include template parameters used when creating, from the template image, template data composed of the coordinates of edge points of the object and edge gradient vectors, and search parameters used when detecting the object from the search target image based on the template data,
    and the parameter gradual change means gradually changes the template parameters in a direction that excludes, from the template data, the edge portion corresponding to the unstable edge extracted by the unstable edge extraction means.
  3. The shape-based matching parameter adjustment apparatus according to claim 1 or 2, wherein the parameters include template parameters used when creating, from the template image, template data composed of the coordinates of edge points of the object and edge gradient vectors, and search parameters used when detecting the object from the search target image based on the template data,
    and the parameter gradual change means gradually changes the search parameters in a direction that makes the edge portion corresponding to the unstable edge extracted by the unstable edge extraction means difficult to detect from the search target image.
  4. The shape-based matching parameter adjustment apparatus according to any one of claims 1 to 3, wherein the accuracy of the shape-based matching process is evaluated by a repetition accuracy obtained by statistically processing the results of performing the shape-based matching process on the plurality of evaluation images.
  5. The shape-based matching parameter adjustment apparatus according to claim 1, wherein the shape-based matching process performs a coarse search process on the result of an edge detection process on the search target image and performs a fine search process based on the result of the coarse search process,
    and the parameter updating means evaluates the accuracy of the shape-based matching process while gradually changing the parameters used in each process in a direction that shortens the tact times of the edge detection process, the coarse search process, and the fine search process, and updates the parameters to the limit parameter values at which the accuracy can still maintain the required accuracy.
  6. A component mounting apparatus that sucks an electronic component by a suction nozzle and mounts the electronic component at a predetermined position on a substrate,
    The shape-based matching parameter adjusting device according to any one of claims 1 to 5,
    Storage means for storing parameters used for shape-based matching processing;
    A search target image acquisition means for capturing an image of an electronic component sucked by the suction nozzle and acquiring the image as a search target image;
    A positioning unit that performs a shape-based matching process on the search target image acquired by the search target image acquisition unit using a parameter stored in the storage unit, and performs mounting positioning of the electronic component;
    A detecting means for detecting a change of parts lot;
    A determination means for determining whether or not adjustment of a parameter stored in the storage means is necessary when the detection means detects a change of a part lot;
    The component mounting apparatus, wherein, when the determination means determines that the parameters need to be adjusted, the shape-based matching parameter adjustment device adjusts the parameters.