CN101996404A - Image analysis method and image analysis device - Google Patents


Info

Publication number
CN101996404A
Authority
CN
China
Prior art keywords
unit
error
identifying object
texture
consistent degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2010102631700A
Other languages
Chinese (zh)
Inventor
横瀬仁彦
寺地隆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Juki Corp
Original Assignee
Juki Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juki Corp
Publication of CN101996404A

Abstract

The present invention provides an image analysis method and an image analysis device capable of reliably identifying an object. For each unit area (A3) of a texture area (A2), feature quantities are calculated by a plurality of methods, and feature quantities are calculated in the same way for each unit area (A5) of an extraction area (A4). The feature quantities obtained by the various methods are weighted and combined into a synthetic consistency degree, and from its value each unit area of the extraction area is judged to be either the identification object or the background. From these judgments a boundary line between the identification object and the background is obtained, and the error between this boundary line and the actual boundary line in the photographed image is calculated. The weighting pattern is changed step by step so as to reduce this error, thereby optimizing it, and the identification object is then identified from photographed images according to the optimized weighting pattern.

Description

Image analysis method and image analysis apparatus
Technical field
The present invention relates to an image analysis method and an image analysis apparatus for identifying an object.
Background art
Known methods of identifying an object from captured image data, in which the brightness value of each pixel of a photographed image has been determined, include a method of obtaining the outline of the identification object from halftone information based on the captured image data, and a method of extracting edge information of the object from a shading image based on the image data and performing extraction and position detection of the object from the edge-point sequence (for example, Patent Documents 1 and 2).
Patent Document 1: Japanese Patent Application Laid-Open No. H9-171553
Patent Document 2: Japanese Patent No. 2981382
Summary of the invention
However, in the method of obtaining an outline from brightness gradients shown in Patent Document 1, a clear outline cannot be obtained unless the luminance difference between the object and the background is large enough to produce a sufficient brightness gradient; there is thus the problem that identification fails when the object and the background lack a sufficient luminance difference.
Likewise, the prior art of Patent Document 2 presupposes that edge information is obtained from the object; if there is no sufficient luminance difference between the object and the background, sufficient edge information cannot be obtained, and identification is difficult.
The object of the present invention is to enable stable identification of an object.
The invention of claim 1 is an image analysis method performed by an image analysis apparatus that identifies an identification object based on captured image data of a photographed image containing the identification object and the background around it, the surface of the identification object being formed of a specific texture, the method comprising:
a reference feature quantity measuring step of obtaining, by a plurality of methods, feature quantities of the texture serving as a reference, for a prescribed unit area in a texture region of the photographed image composed only of the specific texture;
a comparative feature quantity measuring step of dividing an extraction region of the photographed image, containing at least the texture region and the background around it, into unit areas of the same size as the unit area of the texture region, and obtaining feature quantities for each unit area by the plurality of methods;
a consistency degree obtaining step of obtaining, for each unit area of the extraction region, an individual consistency degree with the texture region for each feature quantity of each method, and obtaining a synthetic consistency degree by weighting and combining the individual consistency degrees;
a judging step of judging, from the value of the synthetic consistency degree, whether each unit area of the extraction region is the identification object or the background;
a boundary line obtaining step of obtaining the boundary line between the identification object and the background in the extraction region from the plurality of unit areas judged to be the identification object in the judging step; and
an error detecting step of detecting the error between the actual boundary line in the photographed image and the boundary line obtained in the boundary line obtaining step,
wherein the weighting pattern for the individual consistency degrees is changed step by step while the consistency degree obtaining step, the judging step, the boundary line obtaining step and the error detecting step are repeated, thereby determining a weighting pattern that makes the error less than or equal to a threshold value or that minimizes the error,
and the identification object is identified from the photographed image according to that weighting pattern.
The invention of claim 2 has the same configuration as the invention of claim 1, and is characterized in that some or all of the calculation methods for the 14 kinds of feature quantities using the co-occurrence matrix are used as the plurality of methods for obtaining the feature quantities.
The invention of claim 3 is an image analysis apparatus that identifies an identification object based on captured image data of a photographed image containing the identification object and the background around it, the surface of the identification object being formed of a specific texture, the apparatus comprising:
a reference feature quantity measuring unit that obtains, by a plurality of methods, feature quantities of the texture serving as a reference, for a prescribed unit area in a texture region of the photographed image composed only of the specific texture;
a comparative feature quantity measuring unit that divides an extraction region of the photographed image, containing at least the texture region and the background around it, into unit areas of the same size as the unit area of the texture region, and obtains feature quantities for each unit area by the plurality of methods;
a consistency degree obtaining unit that obtains, for each unit area of the extraction region, an individual consistency degree with the texture region for each feature quantity of each method, and obtains a synthetic consistency degree by weighting and combining the individual consistency degrees;
a judging unit that judges, from the value of the synthetic consistency degree, whether each unit area of the extraction region is the identification object or the background;
a boundary line obtaining unit that obtains the boundary line between the identification object and the background in the extraction region from the plurality of unit areas judged to be the identification object by the judging unit; and
an error detecting unit that detects the error between the actual boundary line in the photographed image and the boundary line obtained by the boundary line obtaining unit,
wherein the weighting pattern for the individual consistency degrees is changed step by step while the consistency degree obtaining unit repeatedly obtains the synthetic consistency degree, the judging unit makes the judgment, the boundary line obtaining unit obtains the boundary line and the error detecting unit detects the error, thereby determining a weighting pattern that makes the error less than or equal to a threshold value or that minimizes the error,
and the identification object is identified from the photographed image according to that weighting pattern.
The invention of claim 4 has the same configuration as the invention of claim 3, and is characterized in that some or all of the calculation methods for the 14 kinds of feature quantities using the co-occurrence matrix are used as the plurality of methods for obtaining the feature quantities.
Effects of the invention
According to the inventions of claims 1 and 3, feature quantities are obtained by a plurality of methods for the texture region of the identification object in the captured image data and combined; the boundary line between the identification object and the background is obtained from the unit areas whose feature quantities agree; and the weighting pattern for the consistency degrees of the feature quantities is determined by repeated trials so that the error with respect to the actual boundary position becomes small. The identification object is thus identified from feature quantities based on its intrinsic texture, so that it can be identified reliably regardless of whether there is a luminance difference with the background.
In addition, since an appropriate weighting pattern is determined for the plurality of texture-related methods, differences among the methods in identification accuracy are compensated for, and more accurate identification can be realized.
According to the inventions of claims 2 and 4, the calculation methods of the 14 kinds of feature quantities defined for the co-occurrence matrix are adopted as the methods for obtaining texture feature quantities, so that identification objects of various textures can be identified more reliably.
Description of drawings
Fig. 1 is a plan view showing the whole of the electronic component mounting apparatus according to the present embodiment.
Fig. 2 is a block diagram showing the control system of the electronic component mounting apparatus.
Fig. 3 is a functional block diagram of the image analysis apparatus.
Fig. 4 is a flowchart of the learning data generation processing.
Fig. 5 is an explanatory diagram showing the various regions set in the photographed image of the captured image data.
Fig. 6(A) is an explanatory diagram showing the relation between the texture region and its unit areas, and Fig. 6(B) is an explanatory diagram showing the relation between the extraction region and its unit areas.
Fig. 7(A) is an explanatory diagram showing an example of the brightness values of the pixels of a unit area, Fig. 7(B) is an explanatory diagram of the concept of displacement in the co-occurrence matrix, and Fig. 7(C) is an explanatory diagram showing a co-occurrence matrix.
Fig. 8 is a line graph of the normal distribution of a feature quantity of the texture region.
Fig. 9 is a line graph showing the relation between the error and the consistency degree of the feature quantities.
Fig. 10 is an explanatory diagram showing whether solder flux exists in each unit area.
Fig. 11 is an explanatory diagram in which the real contour line and the generated contour line are superimposed.
Fig. 12 is an explanatory diagram of an example of the photographed image during the actual mounting operation.
Embodiment
(overall configuration of the embodiment)
An embodiment of the present invention will be described with reference to Figs. 1 to 12. The present embodiment shows an example in which an image analysis apparatus 10 is applied to an electronic component mounting apparatus 100. The image analysis apparatus 10 identifies, in a photographed image, an identification object whose surface is formed of a specific texture and its background; in the present embodiment, the solder flux pattern marking the mounting position of an electronic component is identified as the identification object.
(electronic component mounting apparatus)
The electronic component mounting apparatus 100 mounts various electronic components on a substrate and, as shown in Fig. 1, has as its component mounting equipment: a plurality of electronic component feeders 101 that supply the electronic components to be mounted; a feeder holding section 102, serving as an electronic component supply section, that arranges and holds the plurality of electronic component feeders 101; a substrate conveying section 103 that conveys the substrate in a fixed direction; a mounting work section 104 for carrying out the component mounting work on a substrate placed partway along the substrate conveying path of the substrate conveying section 103; a mounting head 106, serving as a component holding unit, that holds a suction nozzle 105 which picks up an electronic component and so holds the component; an X-Y gantry 107, serving as a mounting head moving unit, that drives the mounting head 106 to an arbitrary position within a prescribed range; a CCD camera 108, serving as an imaging unit mounted on the mounting head 106, that photographs the substrate in order to recognize the solder flux pattern for component mounting; and a control device 120 that controls each part of the electronic component mounting apparatus 100.
The image analysis apparatus 10 is built into the control device 120; its purpose is to identify the solder flux pattern from the substrate image photographed by the CCD camera 108 in order to mount electronic components, and to reflect accurate position information in the positioning operation of the mounting head 106.
In the following description, one of two mutually orthogonal directions in the horizontal plane is called the X-axis direction, the other the Y-axis direction, and the vertical direction the Z-axis direction.
The substrate conveying section 103 has a conveyor belt (not shown) by which the substrate is conveyed in the X-axis direction.
As mentioned above, the mounting work section 104, which mounts electronic components on the substrate, is provided partway along the substrate conveying path of the substrate conveying section 103. The substrate conveying section 103 conveys the substrate to the mounting work section 104 and stops, and the substrate is held by a holding mechanism (not shown). That is, the mounting work of electronic components is carried out with the substrate held stably by the holding mechanism.
The mounting head 106 is provided with: a suction nozzle 105 that holds an electronic component at its tip by air suction; a Z-axis motor 111 (see Fig. 2) as a drive source for driving the suction nozzle 105 in the Z-axis direction; and a rotation motor 112 (see Fig. 2) as a rotary drive source for rotating the electronic component held by the suction nozzle 105 about the Z-axis.
Each suction nozzle 105 is connected to a vacuum source; by drawing air in at the tip of the suction nozzle 105, an electronic component is picked up and held.
That is, with the above configuration, during mounting work an electronic component is picked up from the prescribed electronic component feeder 101 by the tip of the suction nozzle 105, the suction nozzle 105 is lowered toward the substrate, and the nozzle is rotated to adjust the orientation of the electronic component while the mounting work is carried out.
The CCD camera 108 is mounted on the mounting head 106 and is positioned over the imaging position on the substrate by driving the mounting head 106 with the X-Y gantry 107.
The X-Y gantry 107 has: an X-axis guide rail 107a that guides the mounting head 106 in the X-axis direction; two Y-axis guide rails 107b that guide the X-axis guide rail 107a, together with the mounting head 106, in the Y-axis direction; an X-axis motor 109 (see Fig. 2) as a drive source for moving the mounting head 106 along the X-axis direction; and a Y-axis motor 110 (see Fig. 2) as a drive source for moving the mounting head 106, via the X-axis guide rail 107a, in the Y-axis direction. By driving the motors 109 and 110, the mounting head 106 can be conveyed over almost the whole region formed between the two Y-axis guide rails 107b.
Each motor is controlled to the desired rotation amount, with the control device 120 recognizing the respective rotation amounts, so that the suction nozzle 105 and the CCD camera 108 are positioned via the mounting head 106.
The feeder holding section 102 and the mounting work section 104 are both arranged within the conveying region of the mounting head 106 formed by the X-Y gantry 107, so that the mounting head can reach the electronic components as required.
The feeder holding section 102 has a flat portion along the X-Y plane, on which a plurality of electronic component feeders 101 are mounted side by side along the X-axis direction.
The CCD camera 108 is held facing downward from the mounting head 106, and recognizes the solder flux pattern by photographing the substrate from above.
(control device)
Fig. 2 is a block diagram showing the control system of the electronic component mounting apparatus 100. As shown in Fig. 2, the control device 120 mainly has: a CPU 121 that controls the operation of the X-axis motor 109 and the Y-axis motor 110 of the X-Y gantry 107, the Z-axis motor 111 that raises and lowers the suction nozzle 105 in the mounting head 106, the rotation motor 112 that rotates the suction nozzle 105, and the CCD camera 108, and that performs various processing and control according to prescribed control programs; a ROM 122 that stores the programs for the various processing and control; a RAM 123 that stores various data and serves as the working area for the various processing; an I/F (interface, not shown) that connects the CPU 121 with the various devices; an operation panel 124 for entering the data required for various settings and operations; an EEPROM 126 that stores the setting data for the various processing and control; and a display 125 that shows the contents of various settings, inspection results described later, and so on. The motors 109 to 112 are connected to the control device 120 via motor drivers (not shown).
(image analysis apparatus)
Fig. 3 is a functional block diagram of the image analysis apparatus 10. Each function of the image analysis apparatus 10 described below is actually realized by the CPU 121 of the control device 120 executing prescribed programs.
The image analysis apparatus 10 has: a main control section 11 that performs overall control; an image input section 20 that acquires captured image data from the output of the CCD camera 108; an image dividing section 30 that divides the image into rectangular image regions based on the captured image data; an error determination point input section 40 that sets, as error determination points, a plurality of points specified by the operator with the operation panel 124 on the boundary line between the solder flux pattern and its background in the photographed image; a feature quantity extracting section 50 that extracts the feature quantities of the texture of the solder flux pattern; a feature quantity synthesizing section 60 that determines the parameters (described later) used to combine the extracted feature quantities and combines them; and an error detecting section 70 that detects the error between the error determination points and the output shape and makes the judgment.
With this configuration, before the mounting operation of electronic components, the image analysis apparatus 10 performs learning data generation processing in advance to obtain correct parameters for combining the feature quantities of the solder flux pattern; during the mounting operation, it identifies the solder flux pattern from the photographed image of the CCD camera 108 using the parameters obtained in the learning data generation processing.
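The learn-then-identify flow just described can be sketched as a search over weighting patterns. The following is a minimal sketch only: the function name, the explicit list of candidate patterns, and the use of a mismatch count as a stand-in for the boundary-line error are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def optimize_weights(individual_scores, truth_labels, candidate_patterns, threshold=0.5):
    """For each candidate weighting pattern, synthesize the per-unit
    consistency degrees, judge object/background, and keep the pattern
    whose judgment deviates least from the labelling derived from the
    operator-specified boundary (an error proxy)."""
    best_w, best_err = None, float("inf")
    for w in candidate_patterns:
        E = individual_scores @ np.asarray(w, dtype=float)  # synthetic consistency degree
        judged = (E >= threshold).astype(int)               # 1 = object, 0 = background
        err = int(np.sum(judged != truth_labels))           # boundary-error proxy
        if err < best_err:
            best_w, best_err = w, err
    return best_w, best_err
```

In the patent the error is measured against the actual boundary line at the error determination points and the pattern is refined step by step; the exhaustive loop above is only the simplest form of that search.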
(learning data generation processing)
Next, the learning data generation processing for the texture of the solder flux pattern, performed by the image analysis apparatus 10, will be described following the sequence shown in the flowchart of Fig. 4.
First, the substrate including the solder flux pattern is photographed by the CCD camera 108 through the image input section 20, and the image is stored as captured image data (step S1).
Next, the operator enters error determination points G; the error determination point input section 40 processes the input and records the positions in the image specified by this input operation (step S2).
Fig. 5 shows the image photographed by the CCD camera 108; the outermost rectangular area A1 represents the extent of the photographed image, and H the photographed solder flux pattern.
Since the surface of the solder flux is uniformly covered with the texture peculiar to solder flux, during the mounting operation of electronic components the feature quantities of this texture are obtained to distinguish it from the background, the contour line R is obtained as the boundary line between the solder flux pattern H and the background, and component mounting is carried out. In this learning data generation processing, in order to obtain the correct parameters (described later) needed to recognize the contour line R from the photographed image, the operator visually identifies the contour line R in the image shown on the display 125 and enters an arbitrary plurality of positions on the contour line (six in this example) as error determination points G, specifying each position from the operation panel 124, for example by cursor operation; in the image analysis apparatus 10, the position coordinates of the entered error determination points G within the photographed image region are stored in the RAM 123.
Next, the operator enters the texture region A2 used to obtain the feature quantities of the solder flux texture; the feature quantity extracting section 50 processes the input and records the position and extent of the region specified by this input operation (step S3).
There is no size restriction on the texture region A2, but a range that can yield at least 30 of the 8 × 8 pixel unit areas A3 described later is preferable; in addition, the texture region A2 must not contain anything other than solder flux.
That is, as with the input of the error determination points G, the operator specifies the texture region A2 based on the image shown on the display 125, for example by specifying its four corners by cursor operation on the operation panel 124; in the image analysis apparatus 10, the coordinates representing the position and extent of the entered texture region A2 within the photographed image region are stored in the RAM 123.
Next, the feature quantity extracting section 50 obtains the image feature quantities of the texture from the entered texture region A2 (step S4: reference feature quantity measuring step). In this processing, the feature quantity extracting section 50 functions as the "reference feature quantity measuring unit".
A co-occurrence matrix is used to obtain the image feature quantities of the texture. The co-occurrence matrix is one of the statistical methods of calculating texture features; using the co-occurrence matrix defined below, feature quantities can be obtained for 14 kinds of elements such as contrast, one of the spatial textures of an image, and local homogeneity.
First, to generate the co-occurrence matrices, as shown in Fig. 6(A), the texture region A2 is divided by horizontal dividing lines LH and vertical dividing lines LV to cut out square unit areas A3 of 8 × 8 pixels. As mentioned above, at least 30 unit areas A3 (48 in the figure) are sufficient. Since, as described later, the feature quantities of the unit areas A3 are averaged and their standard deviation is obtained, preparing 30 (48 in the figure) unit areas A3 is a statistically sufficient number.
A co-occurrence matrix is then generated for each unit area A3.
Taking the 8 × 8 unit area A3 shown in Fig. 7(A) as an example, as shown in Fig. 7(B), let i be the brightness value of a pixel of the image (32 gray levels: i = 0 to 31) and j the brightness value of the pixel at displacement δ = (r, θ), i.e. at distance r in the direction of angle θ; the matrix shown in Fig. 7(C), whose elements are the probabilities P_δ(i, j), is the co-occurrence matrix. In the present embodiment, the element in row i, column j holds the number of pixels whose brightness value is j at the given displacement from a pixel of brightness value i. Also, in the present embodiment, one co-occurrence matrix is generated for the two directions θ = 0° and θ = 180° with r = 1; likewise, co-occurrence matrices are generated for θ = 45° (225°), 90° (270°) and 135° (315°).
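As a rough illustration of the co-occurrence matrix just defined, the following sketch counts pixel pairs of a patch at a given displacement, accumulating both θ and θ + 180° as in the text. The count (rather than probability) convention follows the embodiment's description of the matrix elements; the function name and everything else are assumptions.

```python
import numpy as np

def cooccurrence_matrix(patch, dr, dc, levels=32):
    """Grey-level co-occurrence matrix of a patch: entry (i, j) counts pixel
    pairs at displacement (dr, dc) whose brightness values are i and j.
    The 0°/180° pairing is obtained by accumulating (dr, dc) and (-dr, -dc)."""
    m = np.zeros((levels, levels), dtype=int)
    rows, cols = patch.shape
    for r in range(rows):
        for c in range(cols):
            for sr, sc in ((dr, dc), (-dr, -dc)):
                r2, c2 = r + sr, c + sc
                if 0 <= r2 < rows and 0 <= c2 < cols:
                    m[patch[r, c], patch[r2, c2]] += 1
    return m

# e.g. θ = 0°/180° with r = 1 corresponds to displacement (0, 1) plus its opposite
```

Because both directions are accumulated, the resulting matrix is symmetric, which matches generating one matrix for the θ and θ + 180° pair.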
Then, for the co-occurrence matrix of each unit area A3, the 14 kinds of feature quantities f_i (i = 1 to 14) are calculated using formulas (1) to (14) below. The values f_i are real numbers; for example, f1 is the angular second moment, f2 the contrast, f9 the entropy, and so on. As is well known, the f_i values have the ability to distinguish textures. Since the meanings of these texture feature quantities are described in detail in the Image Analysis Handbook (新編 画像解析ハンドブック, supervised by Mikio Takagi and Haruhisa Shimoda), their explanation is omitted here.
[Math 1]

f1: angular second moment
\( f_1 = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1} \{P_\delta(i,j)\}^2 \)  … (1)

f2: contrast
\( f_2 = \sum_{k=0}^{n-1} k^2 \Bigl\{ \sum_{|i-j|=k} P_\delta(i,j) \Bigr\} \)  … (2)

f3: correlation
\( f_3 = \dfrac{\sum_{i=0}^{n-1}\sum_{j=0}^{n-1} i\,j\,P_\delta(i,j) - \mu_x\mu_y}{\sigma_x\,\sigma_y} \)  … (3)

where
\( \mu_x = \sum_{i=0}^{n-1} i\,P_x(i), \quad \mu_y = \sum_{j=0}^{n-1} j\,P_y(j) \)
\( \sigma_x^2 = \sum_{i=0}^{n-1} (i-\mu_x)^2 P_x(i), \quad \sigma_y^2 = \sum_{j=0}^{n-1} (j-\mu_y)^2 P_y(j) \)

f4: sum of squares: variance
\( f_4 = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1} (i-\mu)^2 P_\delta(i,j) \)  … (4)

f5: inverse difference moment
\( f_5 = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1} \dfrac{P_\delta(i,j)}{1+(i-j)^2} \)  … (5)

f6: sum average
\( f_6 = \sum_{k=0}^{2n-2} k\,P_{x+y}(k) \)  … (6)

[Math 2]

f7: sum variance
\( f_7 = \sum_{k=0}^{2n-2} (k-f_6)^2\,P_{x+y}(k) \)  … (7)

f8: sum entropy
\( f_8 = -\sum_{k=0}^{2n-2} P_{x+y}(k)\,\log\{P_{x+y}(k)\} \)  … (8)

f9: entropy
\( f_9 = -\sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_\delta(i,j)\,\log\{P_\delta(i,j)\} \)  … (9)

f10: difference variance
\( f_{10} = \sum_{k=0}^{n-1} \Bigl(k - \sum_{l=0}^{n-1} l\,P_{x-y}(l)\Bigr)^2 P_{x-y}(k) \)  … (10)

f11: difference entropy
\( f_{11} = -\sum_{k=0}^{n-1} P_{x-y}(k)\,\log\{P_{x-y}(k)\} \)  … (11)

f12: information measures of correlation
\( f_{12} = \dfrac{HXY - HXY1}{\max\{HX,\,HY\}} \)  … (12)

f13: information measures of correlation
\( f_{13} = [\,1 - \exp\{-2.0\,(HXY2 - HXY)\}\,]^{1/2} \)  … (13)

where
\( HXY = -\sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_\delta(i,j)\,\log\{P_\delta(i,j)\} \)
\( HX = -\sum_{i=0}^{n-1} P_x(i)\,\log\{P_x(i)\}, \quad HY = -\sum_{j=0}^{n-1} P_y(j)\,\log\{P_y(j)\} \)
\( HXY1 = -\sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_\delta(i,j)\,\log\{P_x(i)\,P_y(j)\} \)
\( HXY2 = -\sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_x(i)\,P_y(j)\,\log\{P_x(i)\,P_y(j)\} \)

f14: maximal correlation coefficient
\( f_{14} = (\text{second largest eigenvalue of } Q)^{1/2} \)  … (14)

where
\( Q(i,j) = \sum_{k=0}^{n-1} \dfrac{P_\delta(i,k)\,P_\delta(k,j)}{P_x(i)\,P_y(j)} \)
In the above formulas, n = 32 (the number of gray levels; the maximum brightness value is n − 1 = 31), and the definitions of the following formulas (15) to (18) apply.

[Math 3]

\( P_x(i) = \sum_{j=0}^{n-1} P_\delta(i,j), \quad i = 0,1,\ldots,n-1 \)  … (15)
\( P_y(j) = \sum_{i=0}^{n-1} P_\delta(i,j), \quad j = 0,1,\ldots,n-1 \)  … (16)
\( P_{x+y}(k) = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_\delta(i,j) \ \ (i+j=k), \quad k = 0,1,\ldots,2n-2 \)  … (17)
\( P_{x-y}(k) = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1} P_\delta(i,j) \ \ (|i-j|=k), \quad k = 0,1,\ldots,n-1 \)  … (18)
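To make the notation concrete, here is a minimal sketch of three of the fourteen feature quantities (f1, f2 and f9) computed from a co-occurrence count matrix normalized to P_δ(i, j); the remaining features follow the same pattern from the definitions above. The function name is illustrative.

```python
import numpy as np

def haralick_subset(counts):
    """f1 (angular second moment), f2 (contrast) and f9 (entropy) from a
    co-occurrence count matrix; P is the normalized matrix P_delta(i, j)."""
    P = counts / counts.sum()
    i, j = np.indices(P.shape)
    f1 = float(np.sum(P ** 2))               # (1) angular second moment
    f2 = float(np.sum(((i - j) ** 2) * P))   # (2) contrast, summed directly over (i, j)
    nz = P[P > 0]                            # convention: 0·log 0 = 0
    f9 = float(-np.sum(nz * np.log(nz)))     # (9) entropy
    return f1, f2, f9
```

Note that the contrast sum over k with |i − j| = k in formula (2) is equivalent to summing (i − j)² P_δ(i, j) directly over all (i, j), which is what the sketch does.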
Through the above processing, once the feature quantity values f_i (i = 1 to 14) have been obtained for all unit areas A3, the average μ_i and the standard deviation σ_i over all unit areas A3 are calculated for each kind of feature quantity (each value of i) and stored in the RAM 123 or the EEPROM 126 as the learned f_i values, i.e. the feature quantities of the texture region A2. The feature quantities f_i of the texture region A2 generally follow a normal distribution, as shown in Fig. 8.
Next, the feature quantity synthesizing section 60 initializes the synthesis parameters for the feature quantities (step S5).
For each of the feature quantities f_i of formulas (1) to (14), the subsequent processing obtains a consistency degree E with the feature quantity calculated in the same way for unit areas of the image outside the texture region; the consistency degrees E obtained for the various feature quantities (each i) are then individually weighted and combined. The synthesis parameters are the combination of the coefficients w_i (i = 1 to 14) by which the consistency degrees E of the various feature quantities are multiplied for weighting.
The synthesis parameters are defined so that w1 + w2 + w3 + … + w14 = 1. On initialization, w1 = 1 and the others (w2, w3, w4, …) are 0; that is, at first only the consistency degree E related to the feature quantity f1 is obtained.
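The weighting constraint just stated can be sketched as follows. The function names are assumptions, but the initial pattern (w1 = 1, the rest 0) and the constraint that the weights sum to 1 are as described.

```python
import numpy as np

N_FEATURES = 14  # one weight per feature quantity f1..f14

def initial_weights():
    """Initial synthesis parameters: w1 = 1, w2..w14 = 0 (the sum is 1)."""
    w = np.zeros(N_FEATURES)
    w[0] = 1.0
    return w

def synthesize(e, w):
    """Synthetic consistency degree E = sum_i w_i * e_i, under sum(w) = 1."""
    w = np.asarray(w, dtype=float)
    assert abs(w.sum() - 1.0) < 1e-9, "weights must sum to 1"
    return float(np.dot(e, w))
```

With the sum constrained to 1, E stays on the same scale as the individual consistency degrees e_i, so the later object/background threshold does not need rescaling as the pattern changes.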
Next, a new extraction region A4 surrounding the entire solder is set in the input image 5.

When the operator inputs the new extraction region A4 surrounding the entire solder, the feature combining unit 60 records the position and extent of the region designated by this input operation (step S6); based on the combining parameters, this region is used to judge whether each part has the solder texture and to form the contour line of the solder pattern.

The extraction region A4 may have any size as long as it surrounds the solder pattern; however, as shown in Fig. 5, it is required to cover all of the error measurement points G described above, and it preferably encloses the solder pattern closely. The input method is the same as for the texture region A2.
Then, as shown in Fig. 6(B), the feature combining unit 60 divides the extraction region A4 into unit areas A5 of 8 × 8 pixels, in the same manner as for the texture region A2 (step S7: comparison feature obtaining step).

For each unit area A5 of the extraction region A4, the feature values fi given by the 14 formulas above are calculated. At this time, only the feature values for which wi ≠ 0 in the combining parameters are calculated. Here, the feature combining unit 60 functions as the "comparison feature obtaining unit".
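As an illustrative sketch (not the patent's implementation), the per-unit-area computation of step S7 might look as follows. Only f1 (angular second moment) and f2 (contrast) of the 14 formulas are shown, and the displacement δ, the number of brightness levels, and all names are assumptions:

```python
import numpy as np

def cooccurrence(tile, delta=(0, 1), levels=32):
    """Normalized co-occurrence matrix P_delta of one 8x8 unit area."""
    di, dj = delta
    P = np.zeros((levels, levels))
    h, w = tile.shape
    for i in range(h):
        for j in range(w):
            i2, j2 = i + di, j + dj
            if 0 <= i2 < h and 0 <= j2 < w:
                P[tile[i, j], tile[i2, j2]] += 1
    s = P.sum()
    return P / s if s else P

def tile_features(tile, w):
    """Feature values of one unit area; only the f_i with w_i != 0
    are computed, as in step S7 (just f1 and f2 sketched here)."""
    P = cooccurrence(tile)
    feats = {}
    if w.get(1, 0) != 0:
        feats[1] = float((P ** 2).sum())            # f1: angular second moment
    if w.get(2, 0) != 0:
        i, j = np.indices(P.shape)
        feats[2] = float(((i - j) ** 2 * P).sum())  # f2: contrast
    return feats
```

Skipping the features whose coefficient is zero, as the text describes, keeps the per-tile cost proportional to the number of active formulas.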
Next, the feature combining unit 60 calculates the degree of coincidence E between each unit area A5 of the extraction region A4 and the learned fi values of the unit areas A3 of the texture region A2 (step S8: coincidence obtaining step).

That is, for each feature value fi of a unit area A5, the error δi between fi and the mean μi of the corresponding learned fi value is calculated, and from it an individual degree of coincidence ei is calculated for each feature value fi of the unit area A5.
Each individual degree of coincidence ei is calculated with the function eval shown in Fig. 9, whose vertical axis is ei (maximum value 1) and whose horizontal axis is the error δi. The function eval need only be monotonically decreasing; how its slope varies is arbitrary. A monotonically decreasing straight line would suffice, but here a curve is adopted whose slope approaches 0 as ei approaches 0 and 1 and is steepest in between. That is, ei satisfies 0 ≤ ei ≤ 1, with ei = 1 when δi = 0, ei = 0 when δi ≥ βσi, and monotone decrease in between. Here σi is the standard deviation of the learned fi values, and β is a coefficient governing how the error δi between the fi value and the mean μi of the learned fi values is evaluated: the larger β is, the more readily an area is regarded as the texture. In the present embodiment, β = 1.5, for example.

In this way, the individual degree of coincidence ei is calculated for each feature value fi of each unit area A5. Then each ei is multiplied by the corresponding coefficient wi of the combining parameters and the products are summed, yielding the combined degree of coincidence E (combined coincidence) for the unit area A5.

Here, the feature combining unit 60 functions as the "coincidence obtaining unit".
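A minimal sketch of the individual coincidence ei and the combined coincidence E. The smoothstep curve below is one possible choice for the function eval of Fig. 9; the patent only requires a monotone decrease with gentle slope at both ends, so the exact curve is an assumption:

```python
import numpy as np

def eval_coincidence(delta, sigma, beta=1.5):
    """e_i = 1 at delta = 0, 0 at delta >= beta * sigma, monotonically
    decreasing in between, slope near 0 at both ends (smoothstep)."""
    t = np.clip(delta / (beta * sigma), 0.0, 1.0)
    return 1.0 - (3 * t ** 2 - 2 * t ** 3)

def combined_coincidence(e, w):
    """E = sum_i w_i * e_i, with the weights normalized to sum to 1."""
    return float(np.dot(w, e))
```

Because the weights sum to 1 and each ei lies in [0, 1], the combined E also lies in [0, 1], which is what makes a fixed threshold C meaningful in step S9.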
Next, based on the degree of coincidence E of each unit area A5 of the extraction region A4, the feature combining unit 60 judges whether each unit area A5 is solder (step S9: judging step).

This judgment uses a coincidence threshold C prepared in advance and checks whether the degree of coincidence E of each unit area is greater than or equal to it: if E ≥ C, the unit area is solder; otherwise it is background.

The value of the threshold C need only be set appropriately (e.g., determined by experiment); here, for example, C = 0.8.

Here, the feature combining unit 60 functions as the "judging unit".

From the judgment results based on the degrees of coincidence E of the unit areas A5 of the extraction region A4, a binary image (1 = solder, 0 = background) is obtained, as shown in Fig. 10, which distinguishes solder from background in units of the unit areas A5 (the solder unit areas A5 are shown hatched).
Correspondingly, the error detecting unit 70 connects the centers of those solder unit areas A5 that are adjacent to background unit areas A5, thereby generating the contour line RA of the solder pattern (step S10: boundary obtaining step).

Here, the error detecting unit 70 functions as the "boundary obtaining unit".
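Step S10 can be sketched as follows, assuming the binary result of Fig. 10 is stored as a 2-D array of 1 = solder, 0 = background (the function name and the 4-neighbourhood choice are assumptions):

```python
import numpy as np

def boundary_cell_centers(binary, cell=8):
    """Centers (in pixels) of solder unit areas adjacent to a background
    unit area; connecting these centers yields the contour RA (step S10)."""
    h, w = binary.shape
    centers = []
    for r in range(h):
        for c in range(w):
            if binary[r, c] != 1:
                continue
            # 4-neighbourhood; cells outside the map count as background
            neighbours = [
                binary[r2, c2] if 0 <= r2 < h and 0 <= c2 < w else 0
                for r2, c2 in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            ]
            if 0 in neighbours:
                centers.append((r * cell + cell // 2, c * cell + cell // 2))
    return centers
```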
The error detecting unit 70 then detects the error D of the generated contour line RA with respect to the actual contour line R (step S11: error detecting step).

Fig. 11 shows the actual contour line R overlaid on the generated contour line RA. The error D is calculated from the error measurement points G registered in step S2. That is, for each error measurement point G, the shortest distance d (in pixels) to the generated contour line RA is calculated; if the shortest distances thus obtained are d1, d2, …, then D = max{d1, d2, …}.

Here, the error detecting unit 70 functions as the "error detecting unit".
Next, the error detecting unit 70 compares the error D with a preset reference value α (step S12). If D is less than or equal to α, the current setting of the combining parameters is considered to derive degrees of coincidence that correctly distinguish solder from background, and the processing ends. The reference value α is thus the criterion for judging whether the combining parameters are appropriate: if it is set large, suitable combining parameters are easily obtained, but the recognition accuracy of the solder pattern falls; if it is set small, suitable combining parameters are harder to obtain, but the recognition accuracy of the solder pattern improves. It must therefore be set to a value suited to the purpose, e.g., by experiment.

On the other hand, if the error D exceeds the reference value α in the judgment of step S12, the current setting of the combining parameters is deemed inappropriate, and the setting is changed (step S13).

That is, the coefficients wi of the degrees of coincidence corresponding to the respective feature values fi are changed independently of one another, for example in six levels in steps of 0.2 (0, 0.2, 0.4, 0.6, 0.8, 1.0); a finer division may also be used. In addition, as described above, the coefficients wi as a whole are normalized. In the present embodiment, the parameters are varied over the entire product space w1 × w2 × … × w14.
Once new combining parameters have been determined, the processing returns to step S8: the degrees of coincidence E of the unit areas A5 of the extraction region A4 are obtained again, the solder unit areas A5 are determined from them, the contour line RA is regenerated, and the error D is again compared with the reference value α.

This processing is repeated until combining parameters for which the error D is less than or equal to the reference value α are obtained.
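The trial loop of steps S8 to S13 amounts to a search over the normalized product space of the coefficients. A brute-force sketch, with 3 features instead of 14 purely for brevity, where `error_of` stands in for the whole S8 to S11 pipeline:

```python
import itertools
import numpy as np

def search_weights(error_of, step=0.2, n_feats=3, alpha=1.0):
    """Exhaustive trial over the product space of coefficients w_i in
    steps of `step`, keeping only combinations that sum to 1 (the
    normalization constraint).  Returns the first w with D <= alpha,
    else the w with minimum D."""
    grid = np.arange(0.0, 1.0 + step / 2, step)
    best_w, best_D = None, np.inf
    for w in itertools.product(grid, repeat=n_feats):
        if abs(sum(w) - 1.0) > 1e-9:
            continue                 # enforce w1 + ... + wn = 1
        D = error_of(np.array(w))
        if D <= alpha:
            return np.array(w), D    # D <= alpha: terminate the trials
        if D < best_D:
            best_w, best_D = np.array(w), D
    return best_w, best_D
```

With 14 features and 6 levels per coefficient the full product space is very large, which is why restricting or pre-screening the features (as the embodiment later suggests using background unit areas) matters in practice.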
When the error D becomes less than or equal to the reference value α in step S12, the combination of values of the coefficients wi of the combining parameters at that time (for example, the set of values w6 = 0.5, w8 = 0.5, all other wi = 0) is recorded as the learning data (learned combining parameters).

The learning-data generation process then ends.
(Processing during electronic component mounting)

Next, the processing performed when an electronic component is mounted is described.

With an electronic component held by suction on the suction nozzle 106 and the mounting head 105 moved near the mounting position on the board, the CCD camera 108 images the mounting position on the board, and captured image data is obtained.

Then, the feature combining unit 60 takes out part of the captured image (for example, the central portion) as the extraction region and performs the same processing as steps S7 to S10 above.

That is, the extraction region of the captured image is divided into unit areas; the feature values are calculated for those indices i whose coefficients wi of the degrees of coincidence ei are nonzero; and, using the combining parameters obtained in the learning-data generation process, the degree of coincidence of each unit area with the solder is calculated. Then, the contour line is calculated from the unit areas judged to be solder. Since the solder pattern enclosed by this contour line is the mounting position, the mounting head 105 is moved so as to position the suction nozzle 106 over the position enclosed by the contour line, and the electronic component is mounted.
(Effects of the embodiment)

In the image analysis apparatus 10 described above, feature values are obtained from the texture region A2 of the solder H in the captured image data by the 14 formulas of the co-occurrence matrix and combined; the boundary between the solder H (the identification object) and the background is obtained from the unit areas A5 whose feature values coincide; and the weighting pattern applied to the degrees of coincidence of the feature values is determined by repeated trials so that the error from the actual boundary position becomes small. The solder H is thus recognized from feature values based on its intrinsic texture, so the identification object can be recognized effectively regardless of whether there is a brightness difference from the background.

Thus, as shown for example in Fig. 12, even when the imaging area A1 contains a board K, wiring patterns N, and pads U whose textures differ from that of the solder H, the contour line RA representing the shape of the solder can be recognized reliably.

In addition, since appropriate combining parameters are determined for the 14 texture-related formulas of the co-occurrence matrix, the differences in recognition accuracy among the individual methods are excluded, and more accurate recognition can be achieved.

Further, since the calculation methods of the 14 kinds of co-occurrence-matrix feature values are used as the methods for obtaining the texture feature values, identification objects of various textures can be recognized more effectively.
(Others)

In the image analysis apparatus 10 above, methods derived from the co-occurrence matrix are used to calculate the feature values of the solder texture, but any other method capable of characterizing the texture may be used, and the co-occurrence-matrix methods may be combined with other methods. For example, the texture may be analyzed using the frequencies obtained by Fourier-transforming the region image, intensity histograms, run-length matrices, and the fractal dimension.

The same effects as in the embodiment above can also be obtained in this case.

In addition, the learning-data generation process in the image analysis apparatus 10 above uses only the texture region A2 and not the image information of the background region around the solder, but the latter may also be used. In this case, for example, a unit area A6 is taken from the background at each error measurement point (see Fig. 5), and the fi values are obtained from these unit areas A6. By comparing them with the fi values of the texture region A2, the feature values can then be separated into those whose fi values differ greatly and those whose differences are small. As a result, the calculation need not be repeated over the entire product space of the coefficients wi, so the number of iterations is reduced and correct combining parameters can be obtained quickly.
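The background-based pre-screening just described might be sketched like this. All names, the separation score, and the two-standard-deviation threshold are assumptions; the patent only describes the idea of comparing fi values between texture and background:

```python
import numpy as np

def prescreen_features(mu_texture, mu_background, sigma_texture, thresh=2.0):
    """Keep only the features i (1-based) whose texture mean and background
    mean differ by more than `thresh` texture standard deviations, so the
    trial loop need not sweep the full product space of coefficients w_i."""
    mu_t = np.asarray(mu_texture, dtype=float)
    mu_b = np.asarray(mu_background, dtype=float)
    s = np.asarray(sigma_texture, dtype=float)
    separation = np.abs(mu_t - mu_b) / np.where(s > 0, s, 1.0)
    return [i + 1 for i in range(len(mu_t)) if separation[i] > thresh]
```

Features that cannot separate texture from background can then have their coefficients fixed at 0 before the trial loop begins, shrinking the search space.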
In addition, in the learning-data generation process of the image analysis apparatus 10 above, the trial judgment of step S12 compares the error D with the reference value α as the termination condition, but the error D may instead be displayed on the display device 125 and the operator may judge whether to continue the trials. The same effects as in the embodiment above can also be obtained in this case.

Further, in the trial judgment of step S12, the termination condition is that the error D falls below the reference value α, but this is not restrictive: a prescribed number of repeated trials may be set, and the combining parameters that give the minimum error within that number may be taken as the learning data. In this case substantially the same effects as in the embodiment above can be obtained, and the processing can be made faster.

The error measurement points G are not limited to the six points described above; there may be as few as one, or more may be provided. The same effects as in the embodiment above can also be obtained in this case.

Further, the quantity used as the error D is not limited to the shortest distances d between the contour lines RA and R. For example, the number of error measurement points G may be increased, or the areas and centroids of the regions obtained for the solder may be compared. The same effects as in the embodiment above can also be obtained in this case.

Claims (4)

1. An image analysis method performed by an image analysis apparatus that recognizes an identification object on the basis of captured image data of a captured image containing the identification object and the background around the identification object, the surface of the identification object consisting of a specific texture, the method comprising:
a reference feature obtaining step of obtaining, by a plurality of methods, feature values of the texture serving as a reference, for a prescribed unit area in a texture region of the captured image consisting only of the specific texture;
a comparison feature obtaining step of dividing an extraction region of the captured image, containing at least the texture region and the background around the texture region, into unit areas of the same size as the unit area of the texture region, and obtaining feature values for each of the unit areas by the plurality of methods;
a coincidence obtaining step of obtaining, for each unit area of the extraction region, an individual degree of coincidence with the texture region for each feature value of each of the methods, and obtaining a combined degree of coincidence by weighting and combining the individual degrees of coincidence;
a judging step of judging, from the value of the combined degree of coincidence, whether each unit area of the extraction region is the identification object or the background;
a boundary obtaining step of obtaining the boundary line between the identification object and the background in the extraction region from the plurality of unit areas judged in the judging step to be the identification object; and
an error detecting step of detecting the error between the actual boundary line in the captured image and the boundary line obtained in the boundary obtaining step,
wherein the coincidence obtaining step, the judging step, the boundary obtaining step and the error detecting step are repeated while the weighting pattern for the individual degrees of coincidence is changed step by step, thereby determining a weighting pattern that makes the error less than or equal to a threshold or makes the error minimum, and
the identification object is recognized from the captured image in accordance with the determined weighting pattern.
2. The image analysis method according to claim 1, wherein
some or all of the calculation methods of the 14 kinds of feature values of the co-occurrence matrix are used as the plurality of methods for obtaining the feature values.
3. An image analysis apparatus that recognizes an identification object on the basis of captured image data of a captured image containing the identification object and the background around the identification object, the surface of the identification object consisting of a specific texture, the apparatus comprising:
a reference feature obtaining unit that obtains, by a plurality of methods, feature values of the texture serving as a reference, for a prescribed unit area in a texture region of the captured image consisting only of the specific texture;
a comparison feature obtaining unit that divides an extraction region of the captured image, containing at least the texture region and the background around the texture region, into unit areas of the same size as the unit area of the texture region, and obtains feature values for each of the unit areas by the plurality of methods;
a coincidence obtaining unit that obtains, for each unit area of the extraction region, an individual degree of coincidence with the texture region for each feature value of each of the methods, and obtains a combined degree of coincidence by weighting and combining the individual degrees of coincidence;
a judging unit that judges, from the value of the combined degree of coincidence, whether each unit area of the extraction region is the identification object or the background;
a boundary obtaining unit that obtains the boundary line between the identification object and the background in the extraction region from the plurality of unit areas judged by the judging unit to be the identification object; and
an error detecting unit that detects the error between the actual boundary line in the captured image and the boundary line obtained by the boundary obtaining unit,
wherein the obtaining of the combined degree of coincidence by the coincidence obtaining unit, the judgment by the judging unit, the obtaining of the boundary line by the boundary obtaining unit and the error detection by the error detecting unit are repeated while the weighting pattern for the individual degrees of coincidence is changed step by step, thereby determining a weighting pattern that makes the error less than or equal to a threshold or makes the error minimum, and
the identification object is recognized from the captured image in accordance with the determined weighting pattern.
4. The image analysis apparatus according to claim 3, wherein
some or all of the calculation methods of the 14 kinds of feature values of the co-occurrence matrix are used as the plurality of methods for obtaining the feature values.
CN2010102631700A 2009-08-21 2010-08-20 Image analysis method and image analysis device Pending CN101996404A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009191904A JP2011043998A (en) 2009-08-21 2009-08-21 Method and apparatus for analyzing image
JP2009-191904 2009-08-21

Publications (1)

Publication Number Publication Date
CN101996404A true CN101996404A (en) 2011-03-30

Family

ID=43786523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010102631700A Pending CN101996404A (en) 2009-08-21 2010-08-20 Image analysis method and image analysis device

Country Status (2)

Country Link
JP (1) JP2011043998A (en)
CN (1) CN101996404A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103578110A (en) * 2013-11-12 2014-02-12 河海大学 Multi-band high-resolution remote sensing image segmentation method based on gray scale co-occurrence matrix

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5648600B2 (en) * 2011-06-17 2015-01-07 株式会社デンソー Image processing device
JP6365010B2 (en) * 2014-06-30 2018-08-01 富士ゼロックス株式会社 Learning program and information processing apparatus
JP2017026565A (en) * 2015-07-28 2017-02-02 株式会社島津製作所 Inspection device and inspection method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103578110A (en) * 2013-11-12 2014-02-12 河海大学 Multi-band high-resolution remote sensing image segmentation method based on gray scale co-occurrence matrix
CN103578110B (en) * 2013-11-12 2016-06-08 河海大学 Multiband high-resolution remote sensing image dividing method based on gray level co-occurrence matrixes

Also Published As

Publication number Publication date
JP2011043998A (en) 2011-03-03


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110330