CN1551036A - Part recognizing method and device - Google Patents

Part recognizing method and device

Info

Publication number
CN1551036A
CN1551036A, CNA2004100351305A, CN200410035130A
Authority
CN
China
Prior art keywords
angle
component
terminal
pickup
corner
Prior art date
Legal status
Granted
Application number
CNA2004100351305A
Other languages
Chinese (zh)
Other versions
CN1319012C (en)
Inventor
藤江公子
安部好晃
Current Assignee
Juki Corp
Original Assignee
Juki Corp
Priority date
Filing date
Publication date
Application filed by Juki Corp
Publication of CN1551036A
Application granted
Publication of CN1319012C
Anticipated expiration
Status: Expired - Fee Related

Links

Images

Landscapes

  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Supply And Installment Of Electrical Components (AREA)

Abstract

Provided are a component recognition method and a component recognition device that detect the actual pickup angle of a component so that a deviation between the actual pickup angle and the specified one can be detected reliably. Scanning along the diagonals connecting the component corners C1-C4 obtained by image processing of the component detects the presence or absence of corner terminals 30a, 30b and 30c at each corner. The corners of the component at which corner terminals exist when the direction of one side of the component is at a reference angle are recorded in advance, and the corners at which corner terminals are detected are compared with the prerecorded corners at which corner terminals exist, thereby detecting the deviation between the pickup angle of the component and the reference angle.

Description

Component recognition method and apparatus
Technical field
The present invention relates to a component recognition method and apparatus, and more particularly to a method and apparatus that recognize the pickup orientation (posture) of a component by imaging a component having corner terminals and processing the image.
Background art
Patent document 1: Japanese Unexamined Patent Application Publication No. H6-223185 (claim 1)
Conventionally, in a component mounting machine, in order to mount electronic components (hereinafter simply called components) onto a printed circuit board accurately, the component held by the suction nozzle is imaged with an imaging device such as a CCD camera before it is placed on the board. The component centre and component inclination are then obtained, the pickup deviation is calculated and corrected, and the component is mounted on the board correctly. When such a mounter recognizes a component, lead terminals exist on each side of the component; as shown in patent document 1, terminal detection is therefore performed on each side that carries lead terminals, and the component centre and inclination are calculated from the detection results. During recognition, terminals that are irrelevant to centering, such as terminals present at the component corners, are not recognized; only the terminals required for centering, such as lead terminals and ball terminals, are detected, which improves the accuracy of the computed component centre and inclination.
Further, when a component has polarity (directionality), it must be mounted on the board with a specific side facing the direction of its polarity. Therefore, the direction of one side of the component is specified as a pickup angle of 0° (up), 90° (right), 180° (down) or 270° (left), the component data are rotated according to the specified pickup angle, and component recognition is performed.
However, because of errors in specifying the pickup angle or errors when the component is actually picked up, the actual component direction sometimes differs from the specified direction (pickup angle). With the existing recognition method, it is only possible to check whether the lead-terminal detection result on each side matches the lead-terminal information specified in the component data. Consequently, when the specified pickup angle and the actual pickup angle differ, mis-mounting or recognition errors may occur.
For example, as shown in Fig. 11(A), suppose components 50 and 51 each have lead terminals on every side and corner terminals at the corners, and that each component has a prescribed upward directionality for one of its sides. When a pickup angle of 0° is specified for component 50, component data are generated from the component direction shown on the left of Fig. 11(B). In this case, since the corner terminals are ignored and only the lead-terminal information is described, the component data become as shown on the right. In this state, when the component is actually picked up, for example when some components have mistakenly been stored in the feeder that supplies them in a direction different from the specified one, the component is picked up in a direction different from the specified direction (for example facing right), as shown on the left of Fig. 11(C). During recognition, the lead-terminal data are judged to match the component data normally, so the component is mounted on the board in that direction; the result differs from the specified pickup angle (direction) and a mis-mounting occurs.
Likewise, when a pickup angle of 0° is specified for component 51, component data are generated from the component direction shown in Fig. 11(D). In this state, if the component is actually picked up in a direction different from the specified one (for example facing right) as shown in Fig. 11(E), then during recognition the lead-terminal data do not match the component data, and a recognition error occurs even though it is the same component.
Summary of the invention
The present invention has been made in view of this problem, and its object is to provide a component recognition method and apparatus that can reliably detect the deviation between the actual pickup angle of a component and the specified pickup angle.
To solve the above problem, the present invention adopts a component recognition method in which the pickup orientation of a component is recognized by processing an image of the picked-up component. In this method, the corners of the component at which corner terminals exist when the direction of one side of the component is at a reference angle are recorded in advance; the component image is processed and the corner terminals are detected; and the corners at which corner terminals were detected are compared with the prerecorded corners at which corner terminals exist, the deviation of the component pickup angle from the reference angle being detected from the comparison result.
The present invention also adopts a component recognition apparatus that recognizes the pickup orientation of a component by processing an image of the picked-up component. The apparatus comprises: a storage unit for recording in advance the corners of the component at which corner terminals exist when the direction of one side of the component is at a reference angle; an imaging device for imaging the component; an image processing device for processing the captured component image; a corner-terminal detecting unit for detecting the corner terminals from the image processing of the component; and a pickup-angle detecting unit for comparing the corners at which corner terminals were detected with the prerecorded corners at which corner terminals exist and detecting, from the comparison result, the deviation of the component pickup angle from the reference angle.
With this configuration, the corner terminals are detected, and the corners at which they are found are compared with the prerecorded corners at which corner terminals exist. Therefore, when a component has been picked up at an angle different from the specified pickup angle, this can be detected reliably, and the component can be prevented from being mounted on the board in the wrong direction.
In addition, in the present invention, when a corner terminal exists, its shape is also recorded in advance, and it is detected at which corner a corner terminal of which shape exists. Therefore, the pickup angle can be detected for a variety of components whose corner-terminal arrangement (presence or absence) or shape is asymmetric.
Brief description of the drawings
Fig. 1 is a block diagram showing the schematic configuration of the component mounting apparatus.
Fig. 2 is a diagram showing the structure of the component imaging device and the image processing apparatus.
Fig. 3 is a flowchart showing the component recognition flow.
Fig. 4 is an explanatory diagram showing the kinds of lead terminals and corner terminals of various components.
Fig. 5 is an explanatory diagram showing how shape types and arrangement patterns are encoded from the corner terminals and their arrangement.
Fig. 6 is an explanatory diagram showing the flow of obtaining the lead-row lines and the corner coordinates.
Fig. 7 is a flowchart explaining the flow of detecting, by diagonal scanning, whether a corner terminal exists on the component.
Fig. 8(A) is a diagram explaining the diagonal scanning method, and Figs. 8(B) and 8(C) are explanatory diagrams of the diagonal scanning method.
Fig. 9 is an explanatory diagram showing the flow of obtaining the contour of a detected corner terminal.
Fig. 10 is an explanatory diagram showing the flow of determining the shape of a corner terminal by template matching.
Figs. 11(A) to (E) are explanatory diagrams explaining the problems that occur when the pickup angle differs from the specified angle.
Symbol description
5: imaging device; 7: image processing apparatus; 7b: image memory; 7d: component data memory; 30: component; 30a, 30b, 30c: corner terminals; C1 to C4: corners.
Embodiments
The present invention will now be described in detail with reference to the embodiments shown in the accompanying drawings.
<Structure of the component mounting apparatus and the image processing apparatus>
Fig. 1 is a block diagram showing the control structure of a component mounting apparatus (component mounter) to which the present invention is applied. The mounting apparatus has a head 3, and the head 3 carries a suction nozzle 3a for picking up a component (electronic component) 20 supplied by a component supply device such as a feeder (not shown).
The head 3 can be driven by an X-axis motor 11 and a Y-axis motor 12 and moved in the X and Y directions; the X-axis motor 11 and Y-axis motor 12 are driven by a controller 10 composed of a CPU. When picking up and mounting a component, the head 3 is driven by a Z-axis motor 13 to move up and down in the Z direction, and the suction nozzle 3a can be rotated about the nozzle axis by a θ-axis motor 14.
The mounting apparatus also includes an imaging device 5 with a lens 5a, such as a CCD camera, for imaging the component 20. After picking up a component, the head 3 moves to the imaging device 5, and the component 20, illuminated by a lighting device 15, is imaged by the imaging device 5.
The captured component image is input to an image processing apparatus 7 that has a CPU 7a, an image memory 7b, an A/D converter 7c and a component data memory 7d. The image processing apparatus 7 captures the image of the set region, processes it with known algorithms, recognizes the pickup orientation of the component, and calculates the positional deviation between the component centre and the pickup position as well as the component inclination. The controller 10 corrects this positional deviation when driving the X-axis motor 11, Y-axis motor 12 and θ-axis motor 14, and the component 20 is mounted on the board (not shown) that has been carried in.
The mounting apparatus is further provided with input devices such as a keyboard 21 and a mouse 22 for entering component data. The generated component data can be stored in a storage device 23 composed of a hard disk, flash memory or the like, or in the component data memory 7d of the image processing apparatus 7. Instead of using the input devices, component data supplied from a host computer (not shown) connected to the mounting apparatus can be stored in advance in the storage device 23 or the component data memory 7d. A monitor (display) 24 is also provided; component data, operation data and the component image captured by the imaging device 5 can be displayed on its screen.
Fig. 2 shows the state in which the component 20 held by the suction nozzle 3a is illuminated by the lighting device 15 and imaged by the imaging device 5. The imaging device 5 images the set region under control from the CPU 7a via the control line 7d. The captured image is input to the image processing apparatus 7 via a signal line 7e and converted into a digital signal by the A/D converter 7c. It is then stored in the image memory 7b, image processing is performed, and the orientation of the component is recognized. The processed image can be displayed on the monitor 24.
<Component recognition flow>
The flow of component recognition, that is, detecting the component direction (pickup angle) and recognizing the component, will now be described with reference to the flowchart of Fig. 3, which is executed under the control of the CPU 7a of the image processing apparatus.
Some rectangular components are provided with corner terminals at the corners in addition to the lead terminals. In the component 20a shown in Fig. 4(A), the shapes of the corner terminals at the four corners are all different (no corner terminal is provided at the lower right corner, but the absence of a corner terminal can be treated as an imaginary "no terminal" shape). In the components 20b, 20c and 20d shown in Fig. 4(B), one of the four corner terminals differs in shape from the others. In the components 20e and 20f shown in Fig. 4(C), the four corner terminals form two pairs of identical shape, but diagonally opposite corner terminals have different shapes. In the components 20g and 20h shown in Fig. 4(D), all four corner terminals have the same shape, and in the components 20i and 20j shown in Fig. 4(E), the four corner terminals form two pairs of identical shape and diagonally opposite corner terminals have the same shape.
Thus, for the components shown in Figs. 4(A), (B) and (C), the pickup angle can be recognized by detecting whether a corner terminal exists at each corner and what shape it has, whereas for the components of Figs. 4(D) and (E) the pickup angle cannot be recognized from the corner-terminal information alone.
In the present invention, therefore, the pickup angle is judged for the components shown in Figs. 4(A), (B) and (C). For this purpose, when component data are generated, the corners at which corner terminals exist are recorded in addition to the component dimensions and the terminal information of each side (number of terminals, terminal size, terminal pitch, etc.). When a corner terminal exists, its shape is also recorded, and at the same time the arrangement and shapes of the corner terminals are patterned and the pattern is specified.
The corner-terminal shapes are classified into the types shown in Fig. 5(A), and each shape is assigned a classification code (0), (1), ..., (7). When the component direction is at the reference angle (for example 0°, facing up), the position of each corner terminal is specified by coordinates with the component centre as the origin, and the shape type of the terminal at that position is specified by its classification code. Then, as shown in Fig. 5(B), the arrangement and shapes of the corner terminals are patterned; each pattern is assigned a classification code (0), (1), ..., (8), and the pattern type of each component is specified by its classification code.
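As an illustration only (the patent does not specify a data format), the corner-terminal portion of the component data described above might be laid out as follows; all field names and values are hypothetical.

corner_terminal_data = {
    # shape classification code per corner, from Fig. 5(A); None = no corner terminal
    "shape_codes": {"upper_left": 1, "upper_right": None,
                    "lower_left": 1, "lower_right": 1},
    # corner-terminal positions with the component centre as the origin,
    # recorded with the component at the reference angle (0 deg, facing up)
    "positions": {"upper_left": (-3.2, 2.1),
                  "lower_left": (-3.2, -2.1),
                  "lower_right": (3.2, -2.1)},
    # arrangement/shape pattern classification code from Fig. 5(B)
    "pattern_code": 4,
    # True for pattern codes (4)-(8), whose pickup angle can be discriminated
    "angle_detectable": True,
}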
At this time, whether a component is one whose pickup angle can be recognized is judged in advance from the shape information of the corner terminals at each corner, and the result is recorded. For example, in Fig. 5(B), for the components with classification codes (0) to (3) shown in the upper row, the corner terminals are axially symmetric and the pickup angle cannot be recognized, whereas the components with classification codes (4) to (8) shown in the lower row are components whose pickup angle can be recognized.
In this way, for each component, data such as the component dimensions, the terminal information of each side, the positions of the corner terminals, information on the absence of corner terminals, the corner-terminal shapes, the pattern classification code of the arrangement and shapes, and whether the pickup angle can be recognized are generated. These component data are generated with the input devices of Fig. 2, such as the keyboard 21 and mouse 22, or with a host computer (not shown), and are stored in the storage device 23. The component data are then loaded into the component data memory 7d of the image processing apparatus 7 and managed by component ID. This is the reception (loading) of the component data in step S1 of Fig. 3.
Next, in step S2, a recognition instruction is obtained. The recognition instruction specifies the component ID, component type and imaging conditions (camera selection, illumination, etc.). Normally the pickup angle must also be specified, but for components whose pickup angle can be recognized this specification may be omitted. When a pickup angle is specified for a component, the state at the time the component data were generated is taken as 0°, and one of the four angles 0°, 90°, 180° and 270° is specified.
When the pickup angle is 0°, the prescribed side of the rectangular component is in the prescribed direction, for example facing up; the orientation at this time is the reference angle at which the component data are generated. The component is picked up with the prescribed side facing up and is mounted on the board in that direction.
When the pickup angle is 90°, the component is picked up with the prescribed side facing right and is mounted on the board in that direction. Likewise, when the pickup angle is 180° or 270°, the component is picked up with the prescribed side facing down or left, respectively, and is mounted on the board in that direction.
When components are supplied at the reference angle (facing up) and cannot be picked up in any other direction, the suction nozzle is rotated by the pickup angle after the component has been picked up at the reference angle, so that the component direction is changed to the pickup angle.
Next, in step S3, component data for recognition (hereinafter called recognition data) are generated. When a pickup angle was specified in step S2, the corresponding recognition data are generated. For example, when a pickup angle of 90° is specified, the component is imaged and recognized with its prescribed side facing right, so the component data rotated 90° clockwise are used as the recognition data. When no pickup angle is specified, the component is assumed to be picked up in the same direction as when the component data were generated (pickup angle 0°), and the recognition data are generated accordingly; the recognition data are then identical to the component data as generated.
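A minimal sketch of this rotation step, assuming the recognition data are held as coordinates relative to the component centre; the rotation sense used here is one possible convention, not necessarily the patent's.

import math

def rotate_recognition_data(points, pickup_angle_deg):
    # Rotate component-data coordinates (recorded at the 0-deg reference angle)
    # by the specified pickup angle so they match the expected image orientation.
    a = math.radians(-pickup_angle_deg)   # negative = clockwise in a y-up frame
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for (x, y) in points]

# e.g. a specified 90-deg pickup angle maps the recorded corner coordinates:
# rotate_recognition_data([(-3.2, 2.1), (3.2, -2.1)], 90)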
Next, in step S4, the lead terminals are detected. As shown in Fig. 6(A), for a rectangular component 30 that has corner terminals 30a, 30b, 30c at the corners and lead terminals 30d on each side (five lead terminals per side; one of them is labelled), the image is scanned from its outer edge toward the component, the outermost points of the component are detected, and from this result a circumscribing window W representing the approximate component position is set.
Next, lead-row windows W1 to W4 (Fig. 6(B)) are set on the respective sides, so that an inspection region is set over the area where terminals exist. For each of the four sides, DOG filtering (Marr and Poggio's Difference-of-Gaussians filtering) is performed inside the lead-row window to detect the edges of the lead terminals, and from these the coordinates of the lead tips (the "+" marks in Fig. 6(C)) are detected. This lead-terminal detection is known and is described, for example, in Japanese Unexamined Patent Application Publication No. H6-223185.
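A rough sketch of the DOG filtering step applied to a one-dimensional intensity profile taken across a lead-row window; the sigma values and threshold are illustrative assumptions, not values from the patent.

import numpy as np
from scipy.ndimage import gaussian_filter1d

def dog_edge_positions(profile, sigma_fine=1.0, sigma_coarse=2.0, threshold=10.0):
    # Difference-of-Gaussians response of the profile; strong responses mark
    # the intensity transitions at the lead-terminal edges.
    p = profile.astype(float)
    dog = gaussian_filter1d(p, sigma_fine) - gaussian_filter1d(p, sigma_coarse)
    return np.flatnonzero(np.abs(dog) >= threshold)   # candidate edge positions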
Next, in step S5, lead-row lines L1 to L4 connecting the lead tips are obtained from the detected lead-tip coordinates (Fig. 6(D)). Their intersections C1 to C4 are regarded as the corners of the component, and their coordinates are obtained. At this time, since the corner terminals 30a, 30b, 30c reduce the accuracy of the lead-row lines, each lead-row line is obtained, for example, by excluding the detected edges at both ends.
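One way to realise this step (a sketch under assumptions, not the patent's exact computation): fit a straight line to the lead-tip coordinates of each side by total least squares and take the intersections of adjacent lines as the corners.

import numpy as np

def fit_line(points):
    # Fit a*x + b*y = c with (a, b) a unit normal, via total least squares (SVD).
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]                      # normal = direction of smallest variance
    return a, b, a * centroid[0] + b * centroid[1]

def intersect(l1, l2):
    # Solve the 2x2 system formed by two lines a*x + b*y = c.
    A = np.array([[l1[0], l1[1]], [l2[0], l2[1]]])
    c = np.array([l1[2], l2[2]])
    return np.linalg.solve(A, c)       # corner coordinate (x, y)

# e.g. corner C1 = intersect(fit_line(top_side_tips), fit_line(left_side_tips))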
<Detecting corner terminals by diagonal scanning>
Next, in step S5, the diagonals of the component are obtained from the corner coordinates obtained above, and by scanning along these diagonals it is detected whether a terminal exists at each corner. This flow, shown in Fig. 7, is executed under the control of the CPU 7a of the image processing apparatus.
First, since the corner coordinates of the corners C1 to C4 of the component image have already been obtained (Fig. 6(D)), the diagonals connecting diagonally opposite corners are obtained (step S21), and the scanning direction and range used for terminal detection are set (step S22). As shown in Fig. 8(A), to improve detection sensitivity, the scanning direction is varied according to the inclination of the diagonal and the corner position. The scanning range is set as follows: as shown in Fig. 8(B), ranges M1 and M2 of one quarter of the corner-to-corner distance are taken before and after the corner coordinate, and the scan start point m1 and end point m2 are set.
After the scanning direction, start point and end point have been set in this way, scanning begins (step S23). As shown in Fig. 8(C), when the diagonal is inclined at less than 45°, the points on the diagonal and the points above and below them (in the y direction) are scanned from the start point to the end point; when the diagonal is steeper than 45°, the points on the diagonal and the points to the left and right of them (in the x direction) are scanned.
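A simplified sketch of the diagonal scan; the sampling density, neighbour offsets and threshold handling are assumptions, whereas the patent varies them with the diagonal's inclination.

import numpy as np

def scan_diagonal(img, m1, m2, threshold, steps=200):
    # Sample from scan start m1 to scan end m2 (both (x, y), set a quarter of the
    # corner distance before/after the corner as in Fig. 8(B)).
    x1, y1 = m1
    x2, y2 = m2
    steep = abs(y2 - y1) > abs(x2 - x1)
    for t in np.linspace(0.0, 1.0, steps):
        x = int(round(x1 + t * (x2 - x1)))
        y = int(round(y1 + t * (y2 - y1)))
        # check the diagonal point and its two neighbours: left/right for a steep
        # diagonal, above/below for a shallow one (Fig. 8(C))
        offs = [(-1, 0), (0, 0), (1, 0)] if steep else [(0, -1), (0, 0), (0, 1)]
        for dx, dy in offs:
            yy, xx = y + dy, x + dx
            if 0 <= yy < img.shape[0] and 0 <= xx < img.shape[1] and img[yy, xx] >= threshold:
                return (xx, yy)        # first point at/above the corner-terminal threshold
    return None                        # no corner terminal found on this scan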
In step S24, the CPU 7a, serving as the corner-terminal detecting unit, judges from the scan result whether a corner terminal is present. When a point with a value at or above the predetermined threshold used for corner terminals is detected, the diagonal scan ends and the flow proceeds to step S25, where the terminal contour is obtained. Otherwise, if no point at or above the threshold is found even when the scan end point is reached, it is judged that there is no corner terminal.
When a terminal is detected in step S24, contour detection starts from the detected point (step S25). Fig. 9 shows how this contour is obtained. In Fig. 9(A), Q1 is the point detected by the diagonal scan, and the eight neighbouring points around this detected point Q1 are examined in numerical order. Since the point Q2 is at or above the threshold and the contour can be detected there, it is taken as the next point. Then, as shown in Fig. 9(B), the eight neighbouring points around Q2 are examined, and the same processing is repeated until the trace returns to the initial starting point. The order in which the eight neighbours are examined is determined by the direction of the diagonal scan.
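A simplified 8-neighbour boundary-following sketch of this contour step; the real implementation chooses the neighbour-visiting order from the scan direction, which is omitted here, and the starting direction is arbitrary.

def trace_contour(img, start, threshold, max_steps=10000):
    # 8 neighbours listed in a fixed circular order (dx, dy)
    nbrs = [(1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1)]
    def on(p):
        x, y = p
        return 0 <= y < img.shape[0] and 0 <= x < img.shape[1] and img[y, x] >= threshold
    contour = [start]
    cur, d = start, 0                  # d = index of the last move direction
    for _ in range(max_steps):
        nxt = None
        for k in range(8):
            nd = (d + 6 + k) % 8       # start the sweep "behind" the last move
            cand = (cur[0] + nbrs[nd][0], cur[1] + nbrs[nd][1])
            if on(cand):
                nxt, d = cand, nd
                break
        if nxt is None or nxt == start:
            break                      # isolated point, or the trace closed on itself
        contour.append(nxt)
        cur = nxt
    return contour                     # boundary points in the order they were traced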
The points detected in this way are shown in Fig. 9(C), and their information is stored in an array in the order of detection. Then, in steps S26 and S27, the edge coordinates in the x direction and the edge coordinates in the y direction are obtained from this point array (Figs. 9(D), (E)). In step S28, the centroid Xg, Yg of the detected terminal is obtained from these edge coordinates (Figs. 9(F), (G)). The centroid is obtained from the following formulas.
Lsum(y) = Edge2(y) - Edge1(y) + 1
Ladr_sum(y) = Lsum(y) × (Edge1(y) + Edge2(y)) / 2
SUM = Σ Lsum(y)
Ymom = Σ (Lsum(y) × y)
Xmom = Σ Ladr_sum(y)
Xg = Xmom / SUM
Yg = Ymom / SUM
When the centroid of the detected terminal obtained in this way is close to the diagonal, the detected terminal is regarded as a corner terminal; conversely, when the centroid is far from the diagonal, a lead terminal may have been falsely detected, so it is regarded as no corner terminal (step S29).
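The centroid formulas above and the diagonal-distance test of step S29 can be sketched as follows; the tolerance is an assumed parameter, and edge1, edge2 and ys are NumPy arrays of equal length.

import numpy as np

def terminal_centroid(edge1, edge2, ys):
    # edge1[i], edge2[i]: left/right x-edge of the blob in row ys[i] (Fig. 9(D)-(G))
    lsum = edge2 - edge1 + 1                     # run length per row
    ladr = lsum * (edge1 + edge2) / 2.0          # sum of x coordinates per row
    total = lsum.sum()
    return ladr.sum() / total, (lsum * ys).sum() / total   # (Xg, Yg)

def is_corner_terminal(centroid, corner_a, corner_b, tol):
    # Perpendicular distance from the centroid to the diagonal through the two corners;
    # far from the diagonal suggests a lead terminal was detected by mistake.
    (px, py), (ax, ay), (bx, by) = centroid, corner_a, corner_b
    dist = abs((bx - ax) * (py - ay) - (by - ay) * (px - ax)) / np.hypot(bx - ax, by - ay)
    return dist <= tol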
In this way, for a corner terminal detected at or above the threshold, the terminal contour is further detected and the terminal centroid is obtained. Since whether the detected terminal really is a corner terminal is also judged, noise on the diagonal and other non-corner-terminal features can be prevented from being falsely detected as corner terminals.
With the diagonal scanning shown in Fig. 8, in the example of the component 30 shown in Fig. 6, scanning the diagonal connecting corners C1 and C3 detects corner terminals 30a and 30c at corners C1 and C3. Scanning the diagonal connecting corners C2 and C4 detects corner terminal 30b at corner C4 and no corner terminal at corner C2. Since it can thus be detected at which corners corner terminals exist, the CPU 7a, serving as the pickup-angle detecting unit, compares the corners at which corner terminals were detected with the prerecorded corners at which corner terminals exist, and from the comparison result the deviation of the component pickup angle from the reference angle can be detected.
For example, for the component 30 of Fig. 6, when the component data were generated at the reference angle (pickup angle 0°), the fact that there is no corner terminal at the upper right corner and that corner terminals exist at the other corners is recorded in the component data memory 7d. If the diagonal scan then detects no corner terminal at corner C2 and corner terminals at the other corners, this matches the component data, so a pickup angle of 0° is detected.
If no corner terminal is detected at corner C3 and corner terminals are detected at the other corners, a pickup angle of 90° is detected. If no corner terminal is detected at corner C4 and corner terminals are detected at the other corners, a pickup angle of 180° is detected, and if no corner terminal is detected at corner C1 and corner terminals are detected at the other corners, a pickup angle of 270° is detected.
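A minimal sketch of this comparison, assuming the presence or absence of a corner terminal at corners C1-C4 is held as a tuple of booleans in a fixed rotational order; which cyclic-shift direction corresponds to which rotation sense depends on how the corners are numbered in the image, so the mapping below is only illustrative.

def detect_pickup_angle(detected, recorded):
    # detected / recorded: e.g. (True, False, True, True) meaning "corner terminal
    # present" at corners (C1, C2, C3, C4)
    for k, angle in zip(range(4), (0, 90, 180, 270)):
        if tuple(recorded[k:]) + tuple(recorded[:k]) == tuple(detected):
            return angle               # deviation of the pickup angle from the reference
    return None                        # no 90-degree rotation of the recorded pattern matches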
Therefore, even when, for example, some components have been stored in the feeder in the wrong direction and a component is picked up in a direction different from the specified pickup angle, the pickup-angle error (component direction error) can be detected at the component recognition stage. In other words, for components whose polarity (mounting direction) must be considered when mounting, mis-mounting can be prevented by correcting the pickup angle.
The above is the stage of step S9 in the processing of Fig. 3. For components whose classification code in Fig. 5(B) is (4) to (8), the pickup angle can be judged when the corner terminals at the two corners of at least one side of the component (their presence or absence, or their shapes) are asymmetric with respect to the perpendicular bisector of that side. For components with classification codes (0) to (3), whose corner terminals are symmetric in presence and shape, the pickup angle cannot be judged, so the determination of the pickup angle is skipped by the judgement in step S6. Furthermore, for components for which obtaining the pickup angle requires judging not only the presence of corner terminals but also their shapes (such as component 20b of Fig. 4), the shapes of the corner terminals are also detected (steps S7, S8).
Fig. 10 shows the shape-detection flow for the corner terminals. First, as shown in Fig. 10(A), the circumscribing lines of the corner terminal are obtained from the contour of the corner terminal 40a of the component 40 obtained in step S25, and the region 41 surrounded by these circumscribing lines is obtained. Since the lead coordinates were obtained in step S4, a rough component inclination θ is obtained from the lead coordinates as shown in Fig. 10(B).
Next, taking the size of the region 41 and the component inclination θ into account, templates 42 and 43 corresponding to the specified corner-terminal shapes are generated (Fig. 10(C)). Then, as shown in Fig. 10(D), windows w1 to w4 containing the corner terminals are set around each corner terminal of the image of the component 40, and template matching is performed with the generated templates 42 and 43. As shown in Fig. 10(E), when template 42 is matched against the corner terminals 40a, 40c and 40d of the component 40, the correlation value becomes high, and when template 43 is matched against corner terminal 40b, the correlation value becomes high; the corner-terminal shape can therefore be determined as the shape assigned to each template.
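A sketch of the template-matching step using normalized cross-correlation; OpenCV is used here purely for illustration (the patent names no library), the rotation handling is simplified, and images and templates are assumed to be single-channel uint8 arrays.

import cv2

def classify_corner_shape(image, window, templates, tilt_deg):
    # window = (x, y, w, h) around one corner terminal; templates = {shape_code: image}
    x, y, w, h = window
    roi = image[y:y + h, x:x + w]
    best_code, best_score = None, -1.0
    for code, tmpl in templates.items():
        # rotate the template by the rough component inclination from the lead coordinates
        centre = (tmpl.shape[1] / 2.0, tmpl.shape[0] / 2.0)
        M = cv2.getRotationMatrix2D(centre, tilt_deg, 1.0)
        t = cv2.warpAffine(tmpl, M, (tmpl.shape[1], tmpl.shape[0]))
        if roi.shape[0] < t.shape[0] or roi.shape[1] < t.shape[1]:
            continue                   # template must fit inside the window
        score = cv2.matchTemplate(roi, t, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_code, best_score = code, score
    return best_code, best_score       # shape code with the highest correlation value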
In this way, the CPU 7a can detect at which corner a corner terminal of which shape exists, so by comparing the result with the corners and terminal shapes specified in the component data, it can obtain how many degrees the component pickup angle is rotated relative to the component data (the deviation from the reference angle) (step S9). When the actual pickup angle differs from the specified pickup angle, the recognition may be terminated with an error as a pickup-angle error. Alternatively, instead of terminating with an error, the recognition may be continued and the component finally mounted after the pickup-angle deviation has been corrected.
Next, in step S10, the recognition data are corrected according to the pickup angle obtained in step S9. For example, when a 90° deviation of the pickup angle is detected, recognition errors (such as in Fig. 11(E)) would occur for some components if the recognition data were not corrected; the recognition data are therefore corrected by also rotating them by 90° according to the actual pickup angle.
Next, a lead inspection is performed using the corrected recognition data (step S11), detecting abnormalities such as abnormal lead length, abnormal lead count, abnormal pitch and lead bending. If this lead inspection is normal, the component centre and inclination are calculated using the detected lead coordinates (step S12).
After the component centre and inclination have been calculated, the pickup position and inclination are corrected and the component is mounted on the board. When a deviation of the pickup angle has been detected as described above, the mounting is performed after this pickup-angle deviation has been corrected. In this way, mis-mounting can be prevented when a deviation of the pickup angle is detected in cases where the component must be mounted on the board in the specified direction.
Although a rectangular component has been described in the above embodiment, the invention can of course also be applied to other polygonal components that have corner terminals at the corners and whose corner-terminal arrangement allows the pickup angle to be detected.
As described above, in the present invention, the corner terminals existing at the corners of a component are detected. By comparing the corners at which corner terminals were detected with the prerecorded corners at which corner terminals exist, the pickup angle of the component can be detected reliably. Therefore, when a component has been picked up at an angle different from the specified pickup angle, this can be detected and mis-mounting can be prevented. At the same time, the specification of the pickup angle, which was previously required at recognition, can be omitted.

Claims (4)

1. A component recognition method in which the pickup orientation of a component is recognized by processing an image of the picked-up component, characterized in that:
the corners of the component at which corner terminals exist when the direction of one side of the component is at a reference angle are recorded in advance;
the component image is processed and the corner terminals are detected; and
the corners at which corner terminals were detected are compared with the prerecorded corners at which corner terminals exist,
the deviation of the component pickup angle from the reference angle being detected from the comparison result.
2. The component recognition method according to claim 1, characterized in that:
when performing the above recording, the shapes of the corner terminals are also recorded;
the component image is processed and it is detected at which corner a corner terminal of which shape exists; and
the deviation of the component pickup angle from the reference angle is detected from this detection result.
3. A component recognition apparatus that recognizes the pickup orientation of a component by processing an image of the picked-up component, characterized by comprising:
a storage unit for recording in advance the corners of the component at which corner terminals exist when the direction of one side of the component is at a reference angle;
an imaging device for imaging the component;
an image processing device for processing the captured component image;
a corner-terminal detecting unit for detecting the corner terminals from the image processing of the component; and
a pickup-angle detecting unit for comparing the corners at which corner terminals were detected with the prerecorded corners at which corner terminals exist, and detecting, from the comparison result, the deviation of the component pickup angle from the reference angle.
4. The component recognition apparatus according to claim 3, characterized in that:
the storage unit also records the shapes of the corner terminals when performing the recording;
the corner-terminal detecting unit detects, from the image processing of the component, at which corner a corner terminal of which shape exists; and
the pickup-angle detecting unit detects the deviation of the component pickup angle from the reference angle from this detection result.
CNB2004100351305A 2003-04-23 2004-04-23 Part recognizing method and device Expired - Fee Related CN1319012C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003117739A JP4106301B2 (en) 2003-04-23 2003-04-23 Component recognition method and apparatus
JP117739/2003 2003-04-23

Publications (2)

Publication Number Publication Date
CN1551036A true CN1551036A (en) 2004-12-01
CN1319012C CN1319012C (en) 2007-05-30

Family

ID=33497494

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2004100351305A Expired - Fee Related CN1319012C (en) 2003-04-23 2004-04-23 Part recognizing method and device

Country Status (2)

Country Link
JP (1) JP4106301B2 (en)
CN (1) CN1319012C (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104976951A (en) * 2014-04-09 2015-10-14 英懋达光电股份有限公司 Device and method for identifying image

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1484020A1 (en) 2003-06-06 2004-12-08 Kontron Medical AG Motorized multiplane transesophageal probe with coupling fluid
JP5730114B2 (en) * 2011-04-25 2015-06-03 富士機械製造株式会社 Component rotation angle detection device, image processing component data creation device, component rotation angle detection method, and image processing component data creation method
JP5939775B2 (en) * 2011-11-30 2016-06-22 キヤノン株式会社 Image processing apparatus, image processing program, robot apparatus, and image processing method
AT513747B1 (en) 2013-02-28 2014-07-15 Mikroelektronik Ges Mit Beschränkter Haftung Ab Assembly process for circuit carriers and circuit carriers
JP6427754B2 (en) * 2014-01-31 2018-11-28 パナソニックIpマネジメント株式会社 Component recognition data inspection system and component recognition data inspection method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09282444A (en) * 1996-04-16 1997-10-31 Matsushita Electric Ind Co Ltd Visual recognition device
US6549683B1 (en) * 2000-05-02 2003-04-15 Institut National D'optique Method and apparatus for evaluating a scale factor and a rotation angle in image processing
JP3709800B2 (en) * 2001-03-14 2005-10-26 株式会社村田製作所 Mounting machine and component mounting method


Also Published As

Publication number Publication date
JP4106301B2 (en) 2008-06-25
JP2004325146A (en) 2004-11-18
CN1319012C (en) 2007-05-30

Similar Documents

Publication Publication Date Title
US4450579A (en) Recognition method and apparatus
CN1317544C (en) Apparance detector and image acquisition method
JP2009250971A (en) System and method for inspecting electronic device
US7529410B2 (en) Local localization using fast image match
CN1269121A (en) Method and device for mounting parts
CN1929106A (en) Method for controlling parallelism between probe card and mounting table, inspection program, and inspection apparatus
CN1854681A (en) Measuring device and method
CN1932726A (en) Digital image sensor locator based on CMOS and locating method
CN1551036A (en) Part recognizing method and device
CN1285903C (en) Figure detecting method and figure detecting device
CN1194601C (en) Element and device mounting method
US20210048473A1 (en) Injection device, micro light emitting diode inspection and repairing equipment and inspection and repairing method
CN1672481A (en) Apparatus and method for insepecting cream solder printed on a substrate
CN1610499A (en) Parts data generating device and electronic parts mounting apparatus with the same device
KR20110072771A (en) Method for inspecting edge of plate
US10069042B2 (en) Light-emitting components containing body, manufacturing method of light-emitting components containing body, components mounting apparatus, components mounting method, and components mounting system
JP2013243168A (en) Component mounting apparatus and part recognition method
CN1108739C (en) Mounting electronic component method and apparatus
JP4704218B2 (en) Component recognition method, apparatus and surface mounter
JP4884032B2 (en) Nozzle type recognition control method and component mounting apparatus
JP5412076B2 (en) Electronic component mounting device
CN1821968A (en) Memory address monitoring device and memory address monitoring method
CN1487265A (en) Position detecting method, position detecting apparatus and method for positioning printed circuit board
CN1834836A (en) Method for shorten base board identify time and part carrying device use this method
KR100910771B1 (en) Chip mounter

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070530

CF01 Termination of patent right due to non-payment of annual fee