CN105049700A - Image data processing apparatus for parts feeder and parts feeder - Google Patents
- Publication number
- CN105049700A (application CN201510173788.0A)
- Authority
- CN
- China
- Prior art keywords
- workpiece
- imaging element
- image data
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G47/00—Article or material-handling devices associated with conveyors; Methods employing such devices
- B65G47/02—Devices for feeding articles or materials to conveyors
- B65G47/04—Devices for feeding articles or materials to conveyors for feeding articles
- B65G47/12—Devices for feeding articles or materials to conveyors for feeding articles from disorderly-arranged article piles or from loose assemblages of articles
- B65G47/14—Devices for feeding articles or materials to conveyors for feeding articles from disorderly-arranged article piles or from loose assemblages of articles arranging or orientating the articles by mechanical or pneumatic means during feeding
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G43/00—Control devices, e.g. for safety, warning or fault-correcting
- B65G43/08—Control devices operated by article or material being fed, conveyed or discharged
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Feeding Of Articles To Conveyors (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
- Sorting Of Articles (AREA)
- Closed-Circuit Television Systems (AREA)
- Studio Devices (AREA)
Abstract
The present invention relates to an image data processing apparatus for a parts feeder, and to a parts feeder. The image data processing apparatus is applied to a parts feeder provided with a camera that photographs workpieces conveyed along a transport path. The camera is an area-scan camera having a plurality of imaging elements arrayed in the conveying direction of the workpieces and in the direction orthogonal to the conveying direction, and acquires image data through these imaging elements. The image data processing apparatus comprises: a setting unit configured to set only a part of the imaging elements of the camera, forming a row orthogonal to the conveying direction, as usable for shooting; an image take-in unit configured to take in the acquired image data from the area-scan camera immediately whenever shooting is performed with only that part of the imaging elements; and a posture discrimination unit, serving as a workpiece quality discrimination unit, configured to perform discrimination processing of the workpieces W using the image data taken in by the image take-in unit.
Description
Technical field
The present invention relates to a technique for discriminating the appearance or posture of a workpiece on the basis of image data obtained by only a part of the imaging elements of an area-scan camera, and more particularly to an image data processing apparatus for a parts feeder, and a parts feeder, in which the position of that part of the imaging elements can be set simply and correctly and the transfer rate of the image data acquired by the area-scan camera can be increased.
Background art
A parts feeder that conveys workpieces, such as electronic components, along a transport path to a prescribed supply destination has been known (for example, Patent Document 1). The parts feeder disclosed in Patent Document 1 is configured to discriminate the posture of each workpiece from image data obtained by photographing the workpiece, and to exclude workpieces of inappropriate (incorrect) posture from the transport path by means of an exclusion unit.
The principle of a parts feeder 200 that performs posture discrimination processing on the basis of image data, as in Patent Document 1, is explained with reference to Fig. 8. A workpiece W arriving at an imaging position P1 is photographed by an area-scan camera 202 serving as the imaging unit. The obtained image data is taken into a control device 204 via an image take-in unit 204a and then subjected to preprocessing such as binarization by a preprocessing unit 204b. Thereafter, a posture discrimination unit 204c discriminates the posture of the workpiece W from the preprocessed image data, and an exclusion unit 5 excludes workpieces W of inappropriate posture according to the discrimination result. The many imaging elements of the area-scan camera 202 are arranged in a grid, so that two-dimensional image data with a large pixel count is obtained. Conventionally, the area-scan camera 202 is configured to shoot upon input of an external trigger, which is issued when a laser sensor 203 detects that a workpiece W has reached a prescribed position on the transport path 10.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2013-39981
Summary of the invention
Problems to be solved by the invention
In the parts feeder 200 of the above structure, as shown in Fig. 9, when the area-scan camera 202 shoots at time t11, the image take-in unit 204a starts taking in the image data at time t12, and the preprocessing unit 204b starts preprocessing such as binarization of the image data at time t13. The typical procedure is then as follows: when the preprocessing ends, the posture discrimination unit 204c discriminates the posture of the workpiece at time t14 on the basis of the preprocessed image data.
However, because the parts feeder 200 takes in data with a large pixel count, obtained with substantially all of the imaging elements of the area-scan camera 202, the take-in time (transfer time) becomes long, and so does the time from shooting to posture discrimination for one workpiece. Since the posture of a workpiece W photographed at the imaging position P1 must be discriminated before the workpiece reaches the exclusion position P2 where exclusion processing is performed, a long posture discrimination time forces the conveying speed of the workpieces W to be limited, which makes high-speed conveyance difficult and lowers processing efficiency. Shortening the preprocessing and posture discrimination times by raising the performance of the control device 204 (CPU) has also been considered; however, as shown in Fig. 9, the take-in time is far larger than the time required for preprocessing and posture discrimination, so improving the performance of the control device 204 does not shorten the total time sufficiently.
To solve this problem, replacing the area-scan camera 202 with a line-scan camera has been considered. A line-scan camera shoots with a single row of imaging elements; because its shooting range is narrow, the obtained image data has few pixels and the transfer time is expected to be short. However, a line-scan camera performs one-dimensional, line-shaped shooting, so it is difficult to judge which part of the scene the obtained image data corresponds to. To mount the camera so that it photographs the appropriate position, shooting with a separate ultra-high-speed camera or positioning with a dummy workpiece is required, which takes labor and time.
An object of the present invention is to solve these problems effectively by providing an image processing apparatus for a parts feeder, and a parts feeder, in which the imaging elements of the camera can be set at an appropriate position correctly and easily, and in which the transfer rate of the image data acquired by the camera can be raised so that the workpieces can be conveyed at high speed.
Means for solving the problems
In view of the above problems, the present invention adopts the following means.
That is, an image processing apparatus for a parts feeder according to the present invention is applied to a parts feeder provided with a camera that photographs workpieces conveyed along a transport path, and is characterized in that: an area-scan camera is adopted as the camera, the area-scan camera having a plurality of imaging elements arrayed in the conveying direction of the workpieces and in the direction orthogonal to the conveying direction and acquiring image data through these imaging elements; and the image processing apparatus comprises a setting unit that sets, as usable for shooting, only a part of the imaging elements of the area-scan camera forming a row orthogonal to the conveying direction, an image take-in unit that takes in the acquired image data from the area-scan camera immediately when shooting is performed with only that part of the imaging elements, and a workpiece quality discrimination unit that performs workpiece quality discrimination processing on the basis of the image data taken in by the image take-in unit.
Here, discrimination of the quality of a workpiece means discriminating whether the appearance or posture of the workpiece is the prescribed appearance or posture.
Because the setting unit restricts shooting to only a part of the imaging elements, the pixel count of the image data acquired by the area-scan camera in one shot is reduced, and the take-in speed (transfer rate) of the image take-in unit can be raised; the time from shooting to quality discrimination for one workpiece is therefore shortened, and the workpieces can be conveyed at high speed. On the other hand, by using substantially all of the imaging elements of the area-scan camera, a wider area than with a line-scan camera can be made to appear in the image data, and, taking components or the like appearing in this image data as a reference, the above-described part of the imaging elements can be set at an appropriate position simply and correctly. Note that "substantially all of the imaging elements" includes the case where all of the imaging elements are used.
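For illustration only (not part of the claimed apparatus), the following Python sketch models the effect of the setting unit: restricting readout to one sensor row cuts the per-shot data volume from a full frame to a single line. The sensor dimensions and the selected row index are assumed values.

```python
import numpy as np

# Hypothetical sensor geometry: 1088 rows x 2048 columns of imaging elements.
ROWS, COLS = 1088, 2048

def area_scan_frame(rng):
    """Full-frame readout (area-scan mode), used only when setting up the camera."""
    return rng.integers(0, 256, size=(ROWS, COLS), dtype=np.uint8)

def line_scan_readout(frame, selected_row):
    """Readout restricted to the single row chosen by the setting unit."""
    return frame[selected_row, :]

rng = np.random.default_rng(0)
frame = area_scan_frame(rng)
line = line_scan_readout(frame, selected_row=544)
print(frame.size, "pixels per full shot vs", line.size, "pixels per line shot")
# ~2.2 million pixels vs 2048 pixels: the transfer per shot shrinks accordingly.
```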
As a concrete configuration, the parts feeder includes a workpiece processing unit that applies a force, from a force-applying portion, to a workpiece that has reached a workpiece processing position set on the transport path, thereby excluding that workpiece from the transport path or correcting its posture on the transport path. The image processing apparatus for the parts feeder is configured to operate the workpiece processing unit according to the discrimination result of the workpiece quality discrimination unit; when substantially all of the imaging elements are used for shooting, the shooting range of the area-scan camera is set to a position that includes the force-applying portion, and the setting unit selects and sets the position of the above-described part of the imaging elements on the image data in which this shooting range appears.
With this configuration, the position of the above-described part of the imaging elements can be selected by the setting unit, with the force-applying portion as a reference, while observing the image data in which the force-applying portion appears, so the time required for positioning that part of the imaging elements can be shortened considerably.
In another concrete configuration, the parts feeder includes a workpiece processing unit that applies a force to a workpiece that has reached a workpiece processing position set on the transport path, thereby excluding that workpiece from the transport path or correcting its posture on the transport path; the setting unit sets, from among the plurality of imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position further downstream in the conveying direction than the first imaging element group; the workpiece quality discrimination unit performs quality discrimination processing on the basis of the image data acquired by the first imaging element group and also performs quality discrimination processing on the basis of the image data acquired by the second imaging element group; and the image processing apparatus operates the workpiece processing unit according to the discrimination results of the workpiece quality discrimination unit.
With this configuration, after a first quality discrimination is performed on the basis of the image data acquired by the first imaging element group, a second quality discrimination can be performed on the basis of the image data acquired by the second imaging element group. By operating the workpiece processing unit according to these discrimination results, only workpieces of appropriate appearance and posture can be delivered to the conveying destination more reliably than when quality discrimination is performed only once.
In yet another concrete configuration, the parts feeder includes a workpiece processing unit that applies a force to a workpiece that has reached a workpiece processing position set on the transport path, thereby excluding that workpiece from the transport path or correcting its posture on the transport path; the workpieces are workpieces on which a prescribed feature point is formed on part of a specific face; the setting unit sets, from among the plurality of imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position further downstream in the conveying direction than the first imaging element group; the apparatus is adjusted so that, when the front end or rear end of a workpiece in the conveying direction is within the shooting range of the second imaging element group, the feature point formed on that workpiece appears within the shooting range of the first imaging element group; the apparatus further comprises a preprocessing unit capable of detecting the front end or rear end of the workpiece in the conveying direction, and the feature point, from the image data taken in by the image take-in unit; and, when the front end or rear end of a workpiece in the conveying direction is detected from the image data acquired by the second imaging element group, the apparatus detects the feature point from the image data of the first imaging element group acquired at the same time as that image data and operates the workpiece processing unit for any workpiece for which the feature point is not detected.
With this configuration, the second imaging element group can function as a synchronization sensor for detecting the front end or rear end of the workpiece in the conveying direction; detection of the feature point is performed when the front end or rear end is detected, and if the feature point is detected the appearance or posture of that workpiece is judged to be the prescribed appearance or posture, whereas if it is not detected the appearance or posture is judged not to be the prescribed one and the workpiece processing unit is operated. Quality discrimination of workpieces having a prescribed feature point formed on part of a specific face can therefore be performed easily and correctly within a short processing time, and inappropriate workpieces can be excluded reliably. Moreover, by using substantially all of the imaging elements of the area-scan camera, the positions of the first and second imaging element groups can be set at appropriate positions simply and correctly.
In particular, in the above concrete configurations, it is preferable that the part of the imaging elements set by the setting unit shoots continuously, that a preprocessing unit is further provided which can immediately judge, from the image data taken in by the image take-in unit, whether a workpiece appears, and that the workpiece quality discrimination unit performs workpiece quality discrimination processing on the basis of the image data judged by the preprocessing unit to show a workpiece.
Because the above-described part of the imaging elements shoots continuously in this way, every workpiece conveyed past it can be photographed. In addition, since quality discrimination processing is performed on the basis of image data judged to show a workpiece, there is no need to perform quality discrimination on image data in which no workpiece appears, and useless processing is avoided. Consequently, no additional sensor is needed for grasping the position of the workpieces; increases in cost and processing are suppressed, and quality discrimination processing can be performed reliably on every workpiece that is conveyed.
A parts feeder according to the present invention uses the above image processing apparatus for a parts feeder, and is characterized by comprising: a feeder main body having a transport path for conveying workpieces; an area-scan camera having a plurality of imaging elements arrayed in the conveying direction of the workpieces and in the direction orthogonal to the conveying direction, which photographs the workpieces conveyed along the transport path to acquire image data; a workpiece processing unit that excludes from the transport path a workpiece passing a workpiece processing position set on the transport path, or corrects its posture on the transport path; and an instruction output unit that, when the workpiece quality discrimination unit judges that a workpiece does not have the prescribed appearance or posture, outputs an instruction for operating the workpiece processing unit.
Accordingly, the time from shooting to quality discrimination processing for one workpiece can be shortened and the workpieces can be conveyed at high speed, and, by using substantially all of the imaging elements of the area-scan camera, the above-described part of the imaging elements can be set at an appropriate position simply and correctly.
Effects of the invention
As described above, the present invention provides an image processing apparatus for a parts feeder, and a parts feeder, in which the above-described part of the imaging elements can be set at a position simply and correctly by using substantially all of the imaging elements of the area-scan camera, and in which, by using only that part of the imaging elements for shooting, the take-in speed of the image take-in unit can be raised and the workpieces can be conveyed at high speed.
Brief description of the drawings
Fig. 1 is a side view of a parts feeder according to a first embodiment of the present invention.
Fig. 2 is a plan sectional view of a part of the parts feeder.
Fig. 3 is a side view of a part of the parts feeder.
Fig. 4 is a timing chart for explaining the operation of the parts feeder.
Fig. 5 is a plan sectional view of a part of a parts feeder according to a second embodiment of the present invention.
Fig. 6 is a plan sectional view of a part of a parts feeder according to a third embodiment of the present invention.
Fig. 7 is a flowchart showing the processing of that parts feeder.
Fig. 8 is a side view of a parts feeder based on a conventional structure.
Fig. 9 is a timing chart for explaining the operation of the parts feeder shown in Fig. 8.
Description of reference numerals
2: area-scan camera; 5: workpiece processing unit (exclusion unit); 8: image processing apparatus for parts feeder; 10: transport path; 30: setting unit; 31: image take-in unit; 32: preprocessing unit; 33: workpiece quality discrimination unit (posture discrimination unit); 34: instruction output unit; 50: force-applying portion (air jet nozzle); 100: parts feeder; E_L1: shooting range of the first imaging element group; E_L2: shooting range of the second imaging element group; Wa: front end of workpiece; W: workpiece; Wm: feature point; W_U: specific face (upper surface); P2: workpiece processing position (exclusion position).
Embodiment
<First Embodiment>
A first embodiment of the present invention will be described below with reference to the drawings.
As shown in Fig. 1, a parts feeder 100 according to one embodiment of the present invention includes a transport path 10 along a feeder main body 1 for conveying a plurality of workpieces W, as the conveyed objects, toward a supply destination (not shown).
The feeder main body 1 comprises the transport path 10 and a drive unit 11; the drive unit 11 vibrates the transport path 10, whereby the plurality of workpieces W on the transport path 10 are conveyed.
An area-scan camera 2 is provided above the transport path 10. The area-scan camera 2 has a plurality of highly sensitive imaging elements (CMOS sensors: Complementary Metal Oxide Semiconductor) arrayed in the conveying direction of the workpieces W (the extending direction of the transport path 10) and in the direction orthogonal to the conveying direction, and photographs the workpieces W conveyed on the transport path 10. The area-scan camera 2 can be switched between an area-scan mode, in which all of its imaging elements are used for shooting, and a line-scan mode, in which only a part of the imaging elements (one row in the present embodiment) arranged orthogonally to the conveying direction is used for shooting; the imaging element group used as that part of the imaging elements in line-scan mode is set by a setting unit 30 of a control device 3. The shooting range (shooting line) E_L of the area-scan camera 2 in line-scan mode corresponds to the imaging position (imaging point) P1 shown in Fig. 2, and captures a part of the workpiece W in the conveying direction over the entire width in the direction orthogonal to the conveying direction.
In the present embodiment, the area-scan camera 2 is set to area-scan mode when its mounting position is checked and adjusted, and to line-scan mode when posture discrimination of the conveyed workpieces W is performed. The position of the imaging element group used for shooting in line-scan mode is important for obtaining the correct timing at which workpieces W of inappropriate posture are excluded, and it is set to an appropriate position as follows. First, the shooting range (imaging region) E_E of the area-scan camera 2 in area-scan mode is set to a position that includes the air jet nozzle 50 of a workpiece processing unit 5 described later. The distance L (see Fig. 2) between the imaging position (imaging point) P1 and the exclusion position P2, at which compressed air is jetted from the air jet nozzle 50, is normally calculated from the length of the workpiece W, the time required for image processing in the control device 3, the conveying speed of the workpiece W, and the like, as expressed in formula (2) described later; the setting unit 30 sets the position of the imaging element group on the basis of the image data acquired by the area-scan camera 2. The conveying speed of the workpiece W at this point is taken as a set value. Thus, in the present embodiment the position of the imaging element group is selectable from among the plurality of imaging elements, but a structure in which the position of the imaging element group is fixed is also possible.
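For illustration only, the following sketch shows one way the row selection could be derived on the area-scan image, assuming a known image resolution (pixels per metre) and the pixel row of the exclusion position P2 in that image; these quantities and the helper itself are assumptions for the example, not the embodiment's procedure.

```python
def select_row_for_line_scan(exclusion_row_px: int,
                             pixels_per_metre: float,
                             distance_L_m: float,
                             sensor_rows: int) -> int:
    """Pick the sensor row (shooting line E_L) located the distance L
    upstream of the exclusion position P2 seen in the area-scan image."""
    offset_px = round(distance_L_m * pixels_per_metre)
    row = exclusion_row_px - offset_px        # upstream = smaller row index (assumed)
    if not 0 <= row < sensor_rows:
        raise ValueError("distance L does not fit in the camera's field of view")
    return row

# Illustrative: P2 appears at sensor row 900, 40 px/mm, L = 20 mm, 1088 rows.
print(select_row_for_line_scan(900, 40_000.0, 0.020, 1088))   # -> row 100
```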
The image data acquired by the area-scan camera 2 in line-scan mode has fewer pixels, and hence a smaller data volume, than the image data acquired in area-scan mode, and can therefore be taken into the control device 3 immediately by an image take-in unit 31. In line-scan mode the area-scan camera 2 operates so as to shoot continuously at fixed intervals from before a workpiece W reaches the imaging position P1, shooting repeatedly while the workpiece W being conveyed downstream passes the imaging position P1, and thereby obtains a plurality of image data items each showing a different position of the workpiece W, covering the whole workpiece from its front end Wa to its rear end Wb. The acquired image data is sent to the control device (controller) 3, described later, every time one shot is taken.
The control device 3 shown in Fig. 1 is composed of an ordinary microcomputer unit having a CPU, memory, interfaces and the like (not shown); appropriate programs are stored in the memory, and the CPU reads these programs sequentially and, in cooperation with the peripheral hardware resources, functions as a setting unit 30, an image take-in unit 31, a preprocessing unit 32, a posture discrimination unit 33, a speed computing unit 35, an instruction output unit 34 and a timing control unit 36.
The image take-in unit 31 takes the image data acquired by the area-scan camera 2 into the control device 3, taking in the image data immediately every time the area-scan camera 2 shoots in line-scan mode. The preprocessing unit 32 has a binarization processing portion 32a, an end detection portion 32b and a composite image data generating portion 32c. Each time image data is taken in via the image take-in unit 31, the binarization processing portion 32a immediately performs prescribed preprocessing, such as binarization, on that image data. The end detection portion 32b discriminates the front end Wa and the rear end Wb of a workpiece W from the image data by suitable image processing. For example, the tone differs between the portion of the image data in which the workpiece W appears and the portion in which something other than the workpiece W (specifically, the transport path 10) appears; therefore, in image data that captures the front end Wa or rear end Wb of a workpiece W, a portion of different color depth appears across the whole direction orthogonal to the conveying direction of the workpiece W. The end detection portion 32b detects the front end Wa and rear end Wb of the workpiece W appearing in the image data (image discrimination) from this difference in color depth (brightness) or the like. Alternatively, the end detection portion 32b may be configured to detect the front end Wa and rear end Wb by discriminating, from the image data, the rounded (R) shape at the corners of the workpiece W. The composite image data generating portion 32c then joins the image data, from the item in which the front end Wa of the workpiece W appears to the item in which the rear end Wb of that workpiece W appears, in shooting order, and generates composite image data as two-dimensional image data showing substantially the whole of one workpiece W.
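For illustration only, a minimal Python sketch of this preprocessing flow is given below, assuming a dark workpiece on a bright transport path; the binarization threshold, the edge criterion and the array shapes are assumptions, not values from the embodiment.

```python
import numpy as np

THRESH = 128          # assumed binarization threshold
EDGE_FRACTION = 0.9   # assumed: a line on the workpiece darkens most of the line width

def binarize(line: np.ndarray) -> np.ndarray:
    """Binarization preprocessing of one line image (portion 32a)."""
    return (line >= THRESH).astype(np.uint8)

def looks_like_workpiece(binary_line: np.ndarray) -> bool:
    """A line is treated as showing the workpiece when most pixels differ
    from the bright transport-path background (end detection portion 32b)."""
    return (binary_line == 0).mean() >= EDGE_FRACTION

def build_composite(lines):
    """Join the lines from front end Wa to rear end Wb in shooting order
    into composite image data (portion 32c). Returns None until complete."""
    collected, inside = [], False
    for line in lines:
        if looks_like_workpiece(binarize(line)):
            inside = True
            collected.append(line)
        elif inside:                    # the rear end Wb has just passed
            return np.vstack(collected)
    return None
```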
The posture discrimination unit 33, which serves as the workpiece quality discrimination unit, performs posture discrimination processing on the basis of this composite image data, i.e. quality discrimination processing in which the posture of the workpiece W is discriminated (image discrimination), for example by pattern matching. Inappropriate postures include, for example, a workpiece whose front and back surfaces are inverted or whose front-rear orientation is reversed. The image take-in unit 31, the preprocessing unit 32 and the posture discrimination unit 33 thus constitute an image processing apparatus 8 for a parts feeder according to the present invention, which discriminates the posture of the workpieces W.
The speed computing unit 35 performs a speed computation that calculates the conveying speed of the workpiece W using the composite image data used for posture discrimination in this way; specifically, the conveying speed Vw (m/s) of the workpiece W is calculated by the following formula (1).
Vw = Lw1/(S·A) ··· (1)
Here, S is the scan rate of the area-scan camera 2, i.e. its shooting interval (sec); A is the number of shots (times) required for the area-scan camera 2 to photograph substantially the whole of a single workpiece W from its front end Wa to its rear end Wb; and Lw1 is the length (m) of the workpiece W in the conveying direction (see Fig. 3). The speed computing unit 35 regards the product of the shooting interval S of the area-scan camera 2 and the number of shots A as the required shooting time, i.e. the time needed for the workpiece W to pass the imaging position P1, and calculates the conveying speed Vw of the workpiece W from this required shooting time and the conveying-direction length Lw1 of the workpiece W. The conveying-direction length Lw1 of the actual workpiece W is set in advance. The conveying-direction length Lw1 of the workpiece W and the shooting interval S of the area-scan camera 2 are entered via an input unit 41. The speed computing unit 35 also has a shot count acquisition portion 35a, which calculates the number of shots A from the pixel count of the image data obtained by one shot and the pixel count of the composite image data.
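A minimal sketch of formula (1) follows; the pixel counts, shooting interval and workpiece length used are purely illustrative.

```python
def shot_count(composite_pixels: int, pixels_per_shot: int) -> int:
    """Number of shots A needed to cover the workpiece (portion 35a)."""
    return composite_pixels // pixels_per_shot

def conveying_speed(lw1_m: float, shooting_interval_s: float, shots: int) -> float:
    """Formula (1): Vw = Lw1 / (S * A)."""
    return lw1_m / (shooting_interval_s * shots)

# Illustrative numbers: a 6 mm workpiece covered by 60 line shots taken every 1 ms.
A = shot_count(composite_pixels=60 * 2048, pixels_per_shot=2048)
Vw = conveying_speed(lw1_m=0.006, shooting_interval_s=1e-3, shots=A)
print(A, "shots, Vw =", Vw, "m/s")   # 60 shots, Vw = 0.1 m/s
```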
The conveying speed Vw of the workpiece W calculated in this way is used for the timing control, described below, of excluding defective workpieces W, and is also displayed on the display unit 40 shown in Fig. 1. The calculated conveying speed Vw may further be used as material for judging whether the workpieces W are being conveyed or are at a standstill.
When the posture discrimination unit 33 judges that the posture is inappropriate (defective), the instruction output unit 34 outputs, to the exclusion unit 5 serving as the workpiece processing unit shown in Fig. 1, an instruction for performing exclusion processing (an exclusion operation) in which the workpiece W located at the exclusion position P2, the workpiece processing position set on the transport path 10, is excluded from the transport path 10. The exclusion unit 5 has an air jet nozzle 50, as the force-applying portion, that jets compressed air toward the exclusion position P2 set downstream of the imaging position P1 in the conveying direction by at least the conveying-direction length Lw1 of the workpiece W (see Fig. 3), and excludes the workpiece W from the transport path 10 by the force applied to the workpiece W by the compressed air jetted from the air jet nozzle 50. The air jet nozzle 50 is formed, for example, by a hole provided in a side wall 10a of the transport path 10, and jets compressed air in response to the electrical instruction input as the above instruction. A target position Pw (see Fig. 3) at which this force is to act on the workpiece W is set in advance; in the present embodiment, the conveying-direction center of the side of the workpiece W facing the air jet nozzle 50 is set as the target position Pw. By making the force act on this target position Pw, horizontal rotation of the workpiece W to be excluded can be suppressed as it is removed from the transport path 10. Exclusion processing in the present invention also includes processing in which the workpiece W is dropped from the transport path 10 into a workpiece receiving part or the like located below the transport path 10, processing in which the workpiece W is distributed to one of the transport paths branching from the exclusion position P2, and the like.
The timing control unit 36 controls, according to the conveying speed Vw of the workpiece W calculated by the speed computing unit 35, the timing at which the instruction output unit 34 outputs the electrical instruction to the air jet nozzle 50. Specifically, a waiting time tα (sec) from the moment the posture discrimination unit 33 judges the workpiece to be defective until the instruction output unit 34 is made to output the electrical instruction (see Fig. 4) is calculated by the following formula (2), and the timing at which the instruction output unit 34 outputs the electrical instruction to the air jet nozzle 50 is controlled according to this waiting time tα; the force can thus be made to act on the target position Pw even when the conveying speed Vw of the workpiece W deviates from the set value.
tα = {(L - Lw2)/Vw} - tp - td ··· (2)
Here, Vw is the conveying speed (m/s) of the workpiece W conveyed on the transport path 10 (see Fig. 3); L is the distance (m) from the shooting range E_L of the imaging element group to the exclusion position P2 (see Fig. 3); Lw2 is the distance (m) from the rear end Wb of the workpiece W to the target position Pw (see Fig. 3); and tp is the image processing time (sec) from the take-in by the image take-in unit 31 until the posture discrimination by the posture discrimination unit 33 is completed (see Fig. 4). When the times required for preprocessing, posture discrimination processing and speed computation are configured to be always constant, the image processing time tp is a fixed value or a set value. When, on the other hand, the image processing time tp is configured to change with the increase or decrease in the pixel count of the composite image data caused by changes in the conveying speed Vw, the image processing time tp is measured in the control device 3. td is the mechanical elapsed time (sec) from when the exclusion unit 5 receives the electrical instruction until the force acts on the workpiece W to be excluded (see Fig. 4), and is a parameter set for each exclusion unit 5. The distance L, the elapsed time td and so on are entered via the input unit 41.
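A minimal sketch of formula (2) follows, again with illustrative values; it simply converts the remaining travel distance into time at speed Vw and subtracts the processing time tp and the mechanical delay td.

```python
def waiting_time(L_m, lw2_m, vw_m_per_s, tp_s, td_s):
    """Formula (2): t_alpha = (L - Lw2)/Vw - tp - td (seconds)."""
    t_alpha = (L_m - lw2_m) / vw_m_per_s - tp_s - td_s
    if t_alpha < 0:
        # The exclusion position is too close (or processing too slow) for this speed.
        raise ValueError("cannot eject in time: increase L or reduce tp/td")
    return t_alpha

# Illustrative: L = 30 mm, Lw2 = 3 mm, Vw = 0.1 m/s, tp = 50 ms, td = 20 ms.
print(waiting_time(0.030, 0.003, 0.1, 0.050, 0.020))   # ~0.2 s until the air jet fires
```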
The operation of the parts feeder 100 configured as above will be described with reference to the timing chart shown in Fig. 4. The following describes the operation from when a workpiece W of inappropriate posture is photographed by the area-scan camera 2, set to line-scan mode, until the workpiece is excluded by the exclusion unit 5.
When a workpiece W conveyed on the transport path 10 is photographed at time t01, the image data thus acquired is immediately taken in (transferred) via the image take-in unit 31, and the binarization processing portion 32a applies preprocessing such as binarization to the image data. The end detection portion 32b detects the front end Wa and rear end Wb of the workpiece W, and detects the front end Wa of the workpiece W in the image data acquired at time t01. Shooting continues at the prescribed intervals after the shot at time t01, and at every shot the image data is taken in and preprocessed immediately. Then, when the end detection portion 32b recognizes the rear end Wb of the workpiece W in the image data acquired by the shot at time t02, the composite image data generating portion 32c starts generating the composite image data at time t03, and the posture discrimination processing by the posture discrimination unit 33 and the speed computation by the speed computing unit 35 are performed on the basis of the composite image data. The processing up to time t03 is performed by hardware (for example an FPGA: field-programmable gate array), and the processing after time t03 is performed in software by executing the programs stored in the memory. Thereafter, the timing control unit 36 controls the instruction output unit 34 so that the electrical instruction is output at time t05, after the waiting time tα has elapsed from time t04. Compressed air is then jetted from the air jet nozzle 50 of the exclusion unit 5, and at time t06, after the elapsed time td has passed from time t05, the force produced by the air actually acts on the workpiece W. When the posture of a workpiece W subjected to posture discrimination processing is appropriate and is judged to be the prescribed posture, the processing for excluding that workpiece W from the transport path 10 (output of the electrical instruction and jetting from the air jet nozzle 50) is not performed.
In this way, workpieces W of inappropriate posture are excluded, and only workpieces W of appropriate posture are supplied to the supply destination.
As described above, the image processing apparatus 8 for a parts feeder of the first embodiment is applied to a parts feeder 100 provided with a camera that photographs the workpieces W conveyed along the transport path 10, and is configured such that an area-scan camera 2, having a plurality of imaging elements arrayed in the conveying direction of the workpieces W and in the direction orthogonal to the conveying direction and acquiring image data with these imaging elements, is adopted as the camera, and such that the apparatus comprises: a setting unit 30 that sets, as usable for shooting, only a part of the imaging elements of the area-scan camera 2 forming a row orthogonal to the conveying direction; an image take-in unit 31 that takes in the acquired image data from the area-scan camera 2 immediately when shooting is performed with only that part of the imaging elements; and a posture discrimination unit 33, as the workpiece quality discrimination unit, that performs posture discrimination processing, i.e. quality discrimination processing for the workpieces W, on the basis of the image data taken in by the image take-in unit 31.
Here, discrimination of the quality of a workpiece W means discriminating whether the appearance or posture of the workpiece W is the prescribed appearance or posture.
Because the setting unit 30 allows only a part of the imaging elements to be used, the pixel count of the image data obtained by the area-scan camera 2 in one shot is reduced and the take-in speed (transfer rate) of the image take-in unit 31 can be raised; the time from shooting to posture discrimination processing for one workpiece W can therefore be shortened and the workpieces W can be conveyed at high speed. On the other hand, by using all of the imaging elements of the area-scan camera 2, a wider area than with a line-scan camera can be made to appear in the image data, and, taking components or the like appearing in this image data as a reference, the imaging element group serving as that part of the imaging elements can be set at an appropriate position simply and correctly.
That is, in the present embodiment, the parts feeder 100 includes the exclusion unit 5 as the workpiece processing unit, which jets compressed air, as the force, from the air jet nozzle 50 serving as the force-applying portion toward a workpiece W that has reached the exclusion position P2 serving as the workpiece processing position set on the transport path 10, thereby excluding that workpiece W from the transport path 10; the exclusion unit 5 is operated according to the discrimination result of the posture discrimination unit 33; and, when all of the imaging elements are used for shooting, the shooting range E_E of the area-scan camera 2 is set to a position that includes the air jet nozzle 50, so that the position of the imaging element group can be selected and set by the setting unit 30 on the image data in which this shooting range E_E appears.
The position of the imaging element group relative to the air jet nozzle 50 is important for excluding workpieces W of inappropriate posture, but by selecting the position of the imaging element group with the air jet nozzle 50 as a reference while observing the image data in which the air jet nozzle 50 appears, the time required for positioning can be shortened considerably. Specifically, the distance L between the imaging position P1 and the exclusion position P2, normally obtained from the predetermined conveying speed of the workpiece W and the other quantities appearing in formula (2) above, is used as a set value, and the position of the imaging element group is set, on the image data acquired by the area-scan camera 2 in area-scan mode, at the position separated from the exclusion position P2 by the distance L; the imaging element group can thus be set at an appropriate position simply and correctly. Furthermore, should the conveying speed of the workpiece W change during conveyance, the appropriate waiting time tα until the instruction output unit 34 outputs the electrical instruction is obtained from formula (2) using the conveying speed Vw of the workpiece W obtained by formula (1), so that the timing of jetting the compressed air can be adjusted.
The apparatus is further configured such that the imaging element group set by the setting unit 30 shoots continuously, and such that a preprocessing unit 32 is also provided which can immediately judge, from the image data taken in by the image take-in unit 31, whether a workpiece W appears; the posture discrimination unit 33 performs posture discrimination processing of the workpiece W on the basis of the image data judged by the preprocessing unit 32 to show a workpiece W.
Here, to perform posture discrimination processing on every workpiece W that is conveyed, every such workpiece W must be photographed reliably. One way of achieving this would be, for example, to use a sensor to obtain the timing at which a workpiece W enters the shooting range E_L, but this requires an additional sensor device and therefore increases cost. In the present embodiment, by contrast, the part of the imaging elements shoots continuously, so every workpiece W that is conveyed can be photographed reliably. Moreover, because posture discrimination processing is performed on image data judged to show a workpiece W, it is unnecessary to perform posture discrimination on image data in which no workpiece W appears, and useless processing is avoided. No additional sensor device is therefore needed; increases in cost and processing are suppressed, and posture discrimination processing can be performed on every workpiece W that is conveyed.
<Second Embodiment>
A parts feeder 110 according to a second embodiment of the present invention will now be described with reference to Fig. 5. Except for the structure described below, the parts feeder 110 of the present embodiment is identical to the parts feeder 100 of the first embodiment, so description of the structures shared with the parts feeder 100 is omitted.
In the parts feeder 100 of the first embodiment, only one row-shaped imaging element group is set in line-scan mode. The parts feeder 110 of the present embodiment, by contrast, is configured such that, in line-scan mode, a first imaging element group forming a row orthogonal to the conveying direction of the workpieces W and a second imaging element group forming a row orthogonal to the conveying direction at a position further downstream in the conveying direction than the first imaging element group are set, so that a workpiece W located in the shooting range (first shooting line) E_L1 of the first imaging element group or in the shooting range (second shooting line) E_L2 of the second imaging element group can be photographed. In the present embodiment, the exclusion unit 5 has two air jet nozzles 50a and 50b: the air jet nozzle 50a is arranged at a position between the shooting range E_L1 of the first imaging element group and the shooting range E_L2 of the second imaging element group, and the air jet nozzle 50b is arranged at a position further downstream in the conveying direction than the shooting range E_L2 of the second imaging element group. The posture discrimination unit 33 (see Fig. 1) performs posture discrimination processing on the basis of the image data acquired by the first imaging element group, and, as a result, the air jet nozzle 50a performs exclusion processing on workpieces W judged to be of inappropriate posture; for workpieces W not excluded by the air jet nozzle 50a, posture discrimination processing is performed again on the basis of the image data acquired by the second imaging element group. The air jet nozzle 50b performs exclusion processing on workpieces W judged to be of inappropriate posture in this second posture discrimination, while workpieces W judged to be of correct posture in the second discrimination are not excluded and are conveyed to the conveying destination (not shown). The structure is otherwise identical to that of the first embodiment.
As described above, the image processing apparatus for a parts feeder of the second embodiment is applied to a parts feeder 110 provided with an exclusion unit 5 that jets compressed air at a workpiece W arriving at one of the exclusion positions P2, P2 set on the transport path 10, thereby excluding that workpiece W from the transport path 10; the setting unit 30 sets, from among the plurality of imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position further downstream in the conveying direction than the first imaging element group; the posture discrimination unit 33 (see Fig. 1) performs posture discrimination processing on the basis of the image data acquired by the first imaging element group and also performs posture discrimination processing on the basis of the image data acquired by the second imaging element group; and the exclusion unit 5 is operated according to the discrimination results of the posture discrimination unit 33.
Here, when only a single line-scan camera is used to obtain the image data for discrimination processing, shooting errors can occur because the workpiece W bounces on the transport path 10 due to vibration or the like, and the fulfilment rate (the probability that the posture is discriminated correctly and only workpieces W of appropriate posture are conveyed to the conveying destination) may drop. To eliminate this problem, providing two line-scan cameras and performing posture discrimination processing twice has also been considered, but increasing the number of line-scan cameras installed increases cost.
In the present embodiment, by contrast, after a first posture discrimination is performed on the basis of the image data acquired by the first imaging element group, a second posture discrimination can be performed, on the basis of the image data acquired by the second imaging element group, for the workpieces W not excluded by the air jet nozzle 50a. By operating the exclusion unit 5 according to these discrimination results, only workpieces W judged to have the prescribed posture are conveyed to the conveying destination, more reliably than when posture discrimination is performed only once; cost increases are suppressed and the fulfilment rate can be improved.
<Third Embodiment>
A parts feeder 120 according to a third embodiment of the present invention will now be described with reference to Fig. 6. Except for the structure described below, the parts feeder 120 of the present embodiment is identical to the parts feeder 100 of the first embodiment, so description of the structures shared with the parts feeder 100 is omitted.
The parts feeder 120 of the third embodiment of the present invention shown in Fig. 6 handles workpieces W for which not only must the specific face of the workpiece W face the prescribed direction, but the front-rear orientation must also be correct; for example, the workpiece W is a diode on whose upper surface W_U, serving as the specific face, a feature point (mark) Wm is formed toward the rear in the conveying direction.
The present embodiment is configured in the same manner as the second embodiment in that a first imaging element group and a second imaging element group are set in line-scan mode; these imaging element groups shoot continuously at the same timing. The distance between the shooting range (first shooting line) E_L1 of the first imaging element group and the shooting range (second shooting line) E_L2 of the second imaging element group is set to the distance from the front end Wa of the workpiece W to the feature point Wm, and the air jet nozzle 50 is provided between the shooting range E_L1 of the first imaging element group and the shooting range E_L2 of the second imaging element group, at a position close to the shooting range E_L2 of the second imaging element group. In the present embodiment, when a workpiece W enters the shooting range E_L2 of the second imaging element group and its front end Wa is detected by the end detection portion 32b, and the feature point Wm of the workpiece W is detected from the image data of the first imaging element group taken in at the same time, the posture of that workpiece W is judged to be appropriate; otherwise the posture is judged to be inappropriate. The structure is otherwise identical to that of the first embodiment.
The processing for one workpiece W will now be explained further using the flowchart shown in Fig. 7. The image data acquired simultaneously by the first imaging element group and the second imaging element group is taken into the control device 3 via the image take-in unit 31 (step S1) and preprocessed by the preprocessing unit 32, and the end detection portion 32b judges whether the front end Wa of a workpiece W is detected in the image data acquired by the second imaging element group (step S2). When the front end Wa of a workpiece W is not detected (step S2: No), the process returns to step S1. When the front end Wa of a workpiece W is detected (step S2: Yes), the preprocessing unit 32 judges whether the feature point Wm of the workpiece W is detected in the image data acquired by the first imaging element group (step S3). When the feature point Wm is detected (step S3: Yes), the posture discrimination unit 33 judges that the posture of the workpiece W is appropriate, exclusion processing is not performed, and the processing of the flowchart ends. When the feature point Wm of the workpiece W is not detected (step S3: No), the posture discrimination unit 33 judges that the posture of the workpiece W is inappropriate, and the timing at which compressed air is to be jetted from the air jet nozzle 50 of the exclusion unit 5 is calculated (step S4). As in the first embodiment, the timing control unit 36 calculates the waiting time tα from the conveying speed Vw of the workpiece W, obtained from the image data acquired by the first imaging element group or the second imaging element group, and performs the control accordingly. When the instruction output unit 34 outputs the electrical instruction after the waiting time tα has elapsed (step S5), the exclusion unit 5 excludes the workpiece W judged to be of inappropriate posture (step S6), and the processing of the flowchart ends.
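For illustration only, the flow of Fig. 7 can be summarized in the following sketch; the line-acquisition and detection functions stand in for the camera readout and the preprocessing unit 32 and are named here only for this example.

```python
import time

def run_third_embodiment_cycle(grab_lines, front_end_in, feature_point_in,
                               wait_time_s, fire_air_jet):
    """One pass of the Fig. 7 flow for a single workpiece W (sketch).

    grab_lines()        -> (line_EL1, line_EL2) taken at the same timing
    front_end_in(line)  -> True if the front end Wa appears in the E_L2 line
    feature_point_in(l) -> True if the feature point Wm appears in the E_L1 line
    """
    while True:                                   # step S1: keep taking in line images
        line_el1, line_el2 = grab_lines()
        if front_end_in(line_el2):                # step S2
            break
    if feature_point_in(line_el1):                # step S3
        return "pass"                             # posture appropriate, no exclusion
    time.sleep(wait_time_s)                       # steps S4/S5: wait t_alpha, then instruct
    fire_air_jet()                                # step S6: exclude the workpiece
    return "excluded"
```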
As described above, the image processing apparatus for a parts feeder of the third embodiment is applied to a parts feeder 120 provided with an exclusion unit 5 that jets compressed air at a workpiece W arriving at a workpiece processing position P2 set on the transport path 10, thereby excluding that workpiece W from the transport path 10. The workpieces W are workpieces on part of whose upper surface W_U, serving as the specific face, a prescribed feature point Wm is formed. The setting unit 30 sets, from among the plurality of imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position further downstream in the conveying direction than the first imaging element group. The apparatus is adjusted so that, when the front end of a workpiece W in the conveying direction is within the shooting range E_L2 of the second imaging element group, the feature point Wm formed on that workpiece W appears within the shooting range E_L1 of the first imaging element group. The apparatus further comprises a preprocessing unit 32 capable of detecting the front end Wa of the workpiece W in the conveying direction, and the feature point Wm, from the image data taken in by the image take-in unit 31; when the front end Wa of a workpiece W is detected from the image data acquired by the second imaging element group, the feature point Wm is detected from the image data of the first imaging element group acquired at the same time as that image data, and the exclusion unit 5 is operated for any workpiece W for which the feature point Wm is not detected.
Posture discrimination of a workpiece having the feature point Wm formed toward the rear or the front in the conveying direction on its upper surface W_U could also be performed with the structures of the first and second embodiments, but with those structures the processing becomes complicated. In the present embodiment, the second imaging element group is therefore made to function as a synchronization sensor for detecting the front end Wa of the workpiece W: the feature point Wm is looked for when the front end Wa of the workpiece W is detected, and the posture of the workpiece W is judged to be correct if the feature point is detected and inappropriate if it is not, so that posture discrimination can be performed easily within a short processing time. Moreover, by making the settings with the setting unit 30 while, for example, observing the image data acquired in area-scan mode, the first imaging element group and the second imaging element group can easily be set at appropriate positions.
A structure in which two line-scan cameras are used to photograph two places of one workpiece W has also been considered; however, the workpiece W used in the present embodiment measures only about 6 mm on a side, and it is difficult to arrange two line-scan cameras so that they can photograph such a narrow range.
The feed appliances 100, 110, 120 of the present invention use the above feed appliance image processing apparatus 8 and are characterized by including: the feed appliance main body 1, which has the transport path 10 for conveying the workpiece W; the surface scan camera 2, which has multiple imaging elements arranged in the conveying direction of the workpiece W and in the direction orthogonal to the conveying direction of the workpiece W, and which photographs the workpiece W conveyed along the transport path 10 to acquire image data; the rejection unit 5, which excludes from the transport path 10 a workpiece W passing the workpiece processing site P2 set on the transport path 10; and the instruction output unit 34, which outputs an instruction for operating the rejection unit 5 when the pose discrimination unit 33 judges that the posture is not the prescribed posture. In the feed appliances 100, 110, 120, by using substantially all of the imaging elements of the surface scan camera 2, the imaging element groups can be set simply and correctly at suitable positions, and by then using only those imaging element groups for photographing, the transfer rate of the image data can be raised and the conveyance of the workpiece W can be sped up.
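The gain in transfer rate can be estimated with a simple ratio, shown below as a minimal sketch under an idealized assumption (fixed per-frame overhead is ignored, so the figure is an upper bound); the row counts used in the example are illustrative only.

```python
def readout_speedup(total_rows: int, active_rows: int) -> float:
    """Idealized upper bound on the frame-rate gain from reading out only
    `active_rows` of the sensor's `total_rows`."""
    return total_rows / active_rows

# Example: a 1024-row sensor reduced to a single 1-row element group.
print(readout_speedup(1024, 1))   # -> 1024.0 in the ideal case
```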
Embodiments of the present invention have been described above, but the specific configuration of each part is not limited to the above embodiments.
For example, in the first to third embodiments, an exclusion process is carried out in which the workpiece W judged to have an inappropriate posture is excluded from the transport path 10; however, the following configuration is also possible: a posture correcting unit is provided as the workpiece processing unit in place of the rejection unit 5, and corrects, at a posture correcting position set on the transport path 10, the posture of the workpiece W judged to have an inappropriate posture. The posture correcting unit includes an air injection nozzle that sprays compressed air toward the workpiece W through a hole provided at the posture correcting position of the transport path 10; by spraying compressed air from the air injection nozzle, the workpiece W located at the correcting position is flipped over or rotated, and its posture is thereby corrected. The posture correcting unit is not limited to this configuration as long as it can correct the posture of the workpiece W. The posture correcting unit is configured to spray compressed air from the air injection nozzle when the electrical instruction is output from the instruction output unit.
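A minimal sketch of this posture-correcting alternative follows; the nozzle object, its pulse method and the pulse duration are assumptions made for illustration, not details taken from the patent.

```python
class PostureCorrector:
    """Posture-correcting alternative to the rejection unit: a short air pulse
    through the hole at the correcting position flips or rotates the workpiece
    instead of blowing it off the transport path."""
    def __init__(self, air_injection_nozzle, pulse_ms: int = 20):
        self.air_injection_nozzle = air_injection_nozzle
        self.pulse_ms = pulse_ms

    def on_electrical_instruction(self) -> None:
        # Called when the instruction output unit issues its electrical instruction.
        self.air_injection_nozzle.pulse(duration_ms=self.pulse_ms)
```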
In addition, in the present embodiment the feed appliance image processing apparatus 8 is used to discriminate the posture of the workpiece W; however, it can also be used to inspect the appearance of the workpiece W, such as the shape of the workpiece W, its color, or silk-screened characters on the workpiece W. The feed appliance image processing apparatus in that case is configured so that, in place of the pose discrimination unit 33 that discriminates the posture of the workpiece W, it suitably has a unit that inspects the appearance of the workpiece W.
In addition, in the first to third embodiments the preprocessing unit 32 immediately carries out preprocessing such as binarization each time image data is taken in by the image take-in unit 31; however, it may instead be configured so that, after the take-in for one workpiece W has finished, binarization as preprocessing and image composition are carried out on all of the image data in which that workpiece W appears.
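The two preprocessing strategies can be contrasted with a short sketch. The threshold value and the use of NumPy are assumptions made for illustration; the patent only specifies binarization and image composition, not how they are implemented.

```python
import numpy as np

def binarize(img: np.ndarray, threshold: int = 128) -> np.ndarray:
    # Simple fixed-threshold binarization as the preprocessing step.
    return (img > threshold).astype(np.uint8)

def preprocess_per_capture(captures):
    """Binarize each captured strip immediately, as it is taken in."""
    return [binarize(c) for c in captures]

def preprocess_per_workpiece(captures):
    """Alternative: once one workpiece has fully passed, stitch all of its
    strips along the conveying direction and binarize the composite once."""
    composite = np.concatenate(captures, axis=0)
    return binarize(composite)
```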
Further, in the first and second embodiments the shooting count acquiring unit 42a uses the pixel count of the composite image data in calculating the shooting count A applied to the above formula (1); however, instead of the pixel count of the composite image data, the aggregate pixel count of the multiple pieces of image data, from the image data in which the front end Wa of the workpiece W appears to the image data in which the rear end Wb of that workpiece W appears, may be used. It may also be configured so that, to obtain the shooting count A, the number of shots taken by the surface scan camera 2 is counted directly.
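The direct-counting alternative mentioned at the end could look roughly like the sketch below; the camera object and the two detector callables are hypothetical, and formula (1), which appears earlier in the description, is not reproduced here.

```python
def count_shots(camera, detect_front_end, detect_rear_end) -> int:
    """Count the shots from the frame in which the front end Wa first appears
    up to and including the frame in which the rear end Wb appears."""
    shots = 0
    front_seen = False
    while True:
        frame = camera.grab_frame()
        if not front_seen:
            front_seen = detect_front_end(frame)
            if not front_seen:
                continue
        shots += 1
        if detect_rear_end(frame):
            return shots
```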
Further, in the first and second embodiments the image data acquired by the surface scan camera 2 is composited; however, the discrimination may also be carried out without compositing. In addition, the quality discrimination may be carried out for one workpiece W using only the image data obtained by a single shot. Further, instead of shooting continuously, a sensor that detects the arrival of the workpiece W may be provided separately, and shooting may be performed when the workpiece W arrives.
In addition, in the second embodiment, when a workpiece W is judged to be inappropriate from the image data acquired by the first imaging element group, the second imaging element group may also be used to confirm whether that workpiece W has been excluded by the one air injection nozzle 50a. In this case, the time for the workpiece W to travel from passing through the coverage EL1 of the first imaging element group to reaching the coverage EL2 of the second imaging element group is obtained in advance from the distance between the first imaging element group and the second imaging element group, and when a workpiece W judged to be inappropriate from the image data acquired by the first imaging element group is nevertheless captured by the second imaging element group, that workpiece W is excluded by the other air injection nozzle 50b.
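A brief sketch of this confirmation step follows, assuming a constant conveying speed; the function and parameter names, and the nozzle object, are placeholders for illustration only.

```python
def travel_time_between_groups(group_spacing_mm: float, conveying_speed_mm_s: float) -> float:
    """Time for a workpiece to move from the first group's coverage EL1 to the
    second group's coverage EL2, assuming a constant conveying speed."""
    return group_spacing_mm / conveying_speed_mm_s

def confirm_exclusion(judged_inappropriate: bool, seen_by_second_group: bool, backup_nozzle) -> None:
    # If a workpiece already judged inappropriate from the first group's data is
    # still imaged by the second group, nozzle 50a missed it, so the second
    # nozzle (50b in the text) removes it.
    if judged_inappropriate and seen_by_second_group:
        backup_nozzle.blow()
```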
In addition, in the second embodiment the secondary pose discrimination process is carried out on all workpieces W that have not been excluded by the one air injection nozzle 50a; however, in order to keep the processing from increasing, for a given workpiece W the secondary pose discrimination process based on the image data captured by the second imaging element group may be carried out only when that workpiece W has been judged to be in the correct posture by the pose discrimination process based on the image data acquired by the first imaging element group.
Further, in the second embodiment the two air injection nozzles 50a, 50b are provided; however, a single air injection nozzle may instead be arranged only on the downstream side, in the conveying direction, of the coverage EL2 of the second imaging element group. In this case, the pose discrimination process is carried out twice for every conveyed workpiece W, and the exclusion process is carried out on any workpiece W judged to be inappropriate in at least one of the pose discrimination processes.
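The decision rule for this single-nozzle variant is simply the conjunction of the two checks, as in the short sketch below; the nozzle object and the boolean inputs are illustrative assumptions.

```python
def single_nozzle_decision(first_check_ok: bool, second_check_ok: bool, nozzle) -> None:
    """Variant with one nozzle downstream of EL2: every workpiece is checked
    twice, and the nozzle fires if either check judged the posture inappropriate."""
    if not (first_check_ok and second_check_ok):
        nozzle.blow()
```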
In addition, in the third embodiment the front end Wa of the workpiece W is detected from the image data acquired by the second imaging element group; however, the rear end Wb of the workpiece W may be detected from that image data instead. In the configuration that detects the front end Wa of the workpiece W, compared with the configuration that detects the rear end Wb, the time from the start of shooting one workpiece W until its posture is discriminated is earlier, so when the posture of that workpiece W is inappropriate, the exclusion operation can be carried out promptly.
Further, the imaging element groups are not limited to imaging elements arranged in only a single row; within the scope in which the effects of the present invention are obtained, imaging elements may also be arranged in two or more adjacent rows along the conveying direction of the workpiece W.
Other configurations can also be modified in various ways without departing from the gist of the present invention.
Claims (6)
1. A feed appliance image processing apparatus applied to a feed appliance provided with a camera that photographs a workpiece conveyed along a transport path, the feed appliance image processing apparatus being characterized in that
a surface scan camera is adopted as the camera, the surface scan camera having multiple imaging elements arranged in the conveying direction of the workpiece and in the direction orthogonal to the conveying direction of the workpiece, and acquiring image data by means of these imaging elements,
and further, the feed appliance image processing apparatus comprises:
a setting unit that sets, as usable for shooting, only a part of the imaging elements that forms a row orthogonal to the conveying direction among the multiple imaging elements of the surface scan camera;
an image take-in unit that, when only the part of the imaging elements is used for shooting, immediately takes in the acquired image data from the surface scan camera; and
a workpiece quality discrimination unit that carries out a quality discrimination process for the workpiece on the basis of the image data taken in by the image take-in unit.
2. The feed appliance image processing apparatus according to claim 1, characterized in that
the feed appliance is provided with a workpiece processing unit that applies force from a force applying section to a workpiece arriving at a workpiece processing site set on the transport path, thereby excluding the workpiece from the transport path or correcting its posture on the transport path,
the feed appliance image processing apparatus is configured to operate the workpiece processing unit in accordance with the discrimination result of the workpiece quality discrimination unit,
and, with substantially all of the imaging elements used for shooting, the coverage of the surface scan camera is set to a position that includes the force applying section, and the position of the part of the imaging elements is selected and set by the setting unit within the image data in which this coverage appears.
3. The feed appliance image processing apparatus according to claim 1, characterized in that
the feed appliance is provided with a workpiece processing unit that applies force to a workpiece arriving at a workpiece processing site set on the transport path, thereby excluding the workpiece from the transport path or correcting its posture on the transport path,
the setting unit sets, from among the multiple imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position downstream of the first imaging element group in the conveying direction,
the workpiece quality discrimination unit carries out a quality discrimination process on the basis of the image data acquired by the first imaging element group and carries out a quality discrimination process on the basis of the image data acquired by the second imaging element group,
and the feed appliance image processing apparatus operates the workpiece processing unit in accordance with the discrimination results of the workpiece quality discrimination unit.
4. The feed appliance image processing apparatus according to claim 1, characterized in that
the feed appliance is provided with a workpiece processing unit that applies force to a workpiece arriving at a workpiece processing site set on the transport path, thereby excluding the workpiece from the transport path or correcting its posture on the transport path,
the apparatus is used for a workpiece in which a prescribed characteristic point is formed on a part of a certain surface of the workpiece,
the setting unit sets, from among the multiple imaging elements, a first imaging element group forming a row orthogonal to the conveying direction and a second imaging element group forming a row orthogonal to the conveying direction at a position downstream of the first imaging element group in the conveying direction,
the feed appliance image processing apparatus is adjusted so that, when the conveying-direction front end or the conveying-direction rear end of the workpiece is within the coverage of the second imaging element group, the characteristic point formed on the workpiece appears within the coverage of the first imaging element group,
the apparatus further comprises a preprocessing unit capable of detecting the conveying-direction front end or the conveying-direction rear end of the workpiece and the characteristic point from the image data taken in by the image take-in unit,
when the conveying-direction front end or the conveying-direction rear end of the workpiece is detected from the image data acquired by the second imaging element group, the feed appliance image processing apparatus detects the characteristic point from the image data of the first imaging element group acquired simultaneously with that image data,
and operates the workpiece processing unit for a workpiece for which the characteristic point is not detected.
5. The feed appliance image processing apparatus according to any one of claims 1 to 4, characterized in that
the part of the imaging elements set by the setting unit shoots continuously,
the apparatus further comprises a preprocessing unit capable of immediately discriminating the workpiece from the image data taken in by the image take-in unit,
and the workpiece quality discrimination unit carries out the quality discrimination process for the workpiece on the basis of image data judged by the preprocessing unit to show the workpiece.
6. A feed appliance using the feed appliance image processing apparatus according to any one of claims 1 to 5, characterized by comprising:
a feed appliance main body having a transport path for conveying a workpiece;
a surface scan camera having multiple imaging elements arranged in the conveying direction of the workpiece and in the direction orthogonal to the conveying direction of the workpiece, the surface scan camera photographing the workpiece conveyed along the transport path to acquire image data;
a workpiece processing unit that excludes from the transport path a workpiece passing a workpiece processing site set on the transport path or corrects its posture on the transport path; and
an instruction output unit that outputs an instruction for operating the workpiece processing unit when the workpiece quality discrimination unit judges that a workpiece does not have the prescribed appearance or posture.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910179509.XA CN110077812A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
CN201910179599.2A CN109981979A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014087621A JP6344031B2 (en) | 2014-04-21 | 2014-04-21 | Image processing apparatus for parts feeder and parts feeder |
JP2014-087621 | 2014-04-21 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910179599.2A Division CN109981979A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
CN201910179509.XA Division CN110077812A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105049700A true CN105049700A (en) | 2015-11-11 |
CN105049700B CN105049700B (en) | 2019-06-21 |
Family
ID=54430596
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910179599.2A Withdrawn CN109981979A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
CN201910179509.XA Pending CN110077812A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
CN201510173788.0A Active CN105049700B (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910179599.2A Withdrawn CN109981979A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
CN201910179509.XA Pending CN110077812A (en) | 2014-04-21 | 2015-04-14 | Feed appliance image processing apparatus and feed appliance |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP6344031B2 (en) |
KR (1) | KR102288639B1 (en) |
CN (3) | CN109981979A (en) |
TW (1) | TWI631063B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107976201A (en) * | 2017-10-09 | 2018-05-01 | 汪腊新 | A kind of method that machining path is automatically generated based on face battle array 3D cameras |
CN108160530A (en) * | 2017-12-29 | 2018-06-15 | 苏州德创测控科技有限公司 | A kind of material loading platform and workpiece feeding method |
CN108620838A (en) * | 2017-03-16 | 2018-10-09 | Ykk株式会社 | Component delivery system and component delivery method |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111780682B (en) * | 2019-12-12 | 2024-06-21 | 天目爱视(北京)科技有限公司 | 3D image acquisition control method based on servo motor |
CN114082674B (en) * | 2021-10-22 | 2023-10-10 | 江苏大学 | Small particle agricultural product color selection method combining surface sweeping line sweeping photoelectric characteristics |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08119436A (en) * | 1994-10-21 | 1996-05-14 | Okura Yusoki Co Ltd | Delivery device |
JP2005064586A (en) * | 2003-08-13 | 2005-03-10 | Jai Corporation | Imaging apparatus for inspection/selection apparatus provided with imaging timing automatic detection function |
CN101539406A (en) * | 2009-05-06 | 2009-09-23 | 北京科技大学 | Method and device for measuring shape and size of workpiece with high-temperature end and low-temperature end on line |
CN101859717A (en) * | 2009-04-13 | 2010-10-13 | 株式会社日立高新技术 | Job processing apparatus, display substrate module assembly line or assemble method |
US20110285841A1 (en) * | 2010-05-20 | 2011-11-24 | Daiichi Jitsugyo Viswill Co., Ltd. | Appearance Inspection Apparatus |
CN202700832U (en) * | 2012-05-22 | 2013-01-30 | 无锡嘉华智慧交通科技有限公司 | Dynamic identification device of inferior products in production |
JP2013039981A (en) * | 2011-08-11 | 2013-02-28 | Sinfonia Technology Co Ltd | Workpiece sorter |
CN103056111A (en) * | 2012-12-21 | 2013-04-24 | 浙江大学 | Prawns quality detecting and classifying device based on machine vision technology |
CN103204375A (en) * | 2012-01-17 | 2013-07-17 | 东京威尔斯股份有限公司 | Appearance inspection apparatus of workpiece and appearance inspection method of workpiece |
CN103639121A (en) * | 2013-12-27 | 2014-03-19 | 天津市光学精密机械研究所 | Automatic sorting equipment of red date cockles |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101399570B1 (en) * | 2007-06-19 | 2014-05-30 | 쿠오리카프스 가부시키가이샤 | Vibration feeder, carrier and visual inspection apparatus |
KR101311852B1 (en) * | 2010-12-08 | 2013-09-27 | 엘아이지에이디피 주식회사 | Substrate transfer unit and inspecting apparatus using the same |
2014
- 2014-04-21 JP JP2014087621A patent/JP6344031B2/en active Active
- 2014-11-18 TW TW103139915A patent/TWI631063B/en active
2015
- 2015-02-04 KR KR1020150017431A patent/KR102288639B1/en active IP Right Grant
- 2015-04-14 CN CN201910179599.2A patent/CN109981979A/en not_active Withdrawn
- 2015-04-14 CN CN201910179509.XA patent/CN110077812A/en active Pending
- 2015-04-14 CN CN201510173788.0A patent/CN105049700B/en active Active
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108620838A (en) * | 2017-03-16 | 2018-10-09 | Ykk株式会社 | Component delivery system and component delivery method |
CN108620838B (en) * | 2017-03-16 | 2021-06-29 | Ykk株式会社 | Component supply system and component supply method |
CN107976201A (en) * | 2017-10-09 | 2018-05-01 | 汪腊新 | A kind of method that machining path is automatically generated based on face battle array 3D cameras |
CN108160530A (en) * | 2017-12-29 | 2018-06-15 | 苏州德创测控科技有限公司 | A kind of material loading platform and workpiece feeding method |
Also Published As
Publication number | Publication date |
---|---|
KR20150121649A (en) | 2015-10-29 |
JP6344031B2 (en) | 2018-06-20 |
TWI631063B (en) | 2018-08-01 |
CN109981979A (en) | 2019-07-05 |
KR102288639B1 (en) | 2021-08-11 |
TW201540625A (en) | 2015-11-01 |
JP2015207917A (en) | 2015-11-19 |
CN110077812A (en) | 2019-08-02 |
CN105049700B (en) | 2019-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105049700A (en) | Image data processing apparatus for parts feeder and parts feeder | |
CN104338684B (en) | Feed appliance speed detector and feed appliance | |
CN110228693B (en) | Feeding device | |
US11972589B2 (en) | Image processing device, work robot, substrate inspection device, and specimen inspection device | |
KR20130085438A (en) | Image processing apparatus and image processing system | |
CN103042529A (en) | Workpiece takeout system, robot apparatus, and method for producing a to-be-processed material | |
JP2015216482A (en) | Imaging control method and imaging apparatus | |
US10875186B2 (en) | Robot system | |
JP2017121995A (en) | Conveyance object discrimination control system and conveyance device | |
TWI622112B (en) | Image processing device for component feeding and component feeder | |
JP6533810B2 (en) | Measuring machine | |
CN210625574U (en) | Ball stud vision detection system and ball stud detection equipment | |
JP2016108081A (en) | Image processing device for parts feeder and parts feeder | |
US20150029328A1 (en) | Electronic component mounting apparatus and electronic component mounting method | |
KR102242916B1 (en) | Method for synchronizing image capturing from external synchronization signal, and camera device implementing the method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||