GB2184233A - Inspection apparatus - Google Patents

Inspection apparatus

Info

Publication number
GB2184233A
GB2184233A GB08630026A
Authority
GB
United Kingdom
Prior art keywords
inspection
features
plurality
articles
means
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB08630026A
Other versions
GB8630026D0 (en)
Inventor
Emlyn Roy Davies
Adrian Ivor Clive Johnstone
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Research Development Corp UK
Original Assignee
National Research Development Corp UK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to GB858530929A priority Critical patent/GB8530929D0/en
Priority to GB858530928A priority patent/GB8530928D0/en
Application filed by National Research Development Corp UK filed Critical National Research Development Corp UK
Publication of GB8630026D0 publication Critical patent/GB8630026D0/en
Publication of GB2184233A publication Critical patent/GB2184233A/en
Application status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30128 Food products

Abstract

Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest comprises an edge detector for the positional location of individual articles. The edge detector is fed from a scanner which captures an image of a region in the vicinity of a located article. Analysis means analyses the significance of features detected by the scanner, and selection means selectively controls the processing of the derived data in order preferentially to obtain data relevant to the features of interest, thereby speeding up processing.

Description

SPECIFICATION

Inspection apparatus

This invention relates to industrial inspection apparatus and, in particular, to apparatus adapted for the rapid inspection of a plurality of components.

Industrial inspection involves the identification, location, counting, scrutiny and measurement of products and components, often under conditions where they are moving at moderate speed along a conveyor system. Products therefore have to be examined in real time, and this imposes a major difficulty, since the processing rate required to analyse the necessary number of pixels is considerably more than a single conventional serial processor can cope with. In practice the required processing rate is 50-100 times faster than a single central processor unit can manage, so special hardware has to be designed for the purpose.

We have devised an Image-handling Multi-Processor (IMP) for this purpose. IMP consists of a Versatile Modular Eurocard (VME) bus and crate, which holds a set of special co-processor boards including a frame store for image processing. The co-processors operate rapidly and enable the system to perform industrial inspection tasks in real time. The system is designed in an integrated way, so that the data transactions on the VME bus give only a limited overhead in terms of speed. In addition, the memory sub-systems are designed to operate at the maximum speed of which the VME bus is capable.

According to the present invention there is provided inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest, comprising detector means for the positional location of individual articles, scanning means to capture an image of a region in the vicinity of a located article and analysis means to analyse the significance of features detected by said scanning means, wherein selection means coupled to said positional location means is operable selectively to control the processing of data derived from said scanning means in order preferentially to obtain data relevant to said features of interest.

There is also provided an image location device for use in inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest, comprising detector means for deriving a plurality of electrical signals corresponding to the intensity of the optical signal at a plurality of positions in the vicinity of a predetermined position on an optical image, combining means for combining pairs of symmetrically-weighted groups of said electrical signals in accordance with a predetermined algorithm to derive a pair of electrical signals dependent on the intensity of the optical signals at positions in the vicinity of said predetermined position, and difference means to derive an electrical signal dependent on the difference between said pair of electrical signals.

The IMP system is built around a frame store typically containing four image-planes of 128 x 128 bytes. The memory is configured so that access is as rapid as the VME protocol will allow, viz. 150 nsec. A special technique is used to achieve this throughput.

The purpose of the IMP system is to permit industrial inspection tasks to be undertaken in real time and at reasonable expense. Images from a line-scan or TV (vidicon) camera are digitised and fed to the frame store under computer control; they are then processed rapidly by special processing boards. These contain co-processors which operate under control of a host computer and which can access image data autonomously via the VME bus. An arbitrator on the VME bus mediates between the host processor, the image display hardware and the various co-processors. IMP is a multi-processor system to the extent that a number of processors can perform their various functions concurrently, though only one processor may access the bus at any one time. This is not as severe a constraint as might be thought, since (a) pipelining of processes together with provision of local memory enables use to be made of the parallel processing capability of the co-processors; and (b) many real-time industrial inspection applications demand speeds that are high, but not so high that careful design of the processors will not permit useful tasks to be done by this sort of system. It should be noted that the design of the co-processor system has in this case been not only careful but also ingenious, so that the capability of IMP is substantially greater than that of many commercially available systems, while being significantly less complex and expensive.

The IMP system finds particular application in food product inspection. Considerable numbers of food products have the characteristic that they are round. In our inspection work we needed to ensure that we could locate such products rapidly and then scrutinise them effectively for crucial features and defects. Food products tend to be mass produced in large numbers on continuously moving product lines. These lines are typically 1-2 metres wide, and contain 12-20 products across the width of a conveyor, which may be moving at 30-50 cm per second. Thus the product flow rate past a given point in the line is typically 20 units per second. This means that one processor of the IMP type will have to cope with one item every 50 msec or so. It will have to find each product in the images it receives, and then scrutinise it. An adequate resolution for a single product will be such that the product occupies a square of side 60 to 80 pixels. Thus location of the product is non-trivial, and scrutiny is less trivial. It is seen that this sort of problem involves a lot of pixel accesses, but this need not be sufficient to 'tie up' the VME bus and cause data-flow problems.

Considerable attention has been devoted to the algorithms for locating and scrutinising the product and the types of processor needed to implement them. Although algorithms originally developed for round food product inspection were the first to be implemented in hardware, they have now been generalised so that they are suitable for a much wider variety of products. First, any products which are round or have round holes can be located rapidly and with ease. Second, all such products can be inspected and closely scrutinised. Third, a number of features even of non-circular products can be located directly in their own right, and the products then scrutinised in the vicinity of these features, or at related positions. A characteristic of the system is that it relies on recognition of certain features which are capable of triggering the remainder of the system. In our experience, there are a very large number of object features that can simply be located, including holes, specks, dots, characters, corners, line intersections and so on. Thus the IMP system can be used in an extremely wide range of industrial inspection and robotics applications. In addition, there are cases when the object or some feature of it does not need to be located, since its presence and position is already known - e.g. in a robot gripper, at the bottom of a chute, or elsewhere. The IMP system is clearly capable of dealing with this simplified set of situations. Overall, it is seen that the IMP system is in practice of rather general utility.

The functions presently performed by the various processors are: (1) edge detection and edge orientation; (2) construction of radial intensity histograms in the vicinity of special features; (3) counting of pixels in various ranges of intensity and with various tags already attached to them; and (4) correlation of thresholded patterns against internally generated parameters such as distance from the feature of interest. Other functions are: (5) construction of angular intensity histograms; (6) construction of overall object intensity histograms; (7) construction of intensity histograms within a specified area; and (8) more general grey-scale sub-image correlation. These functions permit the rapid estimation of product centres, and the measurement of perimeter and area, e.g. that of chocolate cover over a biscuit. A novel feature that permits much of the processing to be carried out rapidly and efficiently is that of autoscan by a particular processor of an area of an image relative to a given starting point, coupled with the use of an internal bus which holds the (x,y) and (r,θ) co-ordinates of the currently accessed pixel relative to the starting pixel: this is aided by use of a look-up table on the processor board giving information related to the pixel (x,y) or (r,θ) co-ordinates.
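
As an illustration of function (2) above, the following is a minimal NumPy sketch of a radial intensity histogram about a reference point; the function name and the equal-width binning are illustrative assumptions, not the hardware autoscan design:

```python
import numpy as np

def radial_intensity_histogram(image, cx, cy, r_max, n_bins):
    """Mean pixel intensity as a function of distance from a reference
    point (cx, cy) - the kind of radial intensity histogram the autoscan
    hardware accumulates.  Software sketch only."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(xs - cx, ys - cy)          # radius of every pixel
    mask = r < r_max                        # restrict to the scanned disc
    bins = (r[mask] * n_bins / r_max).astype(int)
    sums = np.bincount(bins, weights=image[mask].astype(float),
                       minlength=n_bins)
    counts = np.bincount(bins, minlength=n_bins)
    return sums / np.maximum(counts, 1)     # mean intensity per radial bin
```

Comparing such a profile against one compiled from an ideal product gives a rapid check of, for example, chocolate cover on a round biscuit.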

Because the system is designed to use simple functions that can be relocated in the image at will, it is highly flexible and efficient.

In addition to these advantages, the whole system is simply controlled in a high-level language from a PDP-11 type of host processor, for which suitable interfaces have been devised. Alternatively, a 68000 or other host processor can be used. The important point here is that complex assembly-level programming is not required at any stage. The only requirements imposed by the co-processors are that they should be initialised by RESET signals, that operation should be started by suitable START pulses, and that certain information should be provided for them in specific registers and memory locations. Data may be read out of them by the host or other processors, or they may write their data to other locations. Control is via a minimal number of registers, which may each be given a high-level name for programming purposes.

The co-processor functions listed above are special functions which have been found to take the bulk of the effort in practical industrial inspection systems. Since these functions are not fully general, they cannot on their own carry out the whole of an inspection task. Thus 'glue' functionality is required between the given functions. This may be provided by the host processor. If this becomes slow or is overloaded, several software co-processors may be added to the IMP system; at present this is envisaged to be possible via DEC T-11 processors, possibly working in conjunction with a bit-slice. It would not be possible for such processors to perform the whole function of IMP because they would operate too slowly for inspection purposes, unless at least 50 concurrently operating processors were added to the system. The IMP system is intended to overcome the need for this sort of solution by doing the bulk of the processing much more economically: thus a brute-force solution is replaced by a clever solution. However, for generality, one or two additional software co-processors form a useful adjunct to the remainder of the hardware.

An embodiment of this invention will now be described by way of example with reference to the accompanying drawings, in which:

Figure 1 is a schematic view of an image-handling multi-processor system;
Figure 2 is a schematic drawing of Processor II from the IMP system of Figure 1;
Figure 3 is a schematic drawing of Processor Module A for Processor II;
Figure 4 is a schematic drawing of Processor Module B for Processor II;
Figure 5 is a schematic drawing of Processor Module C for Processor II;
Figure 6 is a schematic drawing of Processor Module D for Processor III;
Figure 7 is a schematic drawing of Processor Module E for Processor IV;
Figure 8 is a schematic drawing of Processor Module F for Processor IV.

In order to carry out the inspection task, products first have to be located. Advantageously, inspection can be carried out in two stages: (1) product location and (2) product scrutiny. Product location may conveniently be carried out using the generalised Hough transform. This may be performed directly on grey-scale images, for objects whose shapes are determined only from a look-up table. In order to perform the Hough transform, edge location must be carried out. Processor I has been designed for this purpose, and is therefore able to deliver data from which special reference points in an image containing product may be found. It is crucial to IMP that starting reference points should be located because of the general nature of the Hough transform. It has been found that special reference points can normally be located in an image containing product, and hence this constitutes a sufficiently general procedure to form the basic starting point for the process of inspection.
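
To illustrate how a Hough transform locates a round product of known radius from edge data, the following software sketch (the function and its argument layout are illustrative, not Processor I's actual design) has each edge point vote for a candidate centre one radius away along its gradient direction; peaks in the accumulator mark product centres:

```python
import numpy as np

def hough_circle_centres(edge_points, radius, shape):
    """Accumulate centre votes for circles of a known radius.
    edge_points: iterable of (x, y, gx, gy) from an edge detector.
    Votes are cast one radius from each edge point along the gradient;
    for the opposite contrast the other direction would be used."""
    acc = np.zeros(shape, dtype=int)
    h, w = shape
    for x, y, gx, gy in edge_points:
        g = np.hypot(gx, gy)
        if g == 0:
            continue                           # no orientation information
        cx = int(round(x - radius * gx / g))   # candidate centre x
        cy = int(round(y - radius * gy / g))   # candidate centre y
        if 0 <= cx < w and 0 <= cy < h:
            acc[cy, cx] += 1
    return acc
```

Because each vote needs the local edge orientation, thresholded edge magnitude alone is insufficient, which is why Processor I must deliver orientation as well.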

Given that certain reference points have been located in an image, the next stage of inspection is to analyse the image in their vicinity, thereby enabling objects to be scrutinised. Processor II carries out this function, using an autoscanning module which scans systematically over the region of interest. Scanning over a sub-area of the image is profitable for two reasons: (1) it speeds up processing by eliminating irrelevant areas; (2) it enables a variety of matching processes to be carried out, since the reference point will be of one or other standard type, which means that comparison can be made with previously compiled data sets. The remainder of Processor II contains modules which aid methods by which matching may be achieved.

Processor II contains an internal bus which carries information about the position (x,y) of the current pixel relative to that of the reference point. In particular this internal bus carries information on the (r,θ) co-ordinates of the current pixel relative to the reference point, which it has obtained from a downloadable look-up table held in RAM. Only one reference point may be used at any one time, but the look-up table may contain several sets of additional information, each set being relevant to a particular type of object or feature.
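
The content of such a downloadable table can be sketched as follows (the function name and the dictionary layout are illustrative assumptions; the hardware would hold this in RAM indexed by the pixel offset):

```python
import math

def build_rl_table(size=64):
    """Sketch of the look-up table mapping a pixel offset (dx, dy) from
    the reference point to its (r, theta) co-ordinates, as carried on
    Processor II's internal bus."""
    half = size // 2
    table = {}
    for dy in range(-half, half):
        for dx in range(-half, half):
            # polar co-ordinates of the current pixel relative to the
            # reference point
            table[(dx, dy)] = (math.hypot(dx, dy), math.atan2(dy, dx))
    return table
```

Pre-computing (r,θ) once, rather than per pixel access, is what lets radial and angular histograms be accumulated at scan speed.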

For example, the additional information may include data on the ideal size of a particular type of object, or other details. Thus Processor II may scan the image successively, looking at several objects of various types, and placing output information on each in its output memory. (The latter arrangement will often be used to save time on the VME bus.)

Use of the internal (x,y)/(r,θ) information bus (which has been designated the 'Relative Location' or RL-bus) feeds data to the various Modules within Processor II. In particular, these can build up valuable radial and angular intensity histograms which provide a rapid means of comparing the region near a special reference point with that expected for an ideal object. Normal intensity histograms of various ranges can also be generated to aid this analysis, and correlations can be performed for the distribution of particular intensity values, relative to standard distributions. The emphasis here is on rapidly scrutinising specific regions of the image in various standard ways, in order to save the host processor from the bulk of the processing. The host processor is still permitted to interrogate part of the image when Processor II leaves any detail unclear.

Processor I carries out a vital edge detection function: in fact it is optimised for the computation of Hough transforms. For this purpose it is insufficient to compute and threshold edge magnitude - edge orientation also has to be determined locally. Processor I is designed to determine both edge magnitude and edge orientation. In addition, Processor I is used in a novel and unique manner, namely thresholding edge magnitude at a high level. The reasons for this are (a) to find a significantly reduced number of edge points, thereby speeding up the whole IMP system, and (b) to ensure that the edge points that are located are of increased accuracy relative to an average edge point. This strategy is enhanced by incorporating within Processor I a double threshold on the pixel intensity value, so that points which are not halfway up the intensity scale are eliminated, thereby speeding up processing further.
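
The combined screening just described (a deliberately high edge-magnitude threshold plus a double threshold on pixel intensity) amounts to a simple per-pixel predicate; in this sketch the threshold names are illustrative assumptions:

```python
def passes_screen(intensity, edge_mag, t_low, t_high, g_min):
    """A pixel survives the screening only if its intensity lies between
    the double thresholds (roughly halfway up the intensity scale) AND
    its edge magnitude exceeds a deliberately high threshold."""
    return (t_low < intensity < t_high) and (edge_mag > g_min)
```

Thresholding the magnitude high keeps only the strongest, most accurately oriented edge points, which both speeds up the Hough stage and sharpens its accumulator peaks.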

Processor I has a more complex autoscan unit than Processor II, since it employs a 3 x 3 pixel window instead of a 1 x 1 window. It achieves additional speed-up by (a) saving input pixel data from the previous two pixels (i.e. it only takes in a new 1 x 3 sub-window for every new pixel), (b) pipelining its computation, and (c) saving its output data in its own local high-speed memory, where it is still accessible by the host processor.
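
The 1 x 3 sub-window reuse in point (a) can be sketched in software as follows (the generator form is an illustrative assumption; the hardware keeps the two previous columns in registers):

```python
def sliding_windows(row_triplet):
    """For each new pixel only one new 1 x 3 column is read; the previous
    two columns are retained, so a full 3 x 3 window is available at
    every step without re-reading six of its nine pixels."""
    top, mid, bot = row_triplet
    cols = []
    for x in range(len(top)):
        cols.append((top[x], mid[x], bot[x]))   # the new 1 x 3 sub-window
        if len(cols) > 3:
            cols.pop(0)                         # discard the oldest column
        if len(cols) == 3:
            yield [[c[0] for c in cols],        # top row of the window
                   [c[1] for c in cols],        # middle row
                   [c[2] for c in cols]]        # bottom row
```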

The priority levels on the VME bus, starting with the highest, are:

level 3: executive (host) processor, which acts as system controller
level 2: hardwired and micro-coded co-processors, including Processor I and Processor II
level 1: software co-processors, including DEC T-11 processors
level 0: video display circuitry

Software co-processors are more likely to be intelligent than hardware co-processors, since it is easier to build more complex functionality into software than into hardware. Thus hardware co-processors may not be interruptable and should if necessary be permitted to complete their assigned operations. Therefore they are assigned priority level 2 rather than level 1. Video display circuitry has the lowest priority, and is thus able to display images from VME bus memory only when no other activity is occurring on the bus.

Bus grants to processors at the same level of priority are daisy-chained, and those closest to the arbitrator module have the highest resulting priority.

During algorithm development, or in an industrial system when speed of processing is not at a premium, high-level language notation of pixels is useful. The PPL2 notation for pixels within a 5 x 5 window is:

P15 P14 P13 P12 P11
P16 P4  P3  P2  P10
P17 P5  P0  P1  P9
P18 P6  P7  P8  P24
P19 P20 P21 P22 P23

In order to employ this notation for pixels around the location (X,Y) in an image, it is necessary to perform a re-mapping operation. If this is carried out purely in software it will slow access to a miserable level. In the IMP frame store, re-mapping is carried out automatically in a look-up table: IMP actually copes with windows of size up to 7 x 7 by this method, rather than 5 x 5 as in the above example. Clearly, the look-up operation will reduce speed slightly. However, for the two instances cited above, the minimal reduction in speed resulting from direct RAM look-up will be immaterial, and the gains in ease of programming will be very worthwhile. In fact the automatic re-mapping procedure adopted here has the advantage of using absolute rather than indexed addressing, which itself leads to a speed-up in pixel access: this is not seen in currently available commercial systems where (say) a 68000 has direct access to all the image data in a huge block of contiguous memory.

The size of look-up table required for re-mapping is that required to combine X with 6 bits of window-placement information, and similarly for Y. For a 128 x 128 frame store this means two look-up tables each having 7 + 6 = 13 address bits and 7 co-ordinate data bits plus 1 over-range data bit; and for a 256 x 256 frame store it means two tables each having 8 + 6 = 14 address bits and 8 + 1 data bits. For a 128 x 128 frame store, two 8K x 8 EPROMs are sufficient for the purpose.
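
The content of one such per-axis table can be sketched as follows. This is a software illustration under stated assumptions: a 7 x 7 window whose per-axis placement code fits within the allotted window-placement bits, and a dictionary standing in for the EPROM; `build_remap_lut` and its layout are not the actual hardware design:

```python
def build_remap_lut(frame_bits=7, window=7):
    """X (or Y) re-mapping table: given a window-centre co-ordinate and a
    window-placement code, return the absolute co-ordinate (wrapped to
    the frame) together with the 1-bit over-range flag."""
    size = 1 << frame_bits            # 128 for a 128 x 128 frame store
    half = window // 2
    lut = {}
    for x in range(size):
        for k, dx in enumerate(range(-half, half + 1)):
            xx = x + dx
            over = xx < 0 or xx >= size        # over-range data bit
            lut[(x, k)] = (xx & (size - 1), over)
    return lut
```

Indexing the table with absolute addresses, rather than computing the offset per access, is the source of the pixel-access speed-up described above.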

Advantageously, apparatus in accordance with the invention may be used for processing signals from an edge detector. This enhances speed of processing by rapid selection of pixels which provide accurate orientation and location information, ignoring other pixels. The principle used for selecting pixels giving high location and orientation accuracy is to look for those pixels where the intensity gradient is very uniform. This may be achieved by thresholding an intensity-gradient uniformity parameter at a high level, or a non-uniformity parameter at a low level. This may be done by taking two symmetrically-weighted sums of pixels near the pixel location under consideration. Advantageously, these may be re-weighted so that they will be exactly equal if the locality has an intensity gradient which is exactly uniform. The difference of the sums then provides a convenient non-uniformity parameter which may be detected by a threshold detector set to a convenient value.

The method will be illustrated for a 3 x 3 neighbourhood, where the Sobel operator is being used to detect edges. The following notation is used to describe the pixel intensities in the neighbourhood:

A B C
D E F
G H I

We estimate the (Sobel) x and y components of the intensity gradient as

gx = (C + 2F + I) - (A + 2D + G)
gy = (A + 2B + C) - (G + 2H + I)

and the intensity gradient magnitude can then be estimated as

g = (gx^2 + gy^2)^(1/2)

or by a suitable approximation. Edge orientation may be deduced from the relative values of gx and gy using the arctangent function.
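
In software these formulae amount to the following (a plain sketch, not the Processor I hardware; the neighbourhood is supplied row by row):

```python
import math

def sobel_3x3(n):
    """Sobel gradient for a 3 x 3 neighbourhood given as
    [[A, B, C], [D, E, F], [G, H, I]].  Returns the x and y gradient
    components, the magnitude, and the edge orientation in radians."""
    (A, B, C), (D, E, F), (G, H, I) = n
    gx = (C + 2 * F + I) - (A + 2 * D + G)   # right column minus left
    gy = (A + 2 * B + C) - (G + 2 * H + I)   # top row minus bottom
    g = math.hypot(gx, gy)                   # gradient magnitude
    theta = math.atan2(gy, gx)               # edge orientation
    return gx, gy, g, theta
```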

Symmetric sums of pixel values that may be used for computing gradient uniformity are

s1 = A + C + G + I
s2 = B + D + F + H
s3 = 4E

Thus possible non-uniformity parameters would be

u1 = s2 - s3
u2 = s3 - s1
u3 = s1 - s2

It will be clear to one skilled in the art that the effect of the uniformity detector is to remove from consideration by a later object locator a good proportion of those edge points where noise is significant or the edge orientation measurement would otherwise be inaccurate, and at the same time to speed up the algorithm.
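
A software sketch of this screening computation follows (the function name is illustrative, and the third parameter is taken as the remaining cyclic difference of the three sums):

```python
def non_uniformity(n):
    """Symmetric sums of a 3 x 3 neighbourhood
    [[A, B, C], [D, E, F], [G, H, I]] and the non-uniformity
    parameters; all three vanish when the intensity gradient over
    the neighbourhood is exactly uniform."""
    (A, B, C), (D, E, F), (G, H, I) = n
    s1 = A + C + G + I      # corner pixels
    s2 = B + D + F + H      # edge-centre pixels
    s3 = 4 * E              # centre pixel, weighted to balance the sums
    return s2 - s3, s3 - s1, s1 - s2
```

For a perfectly linear ramp (e.g. intensity = x + y) every parameter is zero, while a step edge offset from the neighbourhood centre gives non-zero values that a threshold detector can reject.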

It will, furthermore, operate in any size of neighbourhood, not just the 3 x 3 one illustrated with the Sobel edge detector. It is not restricted to use with a Sobel edge detector - others, such as the Prewitt 3 x 3 edge detector, may be employed. Other sets of symmetric combinations of sums of weights could be employed, and any linear or non-linear combination of these could be used to detect non-uniformity. (E.g. in the 3 x 3 case one could well use u1 + u2 + u3.) In larger neighbourhoods there are many possible uniformity operators.

One function of the uniformity detector is the elimination of noisy locations. Another is the exclusion from consideration of points which are not in their expected position due, for example, to malformation of an object which is being inspected. The uniformity detector is also able to improve edge orientation accuracy by eliminating cases where a 'step' edge does not pass closely enough through the centre of a neighbourhood. If there is any offset, accuracy deteriorates, but the uniformity operator improves the probability of detecting this.

The uniformity detector may be used to eliminate edge (or apparent edge) locations which are subject to noise. This noise can arise within any of the pixels in the neighbourhood: e.g. with a Sobel detector, noise in any one of the pixels in the neighbourhood (except the central one!) would reduce the edge orientation accuracy, and the uniformity operator could attempt to detect this. (Of course, it might sometimes fail - e.g. if two pixels were subject to noise and the effect cancelled itself out in the uniformity operator but not in the edge orientation operator itself.)

It will furthermore be apparent that the screener can act by pre-screening, by post-screening, or simply in parallel. Parallel screening would be appropriate if special VLSI chips were to be designed, since this would be the fastest operating option. Otherwise pre- or post-screening would be useful for progressively cutting down the data to be handled by a later object detector. Note also that the whole operation could be done in one VLSI chip, rather than having one such chip for edge detection and one for uniformity screening. The important point is that the uniformity detector can be used to speed up the object detection algorithm by reducing the number of points considered.

Other features are that the invention permits use of radial histograms and correlation on the fly. By defining crucial areas of the picture which are relevant, the hardware pre-selects for the computer, removing redundant information; it thereby creates a progressive hierarchy of data extraction and thus speeds up the inspection process. It does not waste time, because it knows where to check any relevant features of the image, which are not necessarily only edge points. The computer recalculates edge points on the basis of stored data and of measurements from the scanner. Because it is aware of the background points, it does not take them into consideration.

The invention finds particular application in the industrial inspection of mass-produced products such as food items (biscuits, pizzas, cakes, pies and other products of uniform size and shape). It may also be used for forensic tasks such as number-plate recognition, and for optical mark and optical character recognition. It is not restricted to optical techniques for image capture, and may employ other methods, such as sonar, infra-red or tactile sensing.

Claims (8)

1. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest, comprising detector means for the positional location of individual articles, scanning means to capture an image of a region in the vicinity of a located article and analysis means to analyse the significance of features detected by said scanning means, wherein selection means coupled to said positional location means is operable selectively to control the processing of data derived from said scanning means in order preferentially to obtain data relevant to said features of interest.
2. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest as claimed in Claim 1 wherein said selection means includes an intensity gradient uniformity detector which serves as an intensity threshold detector.
3. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest as claimed in Claim 2 including rejection means to reject from further analysis the signals derived from certain of said features.
4. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest as claimed in Claim 3 wherein the rejection means is adapted to reject signals which are subject to noise greater than a predetermined level.
5. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest as claimed in Claim 3 wherein the rejection means is adapted to reject signals which do not conform to a predetermined profile.
6. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest as claimed in Claim 5 wherein the profile is determined by the location of a step edge relative to the centre of a population of measured intensities.
7. Inspection apparatus for the successive inspection of a plurality of articles having a common set of features of interest substantially as herein described with reference to and as shown in the accompanying drawings.
8. An image location device for use in inspection apparatus in accordance with any one of the preceding claims, comprising detector means for deriving a plurality of electrical signals corresponding to the intensity of the optical signal at a plurality of positions in the vicinity of a predetermined position on an optical image, combining means for combining pairs of symmetrically-weighted groups of said electrical signals in accordance with a predetermined algorithm to derive a pair of electrical signals dependent on the intensity of the optical signals at positions in the vicinity of said predetermined position, and difference means to derive an electrical signal dependent on the difference between said pair of electrical signals.
GB08630026A 1985-12-16 1986-12-16 Inspection apparatus Withdrawn GB2184233A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB858530929A GB8530929D0 (en) 1985-12-16 1985-12-16 Inspection apparatus
GB858530928A GB8530928D0 (en) 1985-12-16 1985-12-16 Image enhancer

Publications (2)

Publication Number Publication Date
GB8630026D0 GB8630026D0 (en) 1987-01-28
GB2184233A true GB2184233A (en) 1987-06-17

Family

ID=26290124

Family Applications (1)

Application Number Title Priority Date Filing Date
GB08630026A Withdrawn GB2184233A (en) 1985-12-16 1986-12-16 Inspection apparatus

Country Status (4)

Country Link
EP (1) EP0289500A1 (en)
JP (1) JPS63503332A (en)
GB (1) GB2184233A (en)
WO (1) WO1987003719A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0355377B1 (en) * 1988-08-05 1994-01-26 Siemens Aktiengesellschaft Method for testing optically flat electronic component assemblies
DE3886539D1 (en) * 1988-10-17 1994-02-03 Siemens Ag Method for detecting the spatial position and orientation of previously known bodies.
DE3886538D1 (en) * 1988-10-17 1994-02-03 Siemens Ag A method for two-dimensional position and orientation detection of previously known bodies.

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1252108A (en) * 1969-06-26 1971-11-03
GB1381520A (en) * 1971-09-23 1975-01-22 Netherlands Postal Telecommuni Automatic postal address detection
GB1568216A (en) * 1975-10-20 1980-05-29 Sangamo Weston Apparatus and method for parts inspection
EP0041870A1 (en) * 1980-06-10 1981-12-16 Fujitsu Limited Pattern position recognition apparatus
GB2112130A (en) * 1981-12-04 1983-07-13 British Robotic Syst Component identification systems
GB2151829A (en) * 1983-12-19 1985-07-24 Ncr Canada Document processing system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3587220D1 (en) * 1984-01-13 1993-05-06 Komatsu Mfg Co Ltd Method for identifying contour lines.


Cited By (107)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5317387A (en) * 1989-04-14 1994-05-31 Hengel Cornelis G Van Method of and apparatus for non-destructive composite laminate characterization
WO2005015985A3 (en) * 2003-08-11 2005-05-12 Icerobotics Ltd Improvements in or relating to milking machines
US9161511B2 (en) 2010-07-06 2015-10-20 Technologies Holdings Corp. Automated rotary milking system
US9648843B2 (en) 2010-08-31 2017-05-16 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US10111401B2 (en) 2010-08-31 2018-10-30 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary parlor
US9980458B2 (en) 2010-08-31 2018-05-29 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9894876B2 (en) 2010-08-31 2018-02-20 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US8707905B2 (en) 2010-08-31 2014-04-29 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US8720383B2 (en) 2010-08-31 2014-05-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8720382B2 (en) 2010-08-31 2014-05-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8726843B2 (en) 2010-08-31 2014-05-20 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9888664B2 (en) 2010-08-31 2018-02-13 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US8800487B2 (en) 2010-08-31 2014-08-12 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US8590488B2 (en) 2010-08-31 2013-11-26 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8807086B2 (en) 2010-08-31 2014-08-19 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9775325B2 (en) 2010-08-31 2017-10-03 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9763424B1 (en) 2010-08-31 2017-09-19 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9737043B2 (en) 2010-08-31 2017-08-22 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9706747B2 (en) 2010-08-31 2017-07-18 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9686961B2 (en) 2010-08-31 2017-06-27 Technologies Holdings Corp. Automated system for moving a robotic arm along a rotary milking platform
US9686962B2 (en) 2010-08-31 2017-06-27 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9648839B2 (en) 2010-08-31 2017-05-16 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary milking platform based on detection of a milking claw
US9560832B2 (en) 2010-08-31 2017-02-07 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9549531B2 (en) 2010-08-31 2017-01-24 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9126335B2 (en) 2010-08-31 2015-09-08 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9149018B2 (en) 2010-08-31 2015-10-06 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary milking platform based on detection of a milking claw
US9516854B2 (en) 2010-08-31 2016-12-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US10327414B2 (en) 2010-08-31 2019-06-25 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9480238B2 (en) 2010-08-31 2016-11-01 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8807085B2 (en) 2010-08-31 2014-08-19 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9474248B2 (en) 2010-08-31 2016-10-25 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9247709B2 (en) 2010-08-31 2016-02-02 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9462782B2 (en) 2010-08-31 2016-10-11 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9462781B2 (en) 2010-08-31 2016-10-11 Technologies Holdings Corp. Automated system for moving a robotic arm along a rotary milking platform
US9439392B2 (en) 2010-08-31 2016-09-13 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9433184B2 (en) 2010-08-31 2016-09-06 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9374975B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9282720B2 (en) 2011-04-28 2016-03-15 Technologies Holdings Corp. Arrangement of milking box stalls
US9326480B2 (en) 2011-04-28 2016-05-03 Technologies Holdings Corp. Milking box with robotic attacher
US9357744B2 (en) 2011-04-28 2016-06-07 Technologies Holdings Corp. Cleaning system for a milking box stall
US9282718B2 (en) 2011-04-28 2016-03-15 Technologies Holdings Corp. Milking box with robotic attacher
US9374976B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with robotic attacher, vision system, and vision system cleaning device
US9374979B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with backplane and robotic attacher
US9374974B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with robotic attacher
US9402365B2 (en) 2011-04-28 2016-08-02 Technologies Holdings Corp. Milking box with robotic attacher
US9271471B2 (en) 2011-04-28 2016-03-01 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US9439390B2 (en) 2011-04-28 2016-09-13 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9265227B2 (en) 2011-04-28 2016-02-23 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US9462780B2 (en) 2011-04-28 2016-10-11 Technologies Holdings Corp. Vision system for robotic attacher
US9258975B2 (en) 2011-04-28 2016-02-16 Technologies Holdings Corp. Milking box with robotic attacher and vision system
US9253959B2 (en) 2011-04-28 2016-02-09 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9468188B2 (en) 2011-04-28 2016-10-18 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9215861B2 (en) 2011-04-28 2015-12-22 Technologies Holdings Corp. Milking box with robotic attacher and backplane for tracking movements of a dairy animal
US9183623B2 (en) 2011-04-28 2015-11-10 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9480236B2 (en) 2011-04-28 2016-11-01 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US9171208B2 (en) 2011-04-28 2015-10-27 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US9485955B2 (en) 2011-04-28 2016-11-08 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9491924B2 (en) 2011-04-28 2016-11-15 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US9504224B2 (en) 2011-04-28 2016-11-29 Technologies Holdings Corp. Milking box with robotic attacher
US9510554B2 (en) 2011-04-28 2016-12-06 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US9161512B2 (en) 2011-04-28 2015-10-20 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US9107379B2 (en) 2011-04-28 2015-08-18 Technologies Holdings Corp. Arrangement of milking box stalls
US9549529B2 (en) 2011-04-28 2017-01-24 Technologies Holdings Corp. Robotic attacher and method of operation
US9107378B2 (en) 2011-04-28 2015-08-18 Technologies Holdings Corp. Milking box with robotic attacher
US9582871B2 (en) 2011-04-28 2017-02-28 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9615537B2 (en) 2011-04-28 2017-04-11 Technologies Holdings Corp. Milking box with backplane responsive robotic attacher
US9474246B2 (en) 2011-04-28 2016-10-25 Technologies Holdings Corp. Milking box with robotic attacher
US9648840B2 (en) 2011-04-28 2017-05-16 Technologies Holdings Corp. Milking robot with robotic arm, vision system, and vision system cleaning device
US9058657B2 (en) 2011-04-28 2015-06-16 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9681635B2 (en) 2011-04-28 2017-06-20 Technologies Holdings Corp. Milking box with robotic attacher
US9681634B2 (en) 2011-04-28 2017-06-20 Technologies Holdings Corp. System and method to determine a teat position using edge detection in rear images of a livestock from two cameras
US9686959B2 (en) 2011-04-28 2017-06-27 Technologies Holdings Corp. Milking box with robotic attacher
US9686960B2 (en) 2011-04-28 2017-06-27 Technologies Holdings Corp. Milking box with robotic attacher
US9049843B2 (en) 2011-04-28 2015-06-09 Technologies Holdings Corp. Milking box with a robotic attacher having a three-dimensional range of motion
US9043988B2 (en) 2011-04-28 2015-06-02 Technologies Holdings Corp. Milking box with storage area for teat cups
US8903129B2 (en) 2011-04-28 2014-12-02 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US9706745B2 (en) 2011-04-28 2017-07-18 Technologies Holdings Corp. Vision system for robotic attacher
US8885891B2 (en) 2011-04-28 2014-11-11 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US9737048B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. Arrangement of milking box stalls
US9737042B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9737041B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9737039B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. Robotic attacher and method of operation
US9737040B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US9743635B2 (en) 2011-04-28 2017-08-29 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9756830B2 (en) 2011-04-28 2017-09-12 Technologies Holdings Corp. Milking box with robotic attacher
US8826858B2 (en) 2011-04-28 2014-09-09 Technologies Holdings Corp. Milking box with robotic attacher
US9763422B2 (en) 2011-04-28 2017-09-19 Technologies Holdings Corp. Milking box with robotic attacher
US8813680B2 (en) 2011-04-28 2014-08-26 Technologies Holdings Corp. Milking box with robotic attacher
US9883654B2 (en) 2011-04-28 2018-02-06 Technologies Holdings Corp. Arrangement of milking box stalls
US8746176B2 (en) 2011-04-28 2014-06-10 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US8683946B2 (en) 2011-04-28 2014-04-01 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9901067B2 (en) 2011-04-28 2018-02-27 Technologies Holdings Corp. Robotic attacher and method of operation
US9930861B2 (en) 2011-04-28 2018-04-03 Technologies Holdings Corp. Milking box with robotic attacher
US9980459B2 (en) 2011-04-28 2018-05-29 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US8671885B2 (en) 2011-04-28 2014-03-18 Technologies Holdings Corp. Vision system for robotic attacher
US9980460B2 (en) 2011-04-28 2018-05-29 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US8651051B2 (en) 2011-04-28 2014-02-18 Technologies Holdings Corp. Milking box with robotic attacher
US10127446B2 (en) 2011-04-28 2018-11-13 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US10143179B2 (en) 2011-04-28 2018-12-04 Technologies Holdings Corp. Milking box with a robotic attacher having a three-dimensional range of motion
US10172320B2 (en) 2011-04-28 2019-01-08 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US10303939B2 (en) 2011-04-28 2019-05-28 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US8393296B2 (en) 2011-04-28 2013-03-12 Technologies Holdings Corp. Milking box with robotic attacher including rotatable gripping portion and nozzle
US10327415B2 (en) 2011-04-28 2019-06-25 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US10349618B2 (en) 2011-04-28 2019-07-16 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US10357015B2 (en) 2011-04-28 2019-07-23 Technologies Holdings Corp. Robotic arm with double grabber and method of operation
US10362759B2 (en) 2011-04-28 2019-07-30 Technologies Holdings Corp. Milking box with robotic attacher
US10373306B2 (en) 2011-04-28 2019-08-06 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera

Also Published As

Publication number Publication date
WO1987003719A1 (en) 1987-06-18
GB8630026D0 (en) 1987-01-28
EP0289500A1 (en) 1988-11-09
JPS63503332A (en) 1988-12-02

Similar Documents

Publication Publication Date Title
US5579444A (en) Adaptive vision-based controller
Piper et al. On fully automatic feature measurement for banded chromosome classification
EP0338677B1 (en) Image processing method for shape recognition
US4547800A (en) Position detecting method and apparatus
US4395698A (en) Neighborhood transformation logic circuitry for an image analyzer system
US4075604A (en) Method and apparatus for real time image recognition
US4013999A (en) Single read station acquisition for character recognition
EP1484595B1 (en) Color space transformations for use in identifying objects of interest in biological specimens
US3214574A (en) Apparatus for counting bi-nucleate lymphocytes in blood
US4759074A (en) Method for automatically inspecting parts utilizing machine vision and system utilizing same
US20080187220A1 (en) Device and method for fast computation of region based image features
JP3264932B2 (en) Method and apparatus for separating the foreground from the background in an image containing text
EP0145725B1 (en) Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like
US6267296B1 (en) Two-dimensional code and method of optically reading the same
EP0333921A2 (en) Cell image processing method and apparatus therefor
US5214744A (en) Method and apparatus for automatically identifying targets in sonar images
US5698833A (en) Omnidirectional barcode locator
US4969198A (en) System for automatic inspection of periodic patterns
US5379353A (en) Apparatus and method for controlling a moving vehicle utilizing a digital differential analysis circuit
US5220621A (en) Character recognition system using the generalized hough transformation and method
JP2930618B2 (en) Evaluation method and evaluation apparatus for cell images
Park et al. Fast connected component labeling algorithm using a divide and conquer technique.
US6839454B1 (en) System and method for automatically identifying sub-grids in a microarray
Müller et al. Visual search for singleton feature targets within and across feature dimensions
US5933519A (en) Cytological slide scoring apparatus

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)