GB2414545A - Method for identifying and sorting objects - Google Patents
Method for identifying and sorting objects
- Publication number
- GB2414545A (application GB0412027A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- region
- data
- component data
- spatial periodicity
- components
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D3/00—Sorting a mixed bulk of coins into denominations
- G07D3/14—Apparatus driven under control of coin-sensing elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/431—Frequency domain transformation; Autocorrelation
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D5/00—Testing specially adapted to determine the identity or genuineness of coins, e.g. for segregating coins which are unacceptable or alien to a currency
- G07D5/005—Testing the surface pattern, e.g. relief
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
An object such as a gaming chip 1 is passed on a conveyor 2 underneath a detecting head 3. The detecting head 3 comprises a camera 4 and a light source 5, such as an array of LEDs. The light source is pulsed on and off to illuminate the chip as it passes the camera, which takes an image. The image is analysed to determine the identity of the gaming chip. This may be done by analysing pixels from one or more predetermined regions from the image of the chip: for example, many gaming chips have symmetrical patterns on their surfaces, and pixel data from these can be combined and used to determine the periodicity of the pattern, which is characteristic of the type of chip. The pixel data from the predetermined regions of chip 1 being imaged is compared to pixel data from corresponding regions of known chips, which allows the chip 1 to be identified. Based on the comparison results, the chips are sorted by actuating one of solenoids 9a - 9e, thus sorting the chip into an appropriate chip bin 10a - 10e.
Description
OBJECT IDENTIFICATION METHOD AND APPARATUS
This invention generally relates to a method and apparatus for identifying an object based on an image of the object.
The automated identification of an object is a problem which arises in many environments. One such application is the sorting of objects as they move along a conveyor. A specific example of this is a system for sorting gaming chips in a casino.
Figure 5 illustrates some typical gaming chips, which are coloured and have various patterns applied to them. Such objects pose problems for identification and sorting since they are not of a single, uniform colour but comprise multiple colours with multiple different possible patterns applied thereon.
In the prior art, various systems have been disclosed which attempt to provide a solution to this problem. For example, US 6075217 discloses a chip sorting apparatus in which the identity of a chip is determined by analysing the spectrum of light reflected by the chip. A similar method is disclosed in US 5933244, in which the colour of a chip is sensed at a plurality of places or orientations and integrated in order to provide a colour measurement which is then compared with a test chip. WO 99/60353 describes a method of detecting the colour of an article such as a chip by taking colour histograms for pixels of an image of the chip.
All these prior art methods rely purely on the colour of the object being identified and do not take into consideration spatial information.
Thus an object of an aspect of the present invention is to provide a method and apparatus for identifying an object in which spatial information is used for discrimination.
The first aspect of the present invention provides a method and apparatus for identifying an object in which combined pixel data for pixels in each of a plurality of predetermined regions in an image of the object are determined and compared for each region with predetermined combined pixel data for a corresponding region for at least one object having a known identity. The identity of the object is thus determined in dependence upon the result of the comparison.
In accordance with this aspect of the present invention, data for pixels in each region of an image of the object are combined and then compared with combined pixel data for a corresponding region of a reference object. In this way spatial information in the object is used in the identification of the object.
In one embodiment of the present invention the comparison of the combined pixel data for each region comprises determining the difference between the determined combined pixel data for the region and the predetermined combined pixel data for the corresponding region for the objects of known identity, and combining the determined differences for the or each known object to determine a combined difference for the or each respective known object. The identity indicator is then determined based on the combined difference for the or each object of known identity.
The combined difference can be determined as a weighted combination, with different weights applied to the determined differences for the regions. These weights can be preselected or can be dependent upon the number of pixels in each predetermined region, thereby increasing the significance of larger image regions.
Another aspect of the present invention provides a method of identifying an object in which spatial periodicity component data of the pixels in at least one predetermined region of an image of the object are determined. This spatial periodicity component data comprises spatial periodicity components in only one direction. The spatial periodicity component data is compared with predetermined spatial periodicity component data for at least one object of known identity, and the identity of the object is determined in dependence upon the result of the comparison.
Thus this aspect of the present invention provides a method and apparatus in which spatially periodic features on the object are utilised for identification of the object.
In one embodiment the spatial periodicity components comprise angular periodicity components. Thus in this embodiment the object is presumed to have some degree of rotational symmetry whereby there is a circular repetitive pattern appearing on the object.
In another embodiment of the present invention the spatial periodicity components comprise linear periodicity components. Thus in this embodiment of the present invention the object is assumed to have some degree of translational symmetry, whereby there is a pattern on the object which repeats in a line.
The spatial periodicity components can comprise any components which reflect the spatial repetitiveness of the features on the object. For example, the components can comprise spatial harmonics i.e. spatial frequency components. Alternatively, other waveform bases can be used.
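The use of spatial harmonics as periodicity components can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name and sampling scheme are assumptions, and brightness values are taken at equal angles around a circular track on the object.

```python
import math

def angular_harmonics(samples, n_harmonics=3):
    """Compute sine/cosine components of the first few angular harmonics
    for brightness values sampled at equal angular steps around a circle.
    Returns a list of (sin_k, cos_k) accumulator pairs, one per harmonic k."""
    n = len(samples)
    components = []
    for k in range(1, n_harmonics + 1):
        s = sum(v * math.sin(k * 2 * math.pi * i / n) for i, v in enumerate(samples))
        c = sum(v * math.cos(k * 2 * math.pi * i / n) for i, v in enumerate(samples))
        components.append((s, c))
    return components
```

A pattern repeating three times around the circle concentrates its energy in the third harmonic, which is the kind of signature the comparison exploits.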
The spatial periodicity components utilised can be preselected. They need not be the lowest spatial frequency components but instead can be any selected harmonics chosen according to the likelihood of their presence in the image.
In one embodiment the determination of the spatial periodicity component data for each region includes the determination of phase independent spatial periodicity component data such as the magnitude or power. The comparison of the combined pixel data for each region comprises comparing the phase independent spatial periodicity component data for each region with predetermined phase independent spatial periodicity component data for a corresponding region for the objects of known identity.
In one embodiment of the present invention the image comprises colour pixel data, comprising a plurality of colour pixel components. The spatial periodicity component data is determined for each of the colour pixel components for the pixels in each region. In this embodiment the spatial periodicity component data can be determined by accumulating separate colour pixel components for pixels in each region.
In another embodiment of the present invention the spatial periodicity component pixel data is determined by combining colour pixel components for each pixel and determining spatial periodicity component data from the combined colour pixel components for pixels in each region.
Another aspect of the present invention provides a method of sorting objects in which the objects are identified by comparing spatial discrimination parameters and sorted in accordance with the determined identity.
The present invention can be implemented in hardware, a combination of dedicated hardware and software or a general purpose computer and software. The present invention thus encompasses a computer program for controlling a computer, processor, or processing apparatus to carry out the identification and sorting method.
The computer program can be provided on any suitable carrier medium. The carrier medium can comprise a transient medium such as a signal, e.g. an electrical, optical, microwave, acoustic, or a magnetic signal carrying the code, or a storage medium such as a floppy disc, hard disc, CD ROM, or programmable memory device.
Embodiments of the present invention will now be described with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a sorting apparatus for sorting gaming chips in accordance with an embodiment of the present invention;
Figure 2 is an underside view of the chip recognition head in the embodiment of Figure 1;
Figure 3 is a plan view of the structure of the conveyor in the embodiment of Figure 1;
Figure 4 is a schematic diagram of the circuitry provided for chip identification in the embodiment of Figure 1;
Figure 5 is a diagram of three different gaming chips;
Figure 6 is a diagram illustrating the four regions of interest used in the chip identification method and the polar coordinates used for the determination of the spatial periodicity components;
Figure 7 is a timing diagram illustrating the timing for the acquisition of the image data;
Figure 8 is a diagram illustrating the raw image data output from the camera in the form of a Bayer mosaic before processing to generate RGB pixel image data;
Figure 9 is a flow diagram illustrating the chip identification process;
Figure 10 is a diagram illustrating the combined pixel data comprising spatial periodicity component data for each region of interest and for each colour component (RGB);
Figure 11 is a flow diagram illustrating the chip detect process in the flow diagram of Figure 9;
Figure 12 is a flow diagram illustrating the chip learning process in the flow diagram of Figure 9;
Figure 13 is a flow diagram illustrating the compare process in the flow diagrams of Figures 11 and 12; and
Figure 14 is a diagram illustrating the principles behind the automatic alignment process.
The present invention will now be described with reference to the sorting of gaming chips. The present invention is not, however, limited to gaming chips and can be applied to the identification and sorting of any object.
Figure 1 is a schematic illustration of a chip sorting apparatus sorting gaming chips in a casino. Gaming chips 1 are conveyed along a conveyor 2 and pass underneath a chip recognition head 3 which comprises a camera 4 and a light source 5.
The light source 5 can be seen in more detail in Figure 2. Eight white LEDs 6 are provided circumferentially around the camera 4 to project light downwards onto the chips 1 being conveyed by the conveyor 2. The white light LEDs 6 provide a bright white light for the taking of an image by the camera 4. These LEDs 6 are controlled to operate in a strobe-like manner to switch on whenever a gaming chip 1 passes into an imaging field of the camera 4. The LEDs 6 are controlled by a microprocessor 12 which receives signals from a shaft encoder 7. The shaft encoder 7 provides information on the position of the conveyor 2, and the microprocessor 12 uses this information to generate a trigger signal for the LEDs 6.
The timing of the signals is illustrated in Figure 7. The microprocessor 12 generates a strobe input signal which causes the LED current to rise. For the first 10 milliseconds the camera 4 waits to allow for the warm-up of the LEDs 6. A signal is then generated to the camera to initiate the acquisition of the image, which takes place during the remaining part of the strobe input signal. A readout signal is generated for the camera to cause the readout of the data for processing.
The output of the chip recognition head 3 comprises a chip identifier and this is transmitted to a sorting controller 8. The sorting controller 8 controls the operation of solenoids 9a, 9b, 9c, 9d, and 9e. The solenoids 9a to 9e are positioned under the conveyor 2 and output the chips to receiving chip bins 10a, 10b, 10c, 10d and 10e respectively. Each of the chip bins 10a to 10e is for receiving a respective type of chip.
Thus the sorting controller 8 operates to receive the chip identifier signal from the chip recognition head 3 and, in dependence on the signal, controls a respective solenoid 9a to 9e to eject a gaming chip 1 from the conveyor 2 into a respective gaming chip bin 10a to 10e.
Figure 3 is a plan view illustrating the structure of the conveyor 2. The conveyor 2 comprises a sequence of chip pockets 10. The pockets are formed with spokes 11 to support the gaming chips 1 and to allow the solenoids 9a to 9e to operate therethrough. The solenoids operate through sprung pins 13 just ahead of the centre of each pocket.
Figure 4 is a schematic diagram of the circuitry accompanying the camera 4 in the chip recognition head 3 in the embodiment of Figure 1. The image data output from the camera 4 is input into a field programmable gate array (FPGA) 15. Electrically erasable programmable read only memory (EEPROM) 16 is provided for access by the FPGA 15. Random access memory (RAM) 17 is provided to be used as working memory by the FPGA 15 during the processing of the image data from the camera 4. A microprocessor 18 is provided to receive the output of the FPGA and to perform further processing on the data. The microprocessor 18 is provided with a universal serial bus (USB) port to enable the output of image data to a computer for testing and manual alignment purposes. The microprocessor 18 is also provided with a serial output port for the output of the chip identification tag which is input to the controller 8.
In this embodiment image data from the camera 4 can be processed by the FPGA 15 and the microprocessor 18. Additionally, or alternatively, the image data from the camera 4 can be output to a programmed computer for the processing of the image data. The computer can then generate the chip identification tag for input to the controller 8 for control of the sorting of the gaming chips.
Figure 5 illustrates three different gaming chips which could require sorting using the sorting apparatus of Figure 1. As can be seen, the patterns applied to the gaming chips have rotational symmetry and there is a repeated pattern applied to at least some of the surfaces of the chip. It has been recognised that gaming chips generally have certain regions which are distinctive and can thus be used for identification.
Figure 6 illustrates these regions. Gaming chips often include repeated patterns around an outer circumferential track region 20. The third chip illustrated in Figure 5 includes such a pattern. Also, as can be seen in all three chips illustrated in Figure 5, there is an inner track region 21 which can carry a different repeated pattern. The region between the outer track 20 and the inner track 21, i.e. region 22, can contain a repeated pattern or can simply be of a particular colour. A central region 23 often contains graphic information, as illustrated in the third example of Figure 5.
Thus in this embodiment of the present invention, regions of the object are selected so that different regions may provide different information to facilitate the identification of the gaming chip. The region of interest template is stored in the EEPROM 16. This template can then be applied to each image of a chip to assist in the spatial discrimination of information carried by the chip, facilitating identification.
The operation of the invention will now be described with reference to Figures 8 to 14.
Figure 8 illustrates a partial output of the camera 4. The output of the camera is an image array comprising 1280 x 1024 pixels, each pixel comprising a 10 bit brightness level. The pixels, however, are covered by a Bayer colour mosaic filter comprising groups of four pixels R, G, G, and B. These are illustrated in Figure 8.
These pixels are output from the camera 4 on a 10 bit bus at a pixel clock rate of 24 MHz.
The FPGA 15 uses a group of four pixels by buffering alternate lines of red/green pixels so that the red/green pixels can be combined with the green/blue pixels to form a single RGB pixel. There is thus no interpolation or extrapolation, and the resolution of the image is reduced to 640 x 512 pixels of 30 bit RGB colour. In this embodiment the FPGA 15 does not, however, have to wait for the whole frame to be output by the camera 4 before processing the image data. Once the RGB data for a pixel has been determined from the Bayer mosaic (i.e. after a one line delay) the FPGA 15 can begin to process the data, as will be described in more detail with reference to the flow diagram of Figure 9.
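The collapse of a 2x2 Bayer cell into one RGB pixel can be sketched as follows. This is an illustrative sketch only: it assumes an RGGB cell layout, represents the image as nested Python lists rather than an FPGA data stream, and averages the two green sites (the text does not specify how the two greens are combined).

```python
def bayer_to_rgb(raw):
    """Collapse an RGGB Bayer mosaic (list of rows of raw sensor values)
    into an image of half the resolution, one (R, G, B) triple per 2x2 cell.
    The two green samples are averaged; no interpolation is performed."""
    rgb = []
    for y in range(0, len(raw), 2):
        row = []
        for x in range(0, len(raw[y]), 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2  # two green sites per cell
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb
```

Applied to a full 1280 x 1024 mosaic this yields the 640 x 512 RGB image described above.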
In order to carry out the chip identification process, the EEPROM 16 must be loaded with the bit file, i.e. the firmware for the FPGA 15. Also, the EEPROM 16 must store a map of the regions of interest. In one embodiment this comprises, for each pixel, a region of interest identifier and a value for sin θ, sin 2θ, sin 3θ, cos θ, cos 2θ, and cos 3θ. Thus the EEPROM 16 stores an array of 640 x 512 x 6 spatial frequency coefficients to be applied to the RGB pixel data for each pixel. However, since the storage requirement for such an array is large, as an alternative the EEPROM 16 can instead store an angle θ for each pixel and a table of cosine and sine values for a limited number of angles, e.g. 256. Thus in order to determine the coefficients, the FPGA 15 can simply look up the angle θ for a pixel and use the sine and cosine tables to determine the six spatial frequency coefficients for the pixel. This method greatly reduces the storage requirements of the EEPROM 16 without introducing a significant calculation overhead in the FPGA 15.
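The table-lookup alternative can be sketched as follows. This is a minimal sketch, not firmware: the function name, the quantisation of θ to an integer index, and the use of index arithmetic for the harmonics are assumptions; the 256-entry table size follows the text's example.

```python
import math

N_ANGLES = 256  # table resolution, per the text's example of 256 angles
SIN = [math.sin(2 * math.pi * i / N_ANGLES) for i in range(N_ANGLES)]
COS = [math.cos(2 * math.pi * i / N_ANGLES) for i in range(N_ANGLES)]

def coefficients(angle_index):
    """Return the six spatial frequency coefficients
    (sin th, sin 2th, sin 3th, cos th, cos 2th, cos 3th) for a pixel whose
    quantised polar angle index lies in [0, N_ANGLES). The k-th harmonic is
    obtained by stepping k times as far through the same table."""
    return [SIN[(k * angle_index) % N_ANGLES] for k in (1, 2, 3)] + \
           [COS[(k * angle_index) % N_ANGLES] for k in (1, 2, 3)]
```

The storage saving is the point: one byte-sized angle per pixel plus two 256-entry tables replaces six stored coefficients per pixel.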
Initially, the EEPROM 16 contains no reference data for known identified gaming chips. However, once the process is undertaken to learn gaming chips, reference data for the known gaming chips is stored in the EEPROM 16. The form of this data will be described in more detail hereinafter.
The operation of this embodiment for the identification and learning of the identity of gaming chips will now be described with reference to the flow diagram of Figure 9.
The process starts at Step S30 and, at Step S31, awaits a new frame of data to be output from the camera 4. The FPGA 15 receives the pixel data input from the camera 4 in Step S32 and reduces the Bayer mosaic to produce a single RGB data set for a pixel based on the four pixels of the Bayer mosaic cell. The FPGA 15 retrieves the region of interest identifier and the coefficients for the pixel from the EEPROM 16 in Step S33.
The coefficients comprise sin θ, sin 2θ, sin 3θ, cos θ, cos 2θ, and cos 3θ.
The coefficients are multiplied by each of the values R, G, and B to generate a 3 x 8 array for the pixel in Step S34. The array generated for a pixel is then accumulated by adding it to a previous array generated for pixels in the same region of interest (Step S35). This process is repeated (Step S36) for each pixel in the frame until all 640 x 512 pixels have been processed and the arrays have been accumulated for the four regions of interest.
The result of this accumulation process is illustrated in Figure 10. For each region of interest, and for each colour component of each region of interest, eight data elements are stored, comprising an accumulated DC colour component, an accumulated power (i.e. DC²) component, and accumulated spatial frequency components for all of the pixels in the region of interest.
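The streaming accumulation of Steps S34 to S36 can be sketched as follows. This is an illustrative sketch under stated assumptions: the function name and the representation of the pixel stream as (region, angle, RGB) tuples are mine, and the sine/cosine values are computed directly rather than looked up from the EEPROM tables.

```python
import math

def accumulate_regions(pixels, n_regions):
    """Streaming accumulation sketch of Step S35. Each pixel contributes,
    per colour channel, an 8-element vector: value (DC), value^2 (power),
    and value * sin/cos of the first three angular harmonics.
    `pixels` yields (region_id, theta, (r, g, b)) tuples."""
    # one 3 (colour) x 8 (component) accumulator array per region of interest
    acc = [[[0.0] * 8 for _ in range(3)] for _ in range(n_regions)]
    for region, theta, rgb in pixels:
        trig = [math.sin(k * theta) for k in (1, 2, 3)] + \
               [math.cos(k * theta) for k in (1, 2, 3)]
        for ch, v in enumerate(rgb):
            a = acc[region][ch]
            a[0] += v           # accumulated DC component
            a[1] += v * v       # accumulated power (DC^2) component
            for j, t in enumerate(trig):
                a[2 + j] += v * t
    return acc
```

Because each pixel's contribution is independent, the accumulation can run as the pixels stream in, which is why the FPGA need not buffer a whole frame.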
The four arrays illustrated in Figure 10 are output from the FPGA 15 to the microprocessor 18 (Step S37). The processing then performed on the accumulated data for the frame depends upon the mode of operation of the sorting apparatus (Step S38).
In an initial phase the sorting apparatus needs to learn data for the identity of an empty chip pocket. If this mode is selected, in Step S39 the four arrays are stored in the EEPROM 16 and labelled with a pocket tag to identify that the data can be used to identify an empty pocket. The process then returns to Step S31 to await a new frame.
If the apparatus is set into a chip learning mode (Step S40) the process of Figure 12 is carried out.
As can be seen in Figure 12, the four arrays input to the microprocessor 18 in Step S37 are compared to the four arrays stored for a pocket in Step S39 to determine a score (Step S400). If the score is below an empty pocket match threshold (S401), it is determined that the pocket is empty, i.e. no chip is detected, and the process returns to Step S31 to await a new frame. If however the pocket match threshold is exceeded in Step S401, i.e. the four arrays do not match the four arrays for the pocket, the input four arrays for the frame are stored in the EEPROM 16 with a new chip tag in Step S402 to be used for future chip identification. The microprocessor 18 then outputs the new chip tag on the serial output port to cause the controller 8 to fire the solenoid 9a, ejecting the chip from the chip pocket of the conveyor 2 into the chip bin 10a. This confirms that the chip learning process has been successful and that the chip will in future be sorted into the chip bin 10a. The process then returns to Step S31 to await a new frame.
The chip learning process of Step S40 can be repeated for as many chips as there are chip bins. In this embodiment five chip bins 10a to 10e are provided. Thus the sorting apparatus can sort five different chips.
If the apparatus is set in Step S38 in the chip detection mode, the chip detect process of Step S41 is undertaken. The chip detect process is illustrated in more detail in Figure 11. The four arrays determined for the frame are compared to the four arrays stored for each chip and for the pocket to determine a score for each chip and for the pocket in Step S410. Thus the comparison process generates a number of scores: one for each chip and one for the pocket. It is then determined in Step S411 whether the best score is below a threshold. If not, the detection is rejected in Step S412 and the process returns to Step S31 to await a new frame. The purpose of this is to avoid inaccurate recognition and the incorrect firing of solenoids for empty pockets. If for whatever reason the chip does not adequately match the learned chips, it will not be sorted.
If the best score is below the threshold (S411), it is then determined in Step S413 whether the best score indicates a match with the pocket. If so, the process returns to Step S31, since the matching process indicates that the pocket is empty. If the best score is not the score for a pocket, the chip tag corresponding to the best score is output by the microprocessor 18 over the serial output port in Step S414 to cause the controller 8 to fire the corresponding solenoid, ejecting the chip from the conveyor 2 into the correct corresponding chip bin 10a to 10e. The process then returns to Step S31 to await a new frame.
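The decision logic of Steps S411 to S414 can be sketched as follows. This is a hypothetical illustration: the function name, the dictionary of scores keyed by tag, and the string return values are mine; in the embodiment the outcome is a serial-port tag or a return to the frame-wait step.

```python
def detect(score_by_tag, threshold):
    """Decision sketch of Steps S411-S414: pick the best (lowest) score
    among the learned chips and the empty pocket; reject if it exceeds the
    threshold, do nothing for an empty pocket, otherwise return the chip
    tag whose solenoid should fire."""
    tag, best = min(score_by_tag.items(), key=lambda kv: kv[1])
    if best > threshold:
        return "reject"   # no adequate match: do not sort (Step S412)
    if tag == "pocket":
        return "empty"    # empty pocket: await the next frame (Step S413)
    return tag            # fire the solenoid for this chip tag (Step S414)
```

Note that a lower score means a closer match, so the threshold acts as a ceiling on acceptable mismatch.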
Figure 13 illustrates the comparison process of Steps S410 and S400 in Figures 11 and 12 in more detail.
In Step S50, for each array the coefficients are squared and the corresponding sine and cosine coefficients are summed after being squared. The resulting coefficients are thus reduced from 7 to 4. These comprise the DC component, the sum of the squares of the sin θ and cos θ components, the sum of the squares of the sin 2θ and cos 2θ components, and the sum of the squares of the sin 3θ and cos 3θ components.
For each region of interest (i.e. for each array), the absolute or squared difference between each coefficient of the array and the reference arrays read from the EEPROM 16 is determined. The result is thus a difference array for each region of interest, and a set of such arrays for each learned chip and/or pocket. In Step S52 the differences for all the coefficients and for all regions of interest are summed in order to determine a score. It is this score which is used in the comparison Steps S411, S413 and S401 in Figures 11 and 12.
The summation used in Step S52 can be a weighted summation. In order to attach varying significance to each of the regions of interest, the difference arrays for each region of interest can be differently weighted. Additionally, the spatial frequency components can be differently weighted between different regions of interest and/or between different colours in the same region of interest. This enables the enhancement of the impact of certain spatial periodicity components appearing on the surface of the gaming chip. In this way the recognition technique can be tailored to suit the appearance of the gaming chips in use in the sorting apparatus.
In this embodiment, as illustrated in Figure 10, three harmonics are used for the spatial frequency components; the present invention is not, however, limited to the use of three and encompasses the use of any number. Also, the present invention is not limited to the use of the first three spatial frequency harmonics, and any spatial frequency harmonics can be selected as suitable for the identification of a gaming chip.
Further, although in this embodiment the colour data RGB is used, any colour code can be used for providing colour information in addition to spatial information, e.g. HSL, HSI, or CMY. Further, the present invention is not limited to the use of colour data and is also applicable to the use of monochrome or non-visible ultraviolet or infrared data.
In the embodiment of the present invention described hereinabove, the region of interest template illustrated in Figure 6 comprises separate regions of interest. However, the present invention encompasses the use of overlapping regions of interest, whereby a pixel can belong to more than one region of interest. In such an embodiment the data stored in the EEPROM for the pixels must be able to identify that a pixel belongs to more than one region of interest. Thus in the accumulation process of Step S35 in Figure 9, the 3 x 8 array generated will be accumulated into more than one of the region of interest arrays illustrated in Figure 10.
In the embodiment described above the spatial frequency components are determined for each colour component separately. However, the present invention is not limited to this, and the spatial frequency components can be formed from a combination of colour component data, i.e. in Figure 10, instead of using R, G and B separately as the three colour components, or in addition, combined colour components can be used, such as a combination of red and green, a combination of green and blue, and a combination of red and blue.
The present invention is also not limited to the determination of spatial periodicity components. The present invention encompasses the generation of combined pixel data for regions of interest, for example by generating a combination of colour components such as covariance parameters for R, G and B, i.e. RxG, GxB, and BxR. These parameters for each pixel are accumulated for each region of interest.
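The accumulation of these inter-channel products can be sketched as follows. A minimal illustration only: the function name and the tuple-stream input format are assumptions, and the raw products are accumulated without mean subtraction, as the text describes.

```python
def accumulate_covariance(pixels, n_regions):
    """Accumulate the inter-channel products R*G, G*B and B*R over the
    pixels of each region of interest, as an alternative combined-pixel
    statistic. `pixels` yields (region_id, (r, g, b)) tuples."""
    acc = [[0.0, 0.0, 0.0] for _ in range(n_regions)]
    for region, (r, g, b) in pixels:
        acc[region][0] += r * g
        acc[region][1] += g * b
        acc[region][2] += b * r
    return acc
```

These three sums capture how strongly the colour channels vary together within a region, which is information a per-channel sum alone does not carry.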
In the embodiment illustrated in Figure 10, in addition to the DC component, the square of the DC component, i.e. the power, is determined and stored as a component in the array in addition to the spatial frequency components. The power component can be used in conjunction with the DC component to determine a standard deviation. The DC component comprises an indicator of the mean.
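The derivation behind this is the standard identity var(x) = E[x²] − E[x]², which lets mean and standard deviation be recovered from the two accumulated sums alone. A sketch (the function name is mine):

```python
import math

def mean_and_std(dc_sum, power_sum, n_pixels):
    """Recover the mean and standard deviation of a region's pixel values
    from the accumulated DC component (sum of values) and power component
    (sum of squared values), using var = E[x^2] - E[x]^2."""
    mean = dc_sum / n_pixels
    variance = power_sum / n_pixels - mean * mean
    return mean, math.sqrt(max(variance, 0.0))  # clamp tiny negative rounding
```

This is why only two running sums per colour component need be stored rather than the individual pixel values.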
In the embodiment described above, the object to be identified comprises a gaming chip. The present invention is however not limited to a gaming chip, and is also not limited to a circular article. The gaming chip has circular patterns, and this symmetry enables the use of angular frequency coefficients as the spatial periodicity components. The present invention is however applicable to the formation of other one-dimensional spatial transform series. For example, if an article has a linear repeat pattern, i.e. translational symmetry, linear periodicity components can be used.
In the embodiment described hereinabove with reference to Figure 1, a problem occurs in the alignment of the gaming chips and chip pockets of the conveyor with the image field of the camera 4. One method of alignment is a manual process in which the image output from the microprocessor 18 is viewed on a computer and a signal is sent to the microprocessor 12 accompanying the shaft encoder 7 to shift the synchronising signal to the chip recognition head 3. However, using the spatial periodicity components it is possible to provide for the automatic alignment of an object.
Figure 14 illustrates the misalignment of a gaming chip 70 with a region of interest 71. The gaming chip 70 has an outer band 72 corresponding to the region of interest 71. The gaming chip is translationally displaced with regard to the region of interest 71 in the image field. The image data from the camera 4, when processed to generate the spatial periodicity components, will enable the identification of this translation by the comparison of the in-phase and quadrature components. The sine and cosine components should be balanced when the chip is aligned with the region of interest 71.
In the embodiment illustrated in Figure 14, since the spatial periodicity components comprise angular periodicity components, translational misalignment can be detected, and this can be used as a feedback signal to the microprocessor 12 for the iterative alignment of the region of interest and the chip. This process works both for chip and for pocket detection. As can be seen in Figure 3, the pockets are separated by solid spaces. Thus the presence of the spaces in the region of interest due to misalignment will cause an imbalance of the in-phase and quadrature components in the same manner as for the case illustrated in Figure 14.
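The imbalance test can be sketched as follows. This is an illustrative model, not the embodiment's firmware: the function name and the reduction of the annular region to a ring of brightness samples are assumptions, and only the first harmonic is examined.

```python
import math

def misalignment_vector(ring_samples):
    """Estimate translational misalignment from the first angular harmonic
    of brightness sampled at equal angles around an annular region of
    interest. For a well-centred chip both components are near zero; a
    displaced chip brightens one side of the ring, and the resulting
    (cos, sin) imbalance points toward the displacement."""
    n = len(ring_samples)
    c = sum(v * math.cos(2 * math.pi * i / n) for i, v in enumerate(ring_samples))
    s = sum(v * math.sin(2 * math.pi * i / n) for i, v in enumerate(ring_samples))
    return c / n, s / n
```

Fed back to the microprocessor controlling the trigger timing, such a vector can drive the iterative alignment described above.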
In an alternative embodiment of the present invention, where the object has translational symmetry, i.e. a linear pattern, the spatial periodicity components comprise linear periodicity components, and thus rotational misalignment can be detected in a similar manner to that described with reference to Figure 14.
In the embodiments described hereinabove, the spatial periodicity components stored need not comprise the sine and cosine components. Components which do not retain the phase information can be stored instead. For example, power components can be stored, facilitating direct comparison for identification purposes.
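As a hedged illustration of such phase-independent storage (the sampling scheme and names are assumptions, not taken from the patent): the power of an in-phase/quadrature pair is unchanged when the pattern rotates, whereas the individual sine and cosine components are not, so stored power components can be compared directly without phase alignment.

```python
import math

def angular_component(values, k):
    """In-phase/quadrature k-th angular periodicity components of samples
    taken at equal angular spacing around a region of interest."""
    n = len(values)
    i_sum = sum(v * math.cos(k * 2 * math.pi * j / n) for j, v in enumerate(values))
    q_sum = sum(v * math.sin(k * 2 * math.pi * j / n) for j, v in enumerate(values))
    return i_sum, q_sum

def power(values, k):
    """Phase-independent power component: invariant to rotation of the pattern."""
    i_sum, q_sum = angular_component(values, k)
    return i_sum * i_sum + q_sum * q_sum

n, k = 360, 6  # e.g. six repeating edge spots around a gaming chip
pattern = [math.cos(k * 2 * math.pi * j / n) for j in range(n)]
rotated = pattern[25:] + pattern[:25]  # same chip, rotated by 25 degrees

i0, q0 = angular_component(pattern, k)
i1, q1 = angular_component(rotated, k)  # raw components differ after rotation
p0, p1 = power(pattern, k), power(rotated, k)  # powers agree
```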
Although the present invention has been described hereinabove with reference to specific embodiments, it will be apparent to the person skilled in the art that modifications lie within the spirit and scope of the present invention.
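The identification scheme recited in the claims that follow can be sketched as below (a simplified illustration with hypothetical values and weights, not the patent's implementation): the combined pixel data determined for each region is differenced against stored data for each object of known identity, the per-region differences are combined as a weighted sum, and the identity with the smallest combined difference supplies the identity indicator.

```python
def combined_difference(measured, reference, weights):
    """Weighted combination of per-region differences between measured
    combined pixel data and stored data for one object of known identity."""
    return sum(w * abs(m - r) for m, r, w in zip(measured, reference, weights))

def identify(measured, references, weights):
    """Return the known identity with the smallest combined difference."""
    return min(references, key=lambda name: combined_difference(
        measured, references[name], weights))

# Illustrative combined pixel data for three regions of interest
weights = [1.0, 2.0, 0.5]  # e.g. proportional to the pixel count per region
references = {
    "red_chip":  [10.0, 40.0, 5.0],
    "blue_chip": [30.0, 12.0, 9.0],
}
measured = [29.0, 13.5, 8.0]
best = identify(measured, references, weights)
```

Weighting the regions, for example by their pixel counts, prevents a small but distinctive region from being swamped by larger ones.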
Claims (104)
- CLAIMS: 1. A method of identifying an object comprising: determining combined pixel data for pixels in each of a plurality of predetermined regions in an image of said object; comparing the determined combined pixel data for each region with predetermined combined pixel data for a corresponding region for at least one object of known identity; and determining an identity indicator for the object in dependence upon the result of the comparison.
- 2. A method according to claim 1, wherein said comparison of said combined pixel data for each region comprises determining the difference between the determined combined pixel data for the region and predetermined combined pixel data for the corresponding region for said at least one object of known identity, and combining the determined differences for the or each known object to determine a combined difference for the or each respective known object; and said identity indicator is determined based on the combined difference for the or each object of known identity.
- 3. A method according to claim 2, wherein said combined difference is determined as a weighted combination, with different weights being applied to the determined differences for said regions.
- 4. A method according to claim 3, wherein said weights are preselected.
- 5. A method according to claim 3, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 6. A method according to claim 1, wherein said combined pixel data for at least one predetermined region comprises spatial periodicity component data for pixels in said at least one predetermined region.
- 7. A method according to claim 6, wherein said spatial periodicity component data comprises spatial periodicity components in only one direction.
- 8. A method according to claim 7, wherein said spatial periodicity components comprise angular periodicity components.
- 9. A method according to claim 7, wherein said spatial periodicity components comprise linear periodicity components.
- 10. A method according to any one of claims 6 to 9, wherein said spatial periodicity component data comprises preselected spatial periodicity components.
- 11. A method according to any one of claims 6 to 10, wherein the determination of the spatial periodicity component data for each region includes the determination of phase independent spatial periodicity component data, and the comparison of the combined pixel data for the regions comprises comparing the phase independent spatial periodicity component data for each region with predetermined phase independent spatial periodicity component data for a corresponding region for said at least one object of known identity.
- 12. A method according to claim 11 wherein the phase independent spatial periodicity component data comprises magnitude data.
- 13. A method according to any one of claims 6 to 12, wherein the comparison of the combined pixel data comprises determining the difference between the spatial periodicity component data for the region and predetermined spatial periodicity component data for the corresponding region for said at least one object of known identity, and combining the determined differences for the or each object of known identity to determine a combined difference for the or each respective known object; and said identity indicator is determined based on the combined difference for the or each object of known identity.
- 14. A method according to claim 13, wherein said combined difference is determined as a weighted combination.
- 15. A method according to claim 14, wherein different weights are applied to the determined differences for said regions.
- 16. A method according to claim 14 or claim 15, wherein different weights are applied to the determined differences for components of said spatial periodicity component data in at least one said region.
- 17. A method according to claim 15, wherein said weights are preselected.
- 18. A method according to claim 15, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 19. A method according to any one of claims 6 to 18, wherein said spatial periodicity component data includes a zero spatial frequency component.
- 20. A method according to claim 19, wherein said spatial periodicity component data includes a power component derived from said zero spatial frequency component.
- 21. A method according to any preceding claim, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said combined pixel data is determined by combining separate colour pixel components for pixels in each said region.
- 22. A method according to claim 21, wherein said combined pixel data is determined by accumulating separate colour pixel components for pixels in each said region.
- 23. A method according to any one of claims 1 to 20, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said combined pixel data is determined by combining colour pixel components for each pixel and combining the combined colour pixel components for pixels in each said region.
- 24. A method according to any preceding claim, wherein at least two of said predetermined regions overlap, and some of said combined pixel data belongs to more than one region.
- 25. A method according to any preceding claim, wherein said predetermined regions of said image do not include at least one predetermined undesirable region of said image.
- 26. A method of sorting objects comprising the method according to any preceding claim for identifying an object, and sorting the object in accordance with the determined identity indicator.
- 27. A method of determining predetermined combined pixel data for a plurality of regions in an image of an object of known identity for use in the method of any preceding claim, the method comprising: for each of a plurality of like objects predetermined to be of the same known identity, determining combined pixel data for pixels in each of a plurality of predetermined regions in an image of a known object; and averaging the determined combined pixel data for each region of the plurality of objects to determine the predetermined combined pixel data for each region.
- 28. A method according to claim 27, including storing the predetermined combined pixel data.
- 29. A method of aligning an object in an image field comprising: determining spatial periodicity component data for pixels in at least one region of the image field, said spatial periodicity component data comprising in-phase and quadrature components; comparing the in-phase and quadrature components for the or each region; and controlling the alignment of the object in the image field in dependence upon the result of the comparison.
- 30. A method according to claim 29, wherein the spatial periodicity component data for the or each region is accumulated for the pixels in the or each region.
- 31. A method according to claim 29 or claim 30, wherein the spatial periodicity component data comprises angular periodicity component data, and said alignment comprises translational alignment.
- 32. A method according to claim 29 or claim 30, wherein the spatial periodicity component data comprises translational periodicity component data, and said alignment comprises rotational alignment.
- 33. A method of identifying an object comprising: determining spatial periodicity component data for pixels in at least one predetermined region in an image of said object, said spatial periodicity component data comprising spatial periodicity components in only one direction; comparing the determined spatial periodicity component data with predetermined spatial periodicity component data for at least one object of known identity; and determining an identity indicator for the object in dependence upon the result of the comparison.
- 34. A method according to claim 33, wherein said spatial periodicity components comprise angular periodicity components.
- 35. A method according to claim 33, wherein said spatial periodicity components comprise linear periodicity components.
- 36. A method according to any one of claims 33 to 35, wherein said spatial periodicity component data comprises preselected spatial periodicity components.
- 37. A method according to any one of claims 33 to 36, wherein the determination of the spatial periodicity component data for the or each region includes the determination of phase independent spatial periodicity component data, and the comparison of the combined pixel data for the or each region comprises comparing the phase independent spatial periodicity component data for the or each region with predetermined phase independent spatial periodicity component data for a corresponding region for said at least one object of known identity.
- 38. A method according to claim 37 wherein the phase independent spatial periodicity component data comprises magnitude data.
- 39. A method according to any one of claims 33 to 38, wherein the comparison of the spatial periodicity component data comprises determining the difference between the spatial periodicity component data for the region and predetermined spatial periodicity component data for the corresponding region for said at least one object of known identity, and combining the determined differences for the or each known object to determine a combined difference for the or each respective object of known identity; and said identity indicator is determined based on the combined difference for the or each object of known identity.
- 40. A method according to claim 39, wherein said combined difference is determined as a weighted combination.
- 41. A method according to claim 40, wherein different weights are applied to the determined differences for the or each region.
- 42. A method according to claim 39 or claim 40, wherein different weights are applied to the determined differences for components of said spatial periodicity component data in the or each region.
- 43. A method according to claim 41 or claim 42, wherein said weights are preselected.
- 44. A method according to claim 41, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 45. A method according to any one of claims 33 to 44, wherein said spatial periodicity component data includes a zero spatial frequency component.
- 46. A method according to claim 45, wherein said spatial periodicity component data includes a power component derived from said zero spatial frequency component.
- 47. A method according to any one of claims 33 to 46, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said spatial periodicity component data is determined for each of the colour pixel components for pixels in the or each region.
- 48. A method according to claim 47, wherein said spatial periodicity component data is determined by accumulating separate colour pixel components for pixels in the or each region.
- 49. A method according to any one of claims 33 to 46, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said spatial periodicity component pixel data is determined by combining colour pixel components for each pixel and determining spatial periodicity component data from the combined colour pixel components for pixels in the or each region.
- 50. A method of sorting objects comprising the method according to any one of claims 33 to 49 for identifying an object, and sorting the object in accordance with the determined identity indicator.
- 51. A carrier medium carrying computer readable code for controlling a computer to carry out the method of any preceding claim.
- 52. A computer apparatus for identifying an object comprising: a program memory containing program code; and a processor for reading and executing the program code stored in said program memory; wherein said program code is adapted to control the processor to carry out the method of any preceding claim.
- 53. Apparatus for identifying an object comprising: pixel data determining means for determining combined pixel data for pixels in each of a plurality of predetermined regions in an image of said object; comparing means for comparing the determined combined pixel data for each region with predetermined combined pixel data for a corresponding region for at least one object of known identity; and identity determining means for determining an identity indicator for the object in dependence upon the result of the comparison.
- 54. Apparatus according to claim 53, wherein said comparing means is adapted to determine the difference between the determined combined pixel data for the region and predetermined combined pixel data for the corresponding region for said at least one known object, and combine the determined differences for the or each known object to determine a combined difference for the or each respective known object; and said identity determining means is adapted to determine the identity indicator based on the combined difference for the or each object of known identity.
- 55. Apparatus according to claim 54, wherein said comparing means is adapted to determine said combined difference as a weighted combination, with different weights being applied to the determined differences for said regions.
- 56. Apparatus according to claim 55, wherein said weights are preselected.
- 57. Apparatus according to claim 55, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 58. Apparatus according to claim 53, wherein said pixel data determining means is adapted to determine said combined pixel data for at least one predetermined region as spatial periodicity component data for pixels in said at least one predetermined region.
- 59. Apparatus according to claim 58, wherein said pixel data determining means is adapted to determine said spatial periodicity component data as spatial periodicity components in only one direction.
- 60. Apparatus according to claim 59, wherein pixel data determining means is adapted to determine said spatial periodicity components as angular periodicity components.
- 61. Apparatus according to claim 59, wherein pixel data determining means is adapted to determine said spatial periodicity components as linear periodicity components.
- 62. Apparatus according to any one of claims 58 to 61, wherein said pixel data determining means is adapted to determine said spatial periodicity component data as preselected spatial periodicity components.
- 63. Apparatus according to any one of claims 58 to 62, wherein said pixel data determining means is adapted to determine said spatial periodicity component data for each region to include phase independent spatial periodicity component data, and said comparing means is adapted to compare the phase independent spatial periodicity component data for each region with predetermined phase independent spatial periodicity component data for a corresponding region for said at least one object of known identity.
- 64. Apparatus according to claim 63 wherein the phase independent spatial periodicity component data comprises magnitude data.
- 65. Apparatus according to any one of claims 58 to 64, wherein said comparing means is adapted to determine the difference between the spatial periodicity component data for the region and predetermined spatial periodicity component data for the corresponding region for said at least one object of known identity, and to combine the determined differences for the or each object of known identity to determine a combined difference for the or each respective known object; and said identity determining means is adapted to determine said identity indicator based on the combined difference for the or each object of known identity.
- 66. Apparatus according to claim 65, wherein said comparing means is adapted to determine said combined difference as a weighted combination.
- 67. Apparatus according to claim 66, wherein said comparing means is adapted to apply different weights to the determined differences for said regions.
- 68. Apparatus according to claim 66 or claim 67, wherein said comparing means is adapted to apply different weights to the determined differences for components of said spatial periodicity component data in at least one said region.
- 69. Apparatus according to claim 67, wherein said weights are preselected.
- 70. Apparatus according to claim 67, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 71. Apparatus according to any one of claims 58 to 70, wherein said pixel data determining means is adapted to determine said spatial periodicity component data to include a zero spatial frequency component.
- 72. Apparatus according to claim 71, wherein said pixel data determining means is adapted to determine said spatial periodicity component data to include a power component derived from said zero spatial frequency component.
- 73. Apparatus according to any one of claims 53 to 72, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said pixel data determining means is adapted to determine said combined pixel data by combining separate colour pixel components for pixels in each said region.
- 74. Apparatus according to claim 73, wherein said pixel data determining means is adapted to determine said combined pixel data by accumulating separate colour pixel components for pixels in each said region.
- 75. Apparatus according to any one of claims 53 to 72, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said pixel data determining means is adapted to determine said combined pixel data by combining colour pixel components for each pixel and combining the combined colour pixel components for pixels in each said region.
- 76. Apparatus according to any one of claims 53 to 75, wherein at least two of said predetermined regions overlap, and some of said combined pixel data belongs to more than one region.
- 77. Apparatus according to any one of claims 53 to 76, wherein said predetermined regions of said image do not include at least one predetermined undesirable region of said image.
- 78. Apparatus for sorting objects comprising the apparatus for identifying an object according to any one of claims 53 to 77, and sorting means for sorting the object in accordance with the determined identity indicator.
- 79. Apparatus according to any one of claims 53 to 78, including predetermined pixel data determining means for determining predetermined combined pixel data for a plurality of regions in an image of an object of known identity; said predetermined pixel data determining means comprising means for determining combined pixel data for pixels in each of a plurality of predetermined regions in an image of a known object for each of a plurality of like objects predetermined to be of the same known identity; and means for averaging the determined combined pixel data for each region of the plurality of objects to determine the predetermined combined pixel data for each region.
- 80. Apparatus according to claim 79, including storing means for storing the predetermined combined pixel data.
- 81. Apparatus for aligning an object in an image field comprising: determining means for determining spatial periodicity component data for pixels in at least one region of the image field, said spatial periodicity component data comprising in-phase and quadrature components; comparing means for comparing the in-phase and quadrature components for the or each region; and controlling means for controlling the alignment of the object in the image field in dependence upon the result of the comparison.
- 82. Apparatus according to claim 81, wherein said determining means is adapted to accumulate the spatial periodicity component data for the or each region for the pixels in the or each region.
- 83. Apparatus according to claim 81 or claim 82, wherein the spatial periodicity component data comprises angular periodicity component data, and said alignment comprises translational alignment.
- 84. Apparatus according to claim 81 or claim 82, wherein the spatial periodicity component data comprises translational periodicity component data, and said alignment comprises rotational alignment.
- 85. Apparatus for identifying an object comprising: pixel data determining means for determining spatial periodicity component data for pixels in at least one predetermined region in an image of said object, said spatial periodicity component data comprising spatial periodicity components in only one direction; comparing means for comparing the determined spatial periodicity component data with predetermined spatial periodicity component data for at least one object of known identity; and identity determining means for determining an identity indicator for the object in dependence upon the result of the comparison.
- 86. Apparatus according to claim 85, wherein said spatial periodicity components comprise angular periodicity components.
- 87. Apparatus according to claim 85, wherein said spatial periodicity components comprise linear periodicity components.
- 88. Apparatus according to any one of claims 85 to 87, wherein said spatial periodicity component data comprises preselected spatial periodicity components.
- 89. Apparatus according to any one of claims 85 to 88, wherein said pixel data determining means is adapted to include in the determination of the spatial periodicity component data for the or each region the determination of phase independent spatial periodicity component data; and said comparing means is adapted to compare the phase independent spatial periodicity component data for the or each region with predetermined phase independent spatial periodicity component data for a corresponding region for said at least one object of known identity.
- 90. Apparatus according to claim 89 wherein the phase independent spatial periodicity component data comprises magnitude data.
- 91. Apparatus according to any one of claims 85 to 90, wherein said comparing means is adapted to determine the difference between the spatial periodicity component data for the region and predetermined spatial periodicity component data for the corresponding region for said at least one object of known identity, and to combine the determined differences for the or each known object to determine a combined difference for the or each respective object of known identity; and said identity determining means is adapted to determine said identity indicator based on the combined difference for the or each object of known identity.
- 92. Apparatus according to claim 91, wherein said comparing means is adapted to determine said combined difference as a weighted combination.
- 93. Apparatus according to claim 92, wherein said comparing means is adapted to apply different weights to the determined differences for the or each region.
- 94. Apparatus according to claim 91 or claim 92, wherein said comparing means is adapted to apply different weights to the determined differences for components of said spatial periodicity component data in the or each region.
- 95. Apparatus according to claim 93 or claim 94, wherein said weights are preselected.
- 96. Apparatus according to claim 93, wherein said weights are dependent upon a number of pixels in each predetermined region.
- 97. Apparatus according to any one of claims 85 to 96, wherein said pixel data determining means is adapted to determine said spatial periodicity component data to include a zero spatial frequency component.
- 98. Apparatus according to claim 97, wherein said pixel data determining means is adapted to determine said spatial periodicity component data to include a power component derived from said zero spatial frequency component.
- 99. Apparatus according to any one of claims 85 to 98, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said pixel data determining means is adapted to determine said spatial periodicity component data for each of the colour pixel components for pixels in the or each region.
- 100. Apparatus according to claim 99, wherein said pixel data determining means is adapted to determine said spatial periodicity component data by accumulating separate colour pixel components for pixels in the or each region.
- 101. Apparatus according to any one of claims 85 to 100, wherein said image comprises colour pixel data comprising a plurality of colour pixel components, and said pixel data determining means is adapted to determine said spatial periodicity component pixel data by combining colour pixel components for each pixel and determining spatial periodicity component data from the combined colour pixel components for pixels in the or each region.
- 102. Apparatus for sorting objects comprising the apparatus for identifying an object according to any one of claims 85 to 101, and sorting means for sorting the object in accordance with the determined identity indicator.
- 103. Apparatus for identifying an object comprising: logic which determines combined pixel data for pixels in each of a plurality of predetermined regions in an image of said object; logic which compares the determined combined pixel data for each region with predetermined combined pixel data for a corresponding region for at least one object of known identity; and logic which determines an identity indicator for the object in dependence upon the result of the comparison.
- 104. Apparatus for identifying an object comprising: logic which determines spatial periodicity component data for pixels in at least one predetermined region in an image of said object, said spatial periodicity component data comprising spatial periodicity components in only one direction; logic which compares the determined spatial periodicity component data with predetermined spatial periodicity component data for at least one object of known identity; and logic which determines an identity indicator for the object in dependence upon the result of the comparison.
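The template-building method of claim 27 can be sketched as follows (the values are illustrative): combined pixel data is determined for each of several like objects predetermined to be of the same known identity, and the data for each region is averaged across the objects to give the predetermined combined pixel data stored for that identity.

```python
def build_template(samples):
    """Average combined pixel data per region over several like objects
    of the same known identity, giving one stored reference value per region."""
    n = len(samples)
    return [sum(region_values) / n for region_values in zip(*samples)]

# Combined pixel data (one value per region of interest) for three like chips
samples = [
    [10.0, 20.0, 30.0],
    [12.0, 18.0, 33.0],
    [11.0, 22.0, 27.0],
]
template = build_template(samples)  # one averaged value per region
```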
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0412027A GB2414545A (en) | 2004-05-28 | 2004-05-28 | Method for identifying and sorting objects |
Publications (2)
Publication Number | Publication Date |
---|---|
GB0412027D0 GB0412027D0 (en) | 2004-06-30 |
GB2414545A true GB2414545A (en) | 2005-11-30 |
Family
ID=32671280
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0412027A Withdrawn GB2414545A (en) | 2004-05-28 | 2004-05-28 | Method for identifying and sorting objects |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2414545A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4692943A (en) * | 1983-12-30 | 1987-09-08 | Dr. Ludwig Pietzsch Gmbh | Method of and system for opto-electronic inspection of a two-dimensional pattern on an object |
US5073955A (en) * | 1989-06-16 | 1991-12-17 | Siemens Aktiengesellschaft | Method for recognizing previously localized characters present in digital gray tone images, particularly for recognizing characters struck into metal surfaces |
US5261008A (en) * | 1990-08-07 | 1993-11-09 | Yozan, Inc. | Fingerprint verification method |
US20030091238A1 (en) * | 2001-10-23 | 2003-05-15 | Laurent Plaza | Detection of a pattern in a digital image |
US6688449B1 (en) * | 1999-12-10 | 2004-02-10 | Unirec Co., Ltd. | Image pickup device and pattern identification apparatus |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8371945B2 (en) | 2007-12-18 | 2013-02-12 | Aristocrat Technologies Australia Pty Limited | Gaming machine and a network of gaming machines |
US8702524B2 (en) | 2007-12-18 | 2014-04-22 | Aristocrat Technologies Australia Pty Limited | Gaming machine and a network of gaming machines |
US8845427B2 (en) | 2008-07-14 | 2014-09-30 | Aristocrat Technologies Australia Pty Limited | Gaming system and method of gaming |
WO2012089353A1 (en) * | 2010-12-28 | 2012-07-05 | "Novotech" Elektronik Gmbh | Apparatus and method for coin recognition |
GB2570882A (en) * | 2018-02-06 | 2019-08-14 | Tcs John Huxley Europe Ltd | Token sorting apparatus |
GB2570882B (en) * | 2018-02-06 | 2022-03-02 | Tcs John Huxley Europe Ltd | Token sorting apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |