CA2217369A1 - Two-camera system for locating and storing indicia on conveyed items - Google Patents
Two-camera system for locating and storing indicia on conveyed items
- Publication number
- CA2217369A1
- Authority
- CA
- Canada
- Prior art keywords
- camera
- image
- mark
- parcel
- indicia
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/14—Image acquisition
- G06V30/1429—Identifying or ignoring parts by sensing at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/40—Document-oriented image-based pattern recognition
- G06V30/42—Document-oriented image-based pattern recognition based on the type of document
- G06V30/424—Postal images, e.g. labels or addresses on parcels or postal envelopes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Artificial Intelligence (AREA)
- Character Input (AREA)
- Sorting Of Articles (AREA)
Abstract
A two-camera over-the-belt optical character recognition (OCR) system (10) for reading the destination addresses on parcels (14) carried on a conveyor belt (12). The parcels (14) bear an orientation defining fluorescent ink fiduciary mark (34) that is typically stamped on the parcel in the same area as the destination address (33). The fiduciary mark (34) is located approximately in the center of the destination address block (32) and oriented in the same direction as the underlying destination address. The first camera (19), a low resolution CCD camera, captures an image of the fiduciary mark (34). A host computer (30) stores an image of the fiduciary mark, determines the position and orientation of the fiduciary mark, and defines a region of interest (40) about the fiduciary mark. The second camera (22), a high resolution CCD camera, captures a grey-scale image of the conveyor (12) and the parcels (14) carried on the conveyor. The host computer (30) extracts and stores the portion of the high resolution image that is within the region of interest (40) including the text defining the destination address of the parcel. The host computer (30) then rotates and displays the text image and/or transmits the text image to a text reader.
Description
TWO CAMERA SYSTEM FOR LOCATING AND STORING INDICIA ON CONVEYED ITEMS
Reference to Related Application
This application is a continuation-in-part of the commonly owned pending U.S. Patent Application No. 08/419,176, "Method for Locating the Position and Orientation of a Fiduciary Mark," filed April 10, 1995, inventors James S. Morton and James V. Recktenwalt.
Technical Field
The present invention relates to image processing, and more particularly relates to over-the-belt optical character recognition systems. Specifically, the present invention relates to a two camera system for reading the destination address on packages moving along a conveyor.
Background of the Invention
For years, machines have been used to scan parcels as they move along a conveyor. Over-the-belt optical character recognition (OCR) systems have recently been developed that can capture an image of the surface of a parcel as it moves along a conveyor, and then create and process a representation of the image. The fundamental physical components of an OCR system are a sensor, an analog-to-digital (A/D) converter, and a computer comprising a central processing unit (CPU) and a memory. The individual physical components of an OCR system are all well known in the art, and many alternative embodiments of each of the individual physical components are commercially available, with differing cost and performance characteristics. Much effort goes into finding the most efficient combinations of components for particular applications, and in the development of computer software programs that process the images created by these familiar physical components.
Charge-coupled device (CCD) sensor arrays are often used in OCR systems. A CCD camera consists of an array of electronic "pixels," each of which stores an accumulated charge according to the amount of light that strikes the pixel. A CCD camera is used to quickly capture an image of the surface of a parcel as it moves along a conveyor. The image is converted into digital format which may be stored as a bit map in a computer memory. The CCD array is then cleared as the charge within each pixel is read, and the array is ready to capture the image of another parcel or section of a parcel. In this manner, a single CCD camera is used to scan a great many parcels.
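A rough software analogue of this capture-read-clear cycle can help fix the idea. The following sketch is purely illustrative, uses NumPy, and all class and function names are hypothetical rather than part of any camera vendor's API.

```python
import numpy as np

class FakeCCDArray:
    """Stand-in for a line of CCD pixels: charge accumulates during an
    exposure and the array is cleared when it is read."""

    def __init__(self, num_pixels=256, seed=0):
        self._rng = np.random.default_rng(seed)
        self._charge = np.zeros(num_pixels, dtype=np.uint8)

    def expose(self):
        # Pretend light strikes the pixels and charge accumulates.
        self._charge = self._rng.integers(
            0, 256, size=self._charge.shape, dtype=np.uint8)

    def read_and_clear(self):
        # Reading the charges empties the array, readying it for the
        # next parcel or section of a parcel.
        values = self._charge
        self._charge = np.zeros_like(values)
        return values

# Capture several exposures and store them as rows of a bit-mapped image.
camera = FakeCCDArray()
rows = []
for _ in range(8):
    camera.expose()
    rows.append(camera.read_and_clear())
image = np.vstack(rows)       # the image held in computer memory
print(image.shape)            # (8, 256)
```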
Computers that may be used to process the images captured by CCD cameras vary in computation speed and other parameters. Generally, a faster computer is more expensive than a slower computer, a computer with a large memory capacity is more expensive than a computer with a smaller memory capacity, and a special purpose computer is more expensive than a general purpose computer. There is therefore a financial motivation to use low speed, low memory, general purpose computers whenever such are suitable for a particular purpose.
Parcel delivery companies, such as United Parcel Service (UPS), could make extensive use of OCR reader systems. UPS ships millions of parcels every day. If OCR systems were used by parcel delivery companies such as UPS, they would generate an enormous amount of computer data. There is therefore a need to limit the amount of image data that must be saved for processing by a text reader. There is also a need for computer systems that can quickly and accurately process the images created by CCD cameras. For example, computer systems have been developed that can attempt to read the destination address written on certain parcels, so that the parcels may be correctly routed to their destinations when the address is successfully read. Reading text is a sophisticated task, and a system capable of doing so is commensurately sophisticated and may comprise expensive equipment such as a high resolution CCD camera and a high speed, large memory, general purpose computer or a special purpose computer.
To the extent that less expensive equipment can perform less sophisticated tasks in an OCR system, more expensive equipment can be dedicated to reading text. Determining the location and orientation of the destination address on a package moving along a conveyor is an example of a function required of an OCR system that has been performed with sophisticated equipment that is also used to read text. There is therefore a need for a system using low cost equipment, such as a low resolution CCD camera and a general purpose computer, to determine the location and orientation of the destination address on a package moving along a conveyor.
Miette, U.S. Patent No. 5,103,489, describes a label, and a method and device for locating addresses on mailed articles to be sorted. The system uses a preprinted mailing label including an address locating mark located in a known relation to the area on the label where the destination address is likely to be written. The address locating mark is a black ink circle including a rotation specific image inside the circle. A space for the destination address is included on the label to the side and below the mark. The surface of the package is imaged with a single camera system, and the location and orientation of the mark is ascertained. The destination address is then read in the expected area, and at the expected orientation, with reference to the location and orientation of the detected mark.
The system described by Miette reduces the amount of data that must be processed by the text reader. The address locating mark is located on a mailed article in a known relationship to the position and orientation of the destination address. The text reader may then process only the data within a relatively small area and at an orientation defined with respect to the mark. The system described by Miette is used to scan closely arranged items such as magazines exiting an unstacking device. In such a system, the items to be scanned are of uniform size and shape, and little time elapses between consecutive items. Moreover, the entire surface of every item must be searched for the mailing label. Therefore, Miette is not concerned with limiting the amount of high resolution CCD camera data that must be stored for subsequent processing.
The single camera system described by Miette would have a significant disadvantage if applied to parcels moving along a conveyor because the address locating mark could not be used to limit the amount of CCD camera data that must be stored in a computer memory for processing by the text reader. A high resolution CCD camera scanning a moving conveyor, such as one suitable for use with a text reader, generates a tremendous amount of data. Most of the data is a useless image of the conveyor and the non-text bearing areas of the parcels moving along the conveyor; only a small percentage of the data includes the important destination address bearing portions of the parcels. The single camera system described by Miette would require storing all of the data generated by the high resolution CCD camera even though only a small portion of the data would be processed by a text reader. The system described by Miette also relies on preprinted address labels. It would be advantageous if an OCR system for a parcel handling system was not limited to reading addresses written on preprinted labels.
Kizu et al., U.S. Patent No. 4,516,265, describes a two camera system that reads the postal (zip) codes on envelopes traveling on an envelope transport system. The two camera system includes a low resolution pre-scanner that coarsely scans the surface of the envelope. The position of the destination address block is determined from the coarse scan, and the coordinates of the destination address block with respect to the leading edge of the envelope are then passed to a second camera system. The second camera system scans the destination address block by first detecting the leading edge of the envelope. The second camera then starts scanning when the destination address block reaches the second camera and stops scanning when the destination address block moves past the second camera. A postal code reader then processes the high resolution scan and detects and reads the postal code.
The two camera system described by Kizu et al. does not use an address locating mark to determine the position and orientation of the destination address block on a letter. Rather, Kizu et al. relies on the positional relation and sizes of the indicia bearing areas (i.e., stamp and post mark, destination address block, and return address block) on the envelope. The position and size of the destination address block is then defined with respect to the leading edge of the envelope. The timing of the operation of the high resolution camera is then controlled to limit the amount of the surface of the envelope that is scanned with the high resolution camera. Thus, Kizu et al. relies on the envelope, and the destination address on the envelope, being properly oriented on the envelope transport system. Kizu et al. also relies on the envelope having a well defined leading edge.
The two camera system described by Kizu et al. would therefore not be suitable for an over-the-belt OCR system for arbitrarily positioned parcels bearing arbitrarily positioned destination address labels moving along a conveyor. The system described by Kizu et al. cannot ascertain the position and orientation of the destination address blocks on parcels with arbitrarily positioned destination address labels, such as those conveyed in a parcel handling system. Similarly, the system described by Kizu et al. does not have a method for reading the destination address on a parcel that does not have a well defined leading edge, such as a parcel with soft or irregular edges.
IBM technical disclosure bulletin Vol. 30, No. 11 entitled "System For Determining Form Alignment" describes a document handwriting recognition system that uses fiducial marks on the back side of a form. The fiducial marks may be associated with pre-assigned handwriting areas on the front of the form. A user places the form on a tablet with the fiducial marks facing toward the tablet and writes in the pre-assigned areas. A scanner in the tablet detects the position and orientation of the fiducial marks and a pressure sensor records the handwriting. The fiducial marks described by the IBM disclosure document are not used in connection with parcels moving on a conveyor.
Therefore, after Miette and Kizu et al. there remains a need for an OCR system which minimizes the amount of high resolution CCD camera data that must be stored for processing by a text reader. There also remains a need for a two camera OCR system that can ascertain the orientation of arbitrarily positioned destination blocks on articles moving along a conveyor, such as those conveyed in a parcel handling system. There also remains a need for a two camera OCR system that can read information on a parcel that does not have a well defined leading edge.
Summary of the Invention
The present invention meets the above objectives by providing a two camera system for reading indicia on conveyed items. The preferred embodiment is an over-the-belt OCR system for reading the destination addresses on parcels carried on a conveyor belt. The parcels bear an orientation defining fluorescent ink fiduciary mark that is typically stamped on the parcel so that it is superimposed relative to the destination address. The fiduciary mark is located approximately in the center of the destination address block and oriented in the same direction as the underlying destination address. The first camera, a low resolution CCD camera, captures an image of the fiduciary mark. A host computer stores an image of the fiduciary mark, determines the position and orientation of the fiduciary mark, and defines a region of interest about the fiduciary mark. The second camera, a high resolution CCD camera, captures a grey-scale image of the conveyor and the parcels carried on the conveyor. The host computer extracts and stores the portion of the high resolution image that is within the region of interest, including the text defining the destination address of the parcel. The host computer then rotates and displays the rotated text image and/or transmits the text image to a text reader.
Although the preferred embodiment of the present invention is described in the context of an over-the-belt OCR system using a fluorescent ink fiduciary mark, those skilled in the art will appreciate that the principles of the present invention may be applied to many other contexts in which a two camera system can read indicia on a substrate.
Generally described, the present invention is a method and system for reading indicia on a substrate by determining the position and orientation of a predefined mark that is superimposed relative to the indicia. A first image of the substrate is captured with a first camera system, and the first image is stored in a computer memory. The position of the mark within the first image is determined, and a region of interest is defined in association with the mark. A second image of the substrate is then captured with a second camera system, and the portion of the second image that is within the region of interest is stored in a computer memory. In this manner, the host computer only stores data from the second camera for the area within the region of interest.
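The following sketch restates this generally described method in code. It is a simplification under assumed names (the `locate_mark` callable and the fixed square region are illustrative, not part of the disclosure), but it shows how only the region of interest of the second image is retained.

```python
import numpy as np

def read_indicia(first_image, second_image, locate_mark, roi_half=512):
    """Find the predefined mark in the first (low resolution) image, define
    a region of interest about it, and store only the portion of the second
    (high resolution) image that falls within that region.

    `locate_mark` is a hypothetical routine that returns the mark position
    (row, col) already scaled to the second image's pixel coordinates.
    """
    row, col = locate_mark(first_image)
    r0, c0 = max(0, row - roi_half), max(0, col - roi_half)
    roi = second_image[r0:row + roi_half, c0:col + roi_half]
    return np.array(roi)   # only this portion is kept in memory

# Usage with a dummy locator that simply reports the image centre.
low_res = np.zeros((256, 256), dtype=np.uint8)
high_res = np.zeros((4096, 4096), dtype=np.uint8)
stored = read_indicia(low_res, high_res, lambda img: (2048, 2048))
print(stored.shape)   # (1024, 1024)
```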
According to one feature of the present invention, the presence of indicia of a first type is detected in the image captured by the first camera system. In response to detecting indicia of the first type, the host computer begins storing the first image. Upon detecting the absence of indicia of the first type for a predetermined period, the host computer stops storing the first image. In this manner, the host computer only stores data from the first camera that corresponds to indicia of the first type.
According to another feature of the present invention, the image captured by the second camera comprises indicia of a second type. The first image is composed of indicia other than indicia of the second type, and the second image is composed of indicia other than indicia of the first type. Typically, the indicia of the first type is a predefined mark and the indicia of the second type is text, such as a destination address on a parcel. The host computer determines the position and orientation of the mark and uses this information to store an image of the text and to rotate the text for further processing by a character recognition processor.
In the preferred over-the-belt OCR system, the indicia of the first type is a fluorescent ink orientation defining fiduciary mark, and the indicia of the second type is the destination address on a parcel. The fiduciary mark and the destination address occupy the same area of the parcel. According to one feature of the preferred embodiment, the first camera is a low resolution CCD camera and the second camera is a high resolution CCD camera. According to another feature of the preferred embodiment, the orientation of the fiduciary mark is determined.
It is a further object of the present invention to provide a two camera OCR system that can read information on a parcel that does not have a well defined leading edge. That the present invention and the preferred embodiments thereof improve over the drawbacks of the prior art and accomplish the objects of the invention set forth above will become apparent from the following detailed description of the preferred embodiments, claims, and drawings. Further objects and advantages of the present invention may become apparent from a review of the following detailed description of the preferred embodiments, claims, and drawings.
Brief Description of the Drawings
FIG. 1 is a diagram of a two camera over-the-belt Optical Character Recognition (OCR) system.
FIG. 2 is a diagram of a parcel including a fluorescent ink fiduciary mark located within the destination address block of the parcel.
FIG. 3 is a diagram of a parcel including a region of interest defined around a fiduciary mark.
FIG. 4 is a functional block diagram illustrating the flow of information in a two camera OCR system.
FIG. 5 is a logical flow diagram for a two camera over-the-belt OCR system.
FIG. 6 is a logical flow diagram for a computer-implemented routine for storing a low resolution image from data captured with the first camera of a two camera OCR system.
FIG. 7 is a logical flow diagram for a computer-implemented routine for processing black/white data captured with the first camera of a two camera OCR system.
FIG. 8 is a logical flow diagram for a computer-implemented routine for creating a low resolution image from black/white data captured with the first camera of a two camera OCR system.
FIG. 9 is a logical flow diagram for a computer-implemented routine for storing a high resolution image from data captured with the second camera of a two camera OCR system.
Description of the Preferred Embodiments
Turning first to the nomenclature of the specification, the detailed description which follows is represented largely in terms of processes and symbolic representations of operations performed by conventional computer components, including a central processing unit (CPU), memory storage devices for the CPU, and connected pixel-oriented display devices. These operations include the manipulation of data bits by the CPU and the maintenance of these bits within data structures resident in one or more of the memory storage devices. Such data structures impose a physical organization upon the collection of data bits stored within computer memory and represent specific electrical or magnetic elements. These symbolic representations are the means used by those skilled in the art of computer programming and computer construction to most effectively convey teachings and discoveries to others skilled in the art.
For the purposes of this discussion, all or a portion of a process may be a sequence of computer-executed steps leading to a desired result. These steps generally require physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, or otherwise manipulated. It is conventional for those skilled in the art to refer to these signals as bits, values, elements, symbols, characters, terms, objects, numbers, records, files, or the like. It should be kept in mind, however, that these and similar terms are associated with appropriate physical quantities for computer operations, and that these terms are merely conventional labels applied to physical quantities that exist within and during operation of the computer.
It should also be understood that manipulations within the computer are often referred to in terms such as adding, comparing, moving, etc., which are often associated with manual operations performed by a human operator. It must be understood that no such involvement of a human operator is necessary or even desirable in the present invention. The operations described herein are machine operations performed in conjunction with a human operator or user who interacts with the computer. The machines used for performing the operation of the present invention include general purpose digital computers, work stations, or other similar computing devices.
In addition, it should be understood that the programs, processes, methods, etc. described herein are not related or limited to any particular computer or apparatus. Rather, various types of general purpose machines may be used with programs constructed in accordance with the teachings described herein. Similarly, it may prove advantageous to construct specialized apparatus to perform the method steps described herein by way of dedicated computer systems with hard-wired logic or programs stored in nonvolatile memory, such as read only memory.
Referring now to the drawings, in which like numerals refer to like elements in the several views, FIG. 1 shows a two camera over-the-belt optical character recognition (OCR) system 10 including a preferred embodiment of the present invention. The OCR system 10 includes a conveyor 12 which carries parcels 14a through 14n. Conveyor 12 is preferably 18 inches wide, may carry approximately 3,600 parcels per hour, and may move at a rate of 100 feet per minute. Parcels 14a through 14n may vary in height, may have soft or irregular edges, and may be arbitrarily oriented on the conveyor 12. The preferred configuration shown in FIG. 1, having an 18 inch wide conveyor belt, is adapted for handling relatively small parcels. It will be appreciated that alternative embodiments of the present invention that are adapted for handling larger or smaller parcels may be constructed in accordance with the principles of the present invention.
Generally described, the two camera OCR system 10 is operational to ascertain the position and orientation of a fluorescent ink fiduciary mark located within the destination address block on the surface of a parcel 14, to capture an image of the text comprising the destination address in a region of interest defined with respect to the position of the fiduciary mark, to rotate the text image by an angle defined with respect to the orientation of the fiduciary mark, and to transmit the rotated text image to a display device or to a computer-implemented text reader. The preferred fiduciary mark is described with more particularity with respect to FIGS. 2 and 3. Although the preferred embodiment of the present invention is described in the context of an over-the-belt OCR system using a fluorescent ink fiduciary mark, those skilled in the art will appreciate that the principles of the present invention may be applied to virtually any type of two camera system for reading indicia on a substrate.
The conveyor 12 moves a parcel 14 through the field of view of a low resolution CCD camera 16. The low resolution camera 16 is preferably a low resolution, monochrome, 256 pixel line-scan type camera such as a Thompson TH7806A or TH7931D. An ultraviolet light source 18 in conjunction with a reflector 19 illuminates the parcel 14 as it is conveyed through the viewing area of the low resolution camera 16, which captures an image of the surface of the parcel 14. The low resolution camera 16 is fitted with a commercially available optical filter 20 that passes yellow/green light and attenuates light in other portions of the visible spectrum. The low resolution camera 16 is thus configured to be responsive to yellow/green light such as that emitted by fluorescent ink exposed to ultraviolet light. More specifically, the optical filter 20 causes the low resolution camera 16 to be responsive to the yellow/green light emitted from the commercially available National Ink No. 35-48-J (Fluorescent Yellow) in response to ultraviolet light.
The conveyor 12 then moves the parcel 14 through the field of view of a high resolution CCD camera 22. The high resolution camera 22 is preferably a monochrome, 4,096 pixel line-scan type camera such as one using a Kodak KLI-5001 CCD chip. A white light source 24 in conjunction with a reflector 25 illuminates the parcel 14 as it is conveyed through the viewing area of the second camera 22, which captures an image of the surface of a parcel 14. The high resolution camera 22 is responsive to a grey-scale light pattern such as that reflected by black ink text on the surface of the parcel 14. The high resolution camera 22 is relatively unresponsive to light reflected by fluorescent ink illuminated by white light. More specifically, the commercially available National Ink No. 35-48-J (Fluorescent Yellow) is substantially invisible to the high resolution camera 22 when illuminated by the white light source 24.
Each camera 16 and 22 may be pointed towards a first mirror (not shown), which may be pointed towards a second mirror (not shown), which may be pointed at the conveyor 12 to alter the optical path from the camera to the conveyor 12. The optical path from the low resolution camera 16 to the conveyor 12 is preferably 52 inches, and the optical path from the high resolution camera 22 to the conveyor 12 is preferably 98 inches. These parameters may be varied somewhat without unduly affecting the performance of the present invention. Those skilled in the art will appreciate that mirror systems can be used to increase the optical path length of a camera system while accommodating a smaller physical distance between the camera and the object to be imaged. See, for example, Smith et al., U.S. Patent No.
5,308,960, incorporated herein by reference.
The cameras 16 and 22 are positioned approximately 52 inches apart and 25 inches above the center of the conveyor 12, and each has an 18 inch wide field of view at the conveyor. In the preferred configuration shown in FIG. 1, the low resolution camera 16 remains focused at a height of four inches above the conveyor. The focal height of four inches above the conveyor 12 corresponds to the average expected height of parcels 14a through 14n handled by the OCR system 10. The low resolution camera 16 has a field of view that is approximately 16 inches wide at its focal height. The high resolution camera 22 uses an optical path equalizer to focus on parcels of different heights. The preferred optical path equalizer is described in the commonly owned pending and allowed U.S. Patent Application No. 08/292,400, "Optical Path Equalizer," filed August 18, 1994, inventors Johannes A. S. Bjorner and Steven L. Smith, incorporated herein by reference.
A belt encoder 26 provides a signal indicating the speed of the conveyor 12 to a video processor 28 and to the high resolution camera 22. The belt encoder 26 is a standard belt driven opto-mechanical encoder. The video processor 28 controls the operation of the low resolution camera 16 and sequentially transmits a one-bit (i.e., black/white) video signal corresponding to the image captured by the low resolution camera 16 to a host computer 30. The high resolution camera 22 transmits an eight-bit grey-scale video signal corresponding to the image captured by the high resolution camera 22 to the host computer 30. The host computer 30 is preferably a general purpose computer comprising a standard microprocessor such as a Heurikon HKV4d 68040 CPU board and a special purpose high speed image acquisition and processing board set such as the 150/40 series manufactured by Imaging Technologies, Inc. of Bedford, Massachusetts. The operation of the cameras 16 and 22, the video processor 28, and the host computer 30 are described with more particularity with respect to FIG. 4.
Referring to FIG. 2, a parcel 14 bears a destination address block 32 containing text 33 defining the destination address of the parcel 14. A fluorescent ink fiduciary mark 34 is located approximately in the center of the destination address block 32 in the same area as the text 33 defining the destination address. As shown in FIG. 2, the text 33 may be oriented at an angle with respect to the OCR coordinate system 41.
FIG. 3 shows the configuration of the preferred fiduciary mark 34, which comprises two fluorescent ink non-overlapping circles of different diameter. As used herein, a circle means either an annulus or the area bounded by an annulus. The fiduciary mark 34 includes a large circle 36 and a small circle 37 oriented such that a vector 38 from the center of the large circle 36 to the center of the small circle 37 is oriented approximately in the same direction as the underlying text 33. The position of the fiduciary mark 34 is defined to be the mid-point 39 of the vector 38. It will be clear to one skilled in the art that alternative embodiments might include locating the fiduciary mark elsewhere on the parcel in a known relation to the text bearing area 32, or in a different known relationship to the underlying text. The fiduciary mark 34 is typically stamped on a parcel using a conventional ink stamp after the destination address 33 has been affixed to the parcel. It will be appreciated that the fiduciary mark 34 might be carried on a label, preprinted upon the parcel, or might be carried upon a transparent envelope into which an address label is placed.
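Given the two circle centres, the mark's position and orientation follow directly from the geometry described above; a minimal sketch (assuming the centres have already been located by some detection step) is:

```python
import math

def mark_position_and_orientation(large_center, small_center):
    """Return the position (midpoint of the vector from the large circle
    centre to the small circle centre) and the orientation (direction of
    that vector, in radians) of the fiduciary mark."""
    (xl, yl), (xs, ys) = large_center, small_center
    position = ((xl + xs) / 2.0, (yl + ys) / 2.0)
    orientation = math.atan2(ys - yl, xs - xl)
    return position, orientation

# A mark whose small circle lies up and to the right of the large circle.
pos, theta = mark_position_and_orientation((10.0, 10.0), (20.0, 20.0))
print(pos, math.degrees(theta))   # (15.0, 15.0) 45.0
```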
For the preferred fiduciary mark 34, the diameter of the large circle 36 is approximately 3/4 of an inch, the diameter of the small circle 37 is approximately 7/16 of an inch, and the distance separating them is approximately 1/4 of an inch. It is noted that a limit is imposed upon the size of the fiduciary mark 34 by the resolution of the low resolution camera 16. For example, the fiduciary mark 34 may be made smaller if the low resolution camera 16 has a higher resolution, and the resolution of camera 16 may be reduced if the fiduciary mark 34 is made larger. Acceptable performance is observed for the preferred configuration shown in FIG. 1 when the above fiduciary mark parameters are used. Alternative embodiments might vary the size of the components of fiduciary mark 34 somewhat without unduly affecting the performance of the present invention.
FIG. 3 also shows a region of interest 40 defined with respect to the fiduciary mark 34. The region of interest 40 is defined in terms of the high resolution camera to be a 1k by 1k square (i.e., 1,024 pixels by 1,024 pixels, which is equivalent to four inches by four inches). The region of interest 40 is orthogonally oriented with respect to the OCR coordinate system 41 and centered on the defined position 39 of the fiduciary mark 34. The host computer 30 creates and stores a bit map image of a small area comprising the fiduciary mark 34 from the data captured by the low resolution camera 16 as a parcel 14 passes through the field of view of the low resolution camera 16. The host computer 30 then determines the position and orientation of the fiduciary mark 34 and defines the region of interest 40 with respect to the position 39 of the fiduciary mark 34. The host computer then creates and stores a high resolution text image within the region of interest 40 from the data captured by the high resolution camera 22 as the parcel 14 passes through the field of view of the high resolution camera 22. In this manner, only a relatively small portion of the data captured by the high resolution camera 22 is saved for processing by the host computer 30.
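Expressed in high resolution pixel coordinates, the region of interest is simply a square window about the mark position; a short sketch under the stated numbers (1,024 pixels corresponding to four inches at 256 pixels per inch) is:

```python
def region_of_interest(mark_x, mark_y, size=1024):
    """Return (x0, x1, y0, y1) bounds of a square region of interest that is
    orthogonal to the OCR coordinate system and centred on the mark
    position, both given in high resolution pixels."""
    half = size // 2
    return mark_x - half, mark_x + half, mark_y - half, mark_y + half

x0, x1, y0, y1 = region_of_interest(3000, 1500)
print(x0, x1, y0, y1)   # 2488 3512 988 2012
```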
FIG. 4 shows the connections among the belt encoder 26, the low resolution camera 16, the video processor 28, the host computer 30, and the high resolution camera 22.
The belt encoder 26 supplies a signal indicating the speed of the conveyor 12 to the video processor 28 and the high resolution camera 22. The video processor 28 provides a power supply 44 and a line clock signal 45 to the low resolution camera 16. The line clock signal 45 is used to trigger cycles of the low resolution camera 16 (i.e., exposures of the line of CCD pixels comprising the low resolution camera 16). Each cycle captures a row of the image of the surface of a parcel 14 as it moves past the low resolution camera 16. The belt encoder 26 is selected to provide a pulse for each cycle of the high resolution camera 22. The high resolution camera 22 uses 4,096 pixels and has an exposure width of 16 inches. The belt encoder 26 is therefore selected to provide 4,096 pulses for every 16 inches of travel of the conveyor belt 12. The low resolution camera includes 256 pixels and also has an exposure width of 16 inches. The line clock signal 45 for the low resolution camera 16 therefore includes one trigger pulse for every 16 pulses of the belt encoder 26 so that the high resolution camera 22 has 16 cycles for each cycle of the low resolution camera 16. In this manner, the line images captured by the cameras 16 and 22 may be assembled by the host computer 30 into two-dimensional images with the correct aspect ratios (i.e., the ratio of the length of an image to its width).
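The encoder and line-clock bookkeeping can be checked with simple arithmetic from the numbers given above; the following sketch only restates those figures and is not a driver for any real encoder.

```python
# Belt encoder and line clock arithmetic from the stated configuration.
HIGH_RES_PIXELS = 4096      # pixels per line of the high resolution camera
LOW_RES_PIXELS = 256        # pixels per line of the low resolution camera
FIELD_WIDTH_IN = 16         # exposure width of both cameras, in inches

encoder_pulses_per_inch = HIGH_RES_PIXELS // FIELD_WIDTH_IN            # 256
high_res_cycles_per_low_res_cycle = HIGH_RES_PIXELS // LOW_RES_PIXELS  # 16

# With one high resolution cycle per encoder pulse and one low resolution
# cycle per 16 pulses, each camera's along-belt sampling pitch equals its
# across-belt pixel pitch, so the assembled two-dimensional images have the
# correct aspect ratio.
print(encoder_pulses_per_inch, high_res_cycles_per_low_res_cycle)   # 256 16
```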
The low resolution camera 16 transmits a pixel clock signal 46 and a raw video signal 47 to the video processor 28. The raw video signal 47 comprises a sequence of analog line signals, each line signal comprising 256 CCD signals created by a cycle of the low resolution camera 16. The pixel clock signal 46 comprises 256 pixel pulses per cycle, one pixel pulse for each of the 256 CCD signals read during a cycle of the low resolution camera 16. The video processor 28, which is a standard one-bit A/D converter, includes a filter 48 that sharpens the raw video signal 47, an amplifier 49 that increases the magnitude of the raw video signal, and a comparator 50 that compares each CCD signal of the raw video signal 47 to a threshold value to produce a black/white video signal 54. As used herein, a black/white video signal is a pixelized signal with uniform foreground and background pixel values wherein each foreground pixel is "on" and each background pixel is "off". The threshold value is selected in view of the filter 20 so that foreground pixels are only produced by fluorescent indicia, such as that produced by fluorescent ink illuminated by ultraviolet light. Thus, the low resolution camera 16 captures an image of the fiduciary mark 34 without capturing an image of the underlying text 33.
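The comparator stage is analog hardware in the system described above, but its effect on one line of raw video is easy to model digitally; the threshold value below is arbitrary and only illustrates the idea that foreground pixels come solely from fluorescent indicia.

```python
import numpy as np

def to_black_white(raw_line, threshold=200):
    """One-bit conversion of a line of CCD signal levels: samples above the
    threshold become "on" foreground pixels, everything else is "off"."""
    return (np.asarray(raw_line) > threshold).astype(np.uint8)

# Hypothetical signal levels: only the fluorescent mark exceeds the threshold.
line = [12, 35, 230, 245, 40, 18]
print(to_black_white(line))   # [0 0 1 1 0 0]
```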
The line clock signal 45 and the pixel clock signal 46 are transmitted from the video processor 28 to the host computer 30. It will be appreciated that these signals may be used to generate x,y coordinates in the OCR coordinate system 41 for each pixel of the black/white video signal 54. The black/white video signal 54 is transmitted from the video processor 28 to the host computer 30 where it is initially captured, one line at a time, in a 256 by one-bit FIFO low resolution line buffer 56. The CPU of the host computer 30 is interrupted each cycle of the line clock 45 to read the contents of the low resolution line buffer 56. The CPU first checks the status of a buffer empty bit 57 to determine whether the low resolution line buffer 56 contains data (i.e., whether any foreground pixels are included in the most recent line image captured by the low resolution camera 16), and reads the contents of the low resolution line buffer 56 only if the buffer empty bit 57 is not set.
The host computer 30 receives a new line image from the low resolution camera 16 each cycle of the line clock 45 and determines whether each line image contains foreground pixels. If a line image contains foreground pixels, the host computer 30 computes the x,y coordinates for the foreground pixels using the line clock signal 45 (for the x coordinate) and the pixel clock signal 46 (for the y coordinate) and places the x,y coordinates in the low resolution line buffer 56. The low resolution line buffer 56 captures and momentarily (i.e., for the duration of a cycle of the line clock 45) retains the x,y coordinates of the foreground pixels captured by the low resolution camera 16. The host computer sequentially reads the low resolution line buffer 56 and creates and stores a two-dimensional bit map image from the foreground pixel data read from the low resolution line buffer 56 in a general purpose computer memory 58. Those skilled in the art will appreciate that a signal from a belt encoder indicating the speed of a conveyor, a line-scan CCD camera, a FIFO buffer, and a general purpose computer memory can be used to produce and control the aspect ratio of a two-dimensional computer image of an object conveyed past the camera. See, for example, Shah et al., U.S. Patent No. 5,291,564, which is incorporated by reference.
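A compact sketch of the coordinate generation for one black/white line follows; the x value stands in for the line clock count and the y values for the pixel clock positions, and no particular buffer hardware is implied.

```python
import numpy as np

def foreground_coordinates(bw_line, line_count):
    """Return (x, y) pairs for the foreground pixels of one black/white line
    image: x from the line clock (the running line count along the belt),
    y from the pixel clock (the pixel's index across the belt)."""
    y = np.flatnonzero(np.asarray(bw_line))
    return [(line_count, int(col)) for col in y]

print(foreground_coordinates([0, 1, 1, 0, 1], line_count=42))
# [(42, 1), (42, 2), (42, 4)]
```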
The host computer 30 creates and stores a separate image for the fiduciary mark 34 on each parcel conveyed past the low resolution camera 16. Provided that a partially completed image of a fiduciary mark 34 has not been constructed, a new image is started each time foreground pixels are found in the low resolution line buffer 56. An image is considered complete when no foreground pixels are found in the low resolution line buffer 56 for 16 consecutive cycles of the line clock 45. The distance traveled by the conveyor 12 in 16 cycles of the line clock 45 is approximately four times the distance separating the large circle 36 and the small circle 37 of a nominal fiduciary mark 34. In this manner, the host computer 30 creates and stores a separate foreground pixel image for the fiduciary mark 34 on each parcel 14a through 14n. Those skilled in the art will appreciate that foreground images may be created and stored by the host computer 30 in various formats including bit maps and run-length encoded images.
The host computer 30 counts the cycles of the line clock 45 in a rolling counter 51. The rolling counter 51 is used to keep track of the positions of parcels in the x coordinate of the OCR coordinate system 41 (i.e., the direction in which the conveyor 12 travels). The host computer 30 determines the position and orientation of a fiduciary mark 34 and defines a region of interest 40 around the mark. The host computer 30 then determines the number of cycles of the line clock 45 required for the region of interest 40 to reach the field of view of the high resolution camera 22, adds this number to the value in the counter 51, and stores the result as a trigger value 52 for storing data subsequently captured by the high resolution camera 22. The host computer 30 determines that the region of interest 40 has reached the field of view of the high resolution camera 22 when the value in the counter 51 is equal to the trigger value 52. The host computer 30 then extracts from the high resolution buffer 68 the data within the region of interest 40 until the counter 51 increments a predetermined number of times (64 increments in the preferred configuration) and clears the trigger value 52. The extracted region of interest data is stored in the general purpose memory 58.
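The counter and trigger bookkeeping amounts to a little integer arithmetic; in the sketch below the 832-cycle travel figure is only an illustration (52 inches of camera separation at 16 low resolution cycles per inch), not a number taken from the disclosure.

```python
def schedule_roi_capture(counter, cycles_to_high_res_camera, roi_cycles=64):
    """Return the trigger value (counter value at which the region of
    interest reaches the high resolution camera) and the value at which
    capture stops (64 cycles later for a four inch region at 16 cycles per
    inch)."""
    trigger = counter + cycles_to_high_res_camera
    return trigger, trigger + roi_cycles

def roi_in_view(counter, trigger, end):
    """True while data read from the high resolution buffer belongs to the
    region of interest."""
    return trigger <= counter < end

trigger, end = schedule_roi_capture(counter=1000, cycles_to_high_res_camera=832)
print(trigger, end, roi_in_view(1850, trigger, end))   # 1832 1896 True
```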
The host computer 30 provides a power supply to the high resolution camera 22, which provides a line clock signal 62, a pixel clock signal 64, and a grey-scale video signal 66 to the host computer. The line clock signal 62 and the pixel clock signal 64 for the high resolution camera 22 are analogous to the line clock signal 45 and the pixel clock signal 46 for the low resolution camera 16, except that they produce 16 cycles for each cycle of the low resolution camera. It will be appreciated that these signals may be used to generate x,y coordinates in the OCR coordinate system 41 for each pixel of the grey-scale video signal 66.
The operation of a conventional CCD camera such as the high resolution camera 22 in producing a grey-scale video signal such as the grey-scale video signal 66 is well known in the art and will not be further described herein.
The grey-scale video signal 66 is transmitted from the high resolution camera 22 to the host computer 30 where it is initially captured, one line at a time, in a 4,096 by eight-bit high resolution line buffer 68. The high resolution line buffer 68 captures and momentarily (i.e., for the duration of a cycle of the line clock 62) retains the grey-scale pixel data captured during a cycle of the high resolution camera 22. The host computer 30 sequentially reads the high resolution buffer 68, extracts data that is within the region of interest 40, and creates and stores a two-dimensional image of the region of interest 40 in the general purpose memory 58 of the host computer 30. Thus, only the high resolution data within the region of interest 40 is stored for processing by the host computer 30. A suitable method and system for producing a computer image from eight-bit grey-scale pixel data is described in the commonly owned pending U.S. Patent Application No. 08/380,732, "Method and Apparatus for Separating Foreground From Background in Images Containing Text," filed January 31, 1995, inventors Michael C. Moed and Izrail S. Gorian, which is incorporated by reference. Those skilled in the art will appreciate that grey-scale images may be created and stored by the host computer 30 in various formats including bit maps and run-length encoded images.
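Per line, the extraction step simply keeps the slice of the 4,096-pixel grey-scale buffer whose columns fall inside the region of interest; a sketch with stand-in data follows (the column bounds are arbitrary).

```python
import numpy as np

def extract_roi_columns(high_res_line, y0, y1):
    """Keep only the portion of one grey-scale line whose pixel (column)
    coordinates fall inside the region of interest [y0, y1)."""
    return np.asarray(high_res_line, dtype=np.uint8)[y0:y1]

def build_roi_image(lines_in_view, y0, y1):
    """Stack the extracted portions of the lines captured while the region
    of interest is in view into one stored two-dimensional image."""
    return np.vstack([extract_roi_columns(line, y0, y1) for line in lines_in_view])

# Stand-in data: a handful of 4,096-pixel lines from within the region.
rng = np.random.default_rng(1)
lines = rng.integers(0, 256, size=(8, 4096), dtype=np.uint8)
roi = build_roi_image(lines, y0=1024, y1=2048)
print(roi.shape)   # (8, 1024)
```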
FIG. 5 is a logical flow diagram of the image processing method 500 performed by the two camera over-the-belt OCR system 10. In routine 510, a low resolution CCD camera 16 captures an image of a fluorescent ink fiduciary mark 34 located within the destination address block 32 on a parcel 14 (see FIG. 2). The steps associated with routine 510 will be described more particularly with reference to FIGS. 6-8. In the next step 520, the position and orientation of the fiduciary mark 34 is determined. In the next step 530, a region of interest 40 is defined about the fiduciary mark 34 (see FIG. 3). In routine 540, a high resolution CCD camera 22 captures a grey-scale image of the parcel 14 as it is conveyed through the field of view of the high resolution camera 22, and a host computer 30 extracts and stores a grey-scale image of the region of interest 40. The steps associated with routine 540 are described with more particularity with reference to FIG. 9. In step 550, the text image stored in step 540 is rotated to coincide with the x axis of the OCR coordinate system 41. In step 560, the text image is displayed on a display device such as a computer screen and/or sent to a computer-implemented text reader.
A suitable method and system for step 520 is described in the commonly owned pending U.S. Patent Application No. 08/419,176, "Method for Locating the Position and Orientation of a Fiduciary Mark," filed April 10, 1995, inventors James Stephen Morton and James Vincent Recktenwalt, which is incorporated by reference. A suitable method and system for step 550 is described in the commonly owned pending U.S. Patent Application No. (to be assigned), "Method and System for Fast Rotation of Run-Length Encoded Images," filed July 26, 1995, inventors Jie Zhu, Michael C. Moed, and Izrail S. Gorian, which is incorporated by reference.
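Step 550 amounts to de-skewing the stored text image by the mark's orientation angle. The sketch below uses SciPy's general grey-scale rotation purely as an illustration; it is not the fast run-length encoded rotation of the referenced application, and the sign of the angle depends on how the orientation was measured.

```python
import numpy as np
from scipy.ndimage import rotate

def deskew_text_image(roi_image, mark_orientation_deg):
    """Rotate the region-of-interest image by the negative of the mark's
    orientation so the underlying text lines up with the x axis of the OCR
    coordinate system (sign convention assumed, see note above)."""
    return rotate(roi_image, -mark_orientation_deg, reshape=True, order=1)

# Toy example: a diagonal streak standing in for skewed text.
skewed = np.eye(64, dtype=np.uint8) * 255
level = deskew_text_image(skewed, 45.0)
print(level.shape)
```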
FIG. 6 shows the steps associated with routine 510 (from FIG. 5) for storing a low resolution image captured with the first camera system 16. In step 602, the video processor 28 receives a signal from the belt encoder 26 indicating the speed of the conveyor belt 12. In step 604, the signal from the belt encoder 26 is used to generate a line clock signal 45. The line clock signal 45 for the low resolution camera 16 includes one pulse for each 16 pulses of the signal from the belt encoder 26. In step 606, the line clock signal 45 triggers a cycle of the low resolution camera 16. In step 608, the raw video data 47 and pixel clock signal 46 are returned from the low resolution camera 16 to the video processor 28. The raw video signal 47 is then filtered in step 610, amplified in step 612, and compared to a threshold in step 614 to produce a black/white video signal 54 in step 616. The black/white video signal 54 is then transmitted from the video processor 28 to a low resolution line buffer 56 in the host computer 30. The host computer 30 generates x,y coordinates for foreground pixel data in the black/white video signal 54 using the line clock signal 45 (for the x coordinate) and the pixel clock signal 46 (for the y coordinate). The coordinates of the foreground pixels are then captured in the low resolution line buffer 56 and ultimately stored for processing in the general purpose memory 58 of the host computer 30.
The low resolution image stored in the general purpose memory 58 is then processed by the host computer 30 in routine 618, which is described with more particularity with respect to FIGS. 6-8. The low resolution camera 16 continuously produces an image of the conveyor belt 12 and the parcels 14 carried by the conveyor belt, which is converted to the black/white video signal 54. The host computer 30 generates x,y coordinates for the foreground pixels (i.e., pixels exposed by the light produced by the fluorescent ink fiduciary mark) of the black/white video signal 54 and stores them in the general purpose memory 58 in the form of a two-dimensional bit map. A separate bit map is constructed for each fiduciary mark 34.
FIG. 7 shows the steps of routine 618 (from FIG. 6) for processing the black/white video data 54 stored in the low resolution line buffer 56. In step 702, the host computer 30 receives the black/white video data 54 from the video processor 28. In decision step 704, it is determined whether there are any foreground pixels in the black/white video signal 54. If there are no foreground pixels in the black/white video signal 54, the "no" branch is followed from step 704 to step 710 in which the CPU of the host computer 30 is interrupted. If there are foreground pixels in the black/white video signal 54, the "yes" branch is followed from step 704 to step 706 in which the x,y coordinates of the foreground pixels are generated. The x,y coordinates are defined with respect to the OCR coordinate system 41, with the y coordinate defined relative to an edge of the conveyor 12 and the x coordinate defined relative to the value in the counter 51. The line clock signal 45 is used to generate the x coordinate, and the pixel clock signal 46 is used to generate the y coordinate. Step 706 is followed by step 708 in which the x,y coordinates of the foreground pixels are captured and momentarily retained in the low resolution line buffer 56.
The "no" branch of decision step 704 and the "yes" branch (via steps 706 and 708) are followed by step 710 in which the CPU of the host computer 30 is interrupted. The CPU is interrupted each cycle of the line clock 45 regardless of whether foreground pixel data has been placed in the low resolution buffer 56 so that blank lines may be built into the bit map image of the fiduciary mark 34. In response to the interrupt signal generated during step 710, the low resolution line buffer 56 may be read in step 712. In routine 714, a two-dimensional bit map image of a fiduciary mark 34 is created. The steps associated with routine 714 are described with more particularity with reference to FIG. 8.
FIG. 8 shows the steps of routine 714 (from FIG. 7) for creating a low resolution image of a fiduciary mark 34. The first step of routine 714 is decision step 802 in which the empty bit 57 of the low resolution buffer 56 is checked. The empty bit 57 may be a software register, or it may be a pin that is latched to a predetermined voltage. The empty bit 57 indicates whether there is data in the low resolution buffer. If the buffer empty bit 57 is not set, the "no" branch is followed from step 802 to step 804 in which the contents of the low resolution line buffer 56 are read and the buffer is cleared. Step 804 is followed by decision step 806 in which it is determined whether there is a partial image of a fiduciary mark stored in the general purpose memory 58 of the host computer 30. If there is no partial image stored, the "no" branch is followed to step 808 in which a new fiduciary mark image is started. If there is a partial image stored, the "yes" branch is followed from step 806 to step 810 in which the data from the low resolution line buffer 56 is added to the partial image. Steps 808 and 810 are followed by step 812 in which the computer implemented routine 510 loops back to step 602 (FIG. 6) to begin another cycle.
Referring again to step 802, if the empty bit is set, the "yes" branch is followed from step 802 to decision step 814 in which it is determined whether the low resolution line buffer 56 has been empty for 16 consecutive cycles of the line clock 45 (i.e., 16 consecutive blank lines). If there have not been 16 consecutive blank lines, the "no" branch is followed to decision step 816 in which it is determined whether a partial image is stored. If a partial image is not stored, the "no" branch is followed to the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. If a partial image is stored, step 816 is followed by step 818 in which a blank line is added to the partially stored image. Step 818 is then followed by the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. Referring again to step 814, if there have been 16 consecutive blank lines read from the low resolution line buffer 56, the "yes" branch is followed from step 814 to decision step 820 in which it is determined whether a partial image is stored. If a partial image is not stored, the "no" branch is followed from step 820 to the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. If a partial image is stored, the "yes" branch is followed from step 820 to step 822 in which the low resolution image is deemed to be complete. Step 822 is then followed by the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle.
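The FIG. 8 logic is essentially a small state machine over the stream of line reads. The class below is a hedged sketch of that logic (the exact handling of the trailing blank lines is simplified) rather than a transcription of the flow diagram.

```python
class MarkImageBuilder:
    """Start a new fiduciary mark image when foreground pixels appear, pad a
    partial image with blank lines, and declare it complete after 16
    consecutive blank lines (roughly four times the gap between the mark's
    two circles at the conveyor speed described above)."""

    BLANK_LIMIT = 16

    def __init__(self):
        self.partial = None      # lines of the image under construction
        self.blank_count = 0

    def add_line(self, line_pixels):
        """Feed one line of foreground pixel data (empty list = blank line);
        return the completed image as a list of lines, or None."""
        if line_pixels:                          # buffer was not empty
            if self.partial is None:
                self.partial = []                # step 808: start a new image
            self.partial.append(line_pixels)     # step 810: add data to image
            self.blank_count = 0
            return None
        self.blank_count += 1
        if self.partial is None:
            return None                          # nothing under construction
        if self.blank_count < self.BLANK_LIMIT:
            self.partial.append([])              # step 818: add a blank line
            return None
        complete, self.partial = self.partial, None   # step 822: complete
        self.blank_count = 0
        return complete

builder = MarkImageBuilder()
stream = [[3, 4], [2, 5]] + [[]] * 16 + [[7]]
for line in stream:
    done = builder.add_line(line)
    if done is not None:
        print(f"completed a mark image spanning {len(done)} lines")
```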
Upon reaching step 822, a complete image of a fiduciary mark 34 has been stored in the general purpose memory 58 of the host computer 30. Referring to FIG. 5, steps 520 through 560, which require a complete image of a fiduciary mark 34, are then initiated. It will be appreciated that the computer-implemented routine 510 loops back to step 602 each cycle of the line clock 45, whereas the steps 520 through 560 are only completed from time to time in response to a complete image of a fiduciary mark 34 having been constructed pursuant to routine 510. Thus step 822, which indicates that routine 510 has stored an image of a complete fiduciary mark 34 in the general purpose memory 58, is followed by steps 520 through 560, which require a complete image of a fiduciary mark 34, as well as a loop back to step 602 for another cycle of the line clock 45.
FIG. 9 shows the steps associated with routine 540 (from FIG. 5) for storing a high resolution image captured with the second camera system 22. In step 902, the high resolution camera 22 receives a signal from the belt encoder 26 indicating the speed of the conveyor belt 12. In step 904, the signal from the belt encoder 26 is used to generate a line clock signal 62 for the high resolution camera. In step 906, the line clock 62 triggers a cycle of the high resolution line camera 22 for each pulse of the signal received from the belt encoder 26. In step 908, the grey-scale video image 66 captured by a cycle of the high resolution line camera 22 is captured and momentarily (i.e., for the duration of one cycle of the line clock 62) retained in the high resolution line buffer 68.
Step 908 is followed by decision step 910 in which it is determined whether the data in the high resolution line buffer 68 corresponds to a line which is in the region of interest 40. Routine 540 determines the number of cycles of the line clock 45 required for the region of interest 40 to reach the field of view of the high resolution camera 22, adds this number to the value in the counter 51, and stores the result as a trigger value 52 for storing data subsequently captured by the high resolution camera 22. Routine 540 then determines that the region of interest 40 is within the field of view of the high resolution camera 22 when the value in the counter 51 is equal to the trigger value 52. The region of interest 40 remains in the field of view of the high resolution camera 22 for a predefined number of cycles of the line clock 45 (i.e., 64 cycles in the preferred configuration shown in FIG. 1, corresponding to a four inch wide region of interest and 16 cycles of the line clock 45 per inch of travel of the conveyor 12). The host computer 30 therefore extracts from the high resolution buffer 68 the data within the region of interest 40 until the counter 51 increments a predetermined number of times (64 increments in the preferred configuration) and clears the trigger value 52. The extracted region of interest data is stored in the general purpose memory 58.
Referring again to step 910, if the current line is not within the region of interest 40, the "no" branch is followed from step 910 back to step 902 for another cycle of the high resolution carnera 22. If the line is within the region of interest 40, the "yes" branch is followed from step 910 to step 912 in which the portion of the data stored in the high resolution line buffer 68 which is within the general interest 40 is extracted from the line buffer 68 and stored W 096/32692 PCT~US96/05019 in the general purpose computer memory 58 of the host computer 30. Step 912 is followed by the decision step 914 in which it is determined whether the line stored in the general purpose computer memory 58 in step 912 was the last line of the region of interest 40. If it was the last line, the "no" branch is followed back to step 902 for another cycle of the high resolution line camera 22. If it was the last line in the region of interest 40, the high resolution image of the region of interest 40 is deemed to be complete and the "yes" branch is followed to the continue step 916 which causes the image processing method 500 to exit routine 540. The continue step 916 is then followed by step 550 of the image processing method 500 shown on FIG. 5.
In view of the foregoing, it will be appreciated that the low resolution camera 16 captures a new line image each cycle of the line clock 45. The host co~ uLer 30 only computes the x,y coordinates for the foreground pixels in the resulting black/white video data stream 54 for ultimate storage in the general purpose memory 58. Similarly, the high resolution camera 22 captures a new grey-scale line image which is captured in the high resolution line buffer 56 each cycle of the line clock 62. The host computer 30 then extracts from the high resolution line buffer 56 for ultimate storage in the general purpose memory 58 only the data that is within the region of interest 40. Thus, the two camera OCR system 10 limits the amount of low resolution data and the amount of high resolution data that must be stored in the general purpose memory 58 for processing by the host computer 30.
Those skilled in the art will appreciate that many different combinations of hardware will be suitable for practicing the present invention. Many commercially available substitutes, each having somewhat different cost and performance characteristics, exist for each of the physical components listed above. For example, the host computer 30 could be hard-wired logic, reconfigurable hardware, an application specific integrated circuit (ASIC), or some other equivalent means for implem.-nting a set of inst~uctions.
It should be understood that the foregoing relates only to the preferred embodiment of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims.
A charge-coupled device (CCD) camera consists of an array of electronic "pixels," each of which stores an accumulated charge according to the amount of light that strikes the pixel. A CCD camera is used to quickly capture an image of the surface of a parcel as it moves along a conveyor. The image is converted into digital format which may be stored as a bit map in a computer memory. The CCD array is then cleared as the charge within each pixel is read, and the array is ready to capture the image of another parcel or section of a parcel. In this manner, a single CCD camera is used to scan a great many parcels.
Computers that may be used to process the images captured by CCD cameras vary in computation speed and other parameters. Generally, a faster computer is more expensive than a slower computer, a computer with a large memory capacity is more expensive than a computer with a smaller memory capacity, and a special purpose computer is more expensive than a general purpose computer. There is therefore a financial motivation to use low speed, low memory, general purpose computers whenever such are suitable for a particular purpose.
Parcel delivery companies, such as United Parcel Service (UPS), could make extensive use of OCR reader systems. UPS ships millions of parcels every day. If OCR systems were used by parcel delivery companies such as UPS, they would generate an enormous amount of computer data. There is therefore a need to limit the amount of image data that must be saved for processing by a text reader. There is also a need for computer systems that can quickly and accurately process the images created by CCD cameras. For example, computer systems have been developed that can attempt to read the destination address written on certain parcels, so that the parcels may be correctly routed to their destinations when the address is successfully read. Reading text is a sophisticated task, and a system capable of doing so is commensurately sophisticated and may comprise expensive equipment such as a high resolution CCD camera and a computer with high speed, large storage, and substantial processing capacity, or a special purpose computer.
To the extent that less expensive equipment can perform less sophisticated tasks in an OCR system, more expensive equipment can be dedicated to reading text. Determining the location and orientation of the destination address on a package moving along a conveyor is an example of a function required of an OCR system that has been performed with the same sophisticated equipment that is also used to read text. There is therefore a need for a system using low cost equipment such as a low resolution CCD camera and a general purpose computer to determine the location and orientation of the destination address on a package moving along a conveyor.
Miette, U.S. Patent No. 5,103,489, describes a label, and a method and device for locating addresses on mailed articles to be sorted. The system uses a preprinted mailing label including an address locating mark located in a known relation to the area on the label where the destination address is likely to be written. The address locating mark is a black ink circle including a rotation specific image inside the circle. A space for the destination address is included on the label to the side and below the mark. The surface of the package is imaged with a single camera system, and the location and orientation of the mark is ascertained. The destination address is then read in the expected area, and at the expected orientation, with reference to the location and orientation of the detected mark.
The system described by Miette reduces the amount of data that must be processed by the text reader. The address locating mark is located on a mailed article in a known relationship to the position and orientation of the destination address. The text reader may then process only the data within a relatively small area and at an orientation defined with respect to the mark. The system described by Miette is used to scan closely arranged items such as magazines exiting an unstacking device. In such a system, the items to be scanned are of uniform size and shape, and little time elapses between consecutive items. Moreover, the entire surface of every item must be searched for the mailing label. Therefore, Miette is not concerned with limiting the amount of high resolution CCD camera data that must be stored for subsequent processing.
The single camera system described by Miette would have a significant disadvantage if applied to parcels moving along a conveyor because the address locating mark could not be used to limit the amount of CCD camera data that must be stored in a computer memory for processing by the text reader. A high resolution CCD camera scanning a moving conveyor, such as one suitable for use with a text reader, generates a tremendous amount of data. Most of the data is a useless image of the conveyor and the non-text bearing areas of the parcels moving along the conveyor; only a small percentage of the data includes the important destination address bearing portions of the parcels. The single camera system described by Miette would require storing all of the data generated by the high resolution CCD camera even though only a small portion of the data would be processed by a text reader. The system described by Miette also relies on preprinted address labels. It would be advantageous if an OCR system for a parcel handling system were not limited to reading addresses written on preprinted labels.
Kizu et al., U.S. Patent No. 4,516,265, describes a two camera system that reads the postal (zip) codes on envelopes traveling on an envelope transport system. The two camera system includes a low resolution pre-scanner that coarsely scans the surface of the envelope. The position of the destination address block is determined from the coarse scan, and the coordinates of the destination address block with respect to the leading edge of the envelope are then passed to a second camera system. The second camera system scans the destination address block by first detecting the leading edge of the envelope. The second camera then starts scanning when the destination address block reaches the second camera and stops scanning when the destination address block moves past the second camera. A postal code reader then processes the high resolution scan and detects and reads the postal code.
The two camera system described by Kizu et al. does not use an address locating mark to determine the position and orientation of the destination address block on a letter. Rather, Kizu et al. relies on the positional relation and sizes of the indicia bearing areas (i.e., stamp and post mark, destination address block, and return address block) on the envelope. The position and size of the destination address block is then defined with respect to the leading edge of the envelope. The timing of the operation of the high resolution camera is then controlled to limit the amount of the surface of the envelope that is scanned with the high resolution camera.
Thus, Kizu et al. relies on the envelope, and the destination address on the envelope, being properly oriented on the envelope transport system. Kizu et al. also relies on the envelope having a well defined leading edge.
The two camera system described by Kizu et al. would therefore not be suitable for an over-the-belt OCR system for arbitrarily positioned parcels bearing arbitrarily positioned destination address labels moving along a conveyor. The system described by Kizu et al. cannot ascertain the position and orientation of the destination address blocks on parcels with arbitrarily positioned destination address labels, such as those conveyed in a parcel handling system.
Similarly, the system described by Kizu et al. does not have a method for reading the destination address on a parcel that does not have a well defined leading edge, such as a parcel with soft or irregular edges.
IBM technical disclosure bulletin Vol. 30, No. 11, entitled "System For Determining Form Alignment," describes a document handwriting recognition system that uses fiducial marks on the back side of a form. The fiducial marks may be associated with pre-assigned handwriting areas on the front of the form. A user places the form on a tablet with the fiducial marks facing toward the tablet and writes in the pre-assigned areas. A scanner in the tablet detects the position and orientation of the fiducial marks and a pressure sensor records the handwriting. The fiducial marks described by the IBM disclosure document are not used in connection with parcels moving on a conveyor.
Therefore, after Miette and Kizu et al. there remains a need for an OCR system which minimizes the amount of high resolution CCD camera data that must be stored for processing by a text reader. There also remains a need for a two camera OCR system that can ascertain the orientation of arbitrarily positioned destination address blocks on articles moving along a conveyor, such as those conveyed in a parcel handling system. There also remains a need for a two camera OCR system that can read information on a parcel that does not have a well defined leading edge.
Summary of the Invention
The present invention meets the above objectives by providing a two camera system for reading indicia on conveyed items. The preferred embodiment is an over-the-belt OCR system for reading the destination addresses on parcels carried on a conveyor belt. The parcels bear an orientation defining fluorescent ink fiduciary mark that is typically stamped on the parcel so that it is superimposed relative to the destination address. The fiduciary mark is located approximately in the center of the destination address block and oriented in the same
direction as the underlying destination address. The first camera, a low resolution CCD camera, captures an image of the fiduciary mark. A host computer stores an image of the fiduciary mark, determines the position and orientation of the fiduciary mark, and defines a region of interest about the fiduciary mark. The second camera, a high resolution CCD camera, captures a grey-scale image of the conveyor and the parcels carried on the conveyor. The host computer extracts and stores the portion of the high resolution image that is within the region of interest including the text defining the destination address of the parcel. The host computer then rotates and displays the text image and/or transmits the text image to a text reader.
Although the preferred embodiment of the present invention is described in the context of an over-the-belt OCR system using a fluorescent ink fiduciary mark, those skilled in the art will appreciate that the principles of the present invention may be applied to many other contexts in which a two camera system can read indicia on a substrate.
Generally described, the present invention is a method and system for reading indicia on a substrate by determining the position and orientation of a predefined mark that is superimposed relative to the indicia. A first image of the substrate is captured with a first camera system, and the first image is stored in a computer memory. The position of the mark within the first image is determined, and a region of interest is defined in association with the mark. A
second image of the substrate is then captured with a second camera system, and the portion of the second image that is within the region of interest is stored in a computer memory. In this manner, the host computer only stores data from the second camera for the area within the region of interest.
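By way of illustration only, the following Python sketch shows the general shape of this two-image flow. It assumes the two captured images have already been assembled into arrays registered to a common coordinate system (the real system works line by line and at two different resolutions, as discussed in a later sketch); the function and variable names are hypothetical, not taken from the specification.

```python
import numpy as np

def read_indicia(first_image, second_image, roi_size=1024):
    """Locate the predefined mark in the first image, define a region of
    interest about it, and keep only that region of the second image."""
    ys, xs = np.nonzero(first_image)            # foreground pixels of the mark
    if ys.size == 0:
        return None                             # no mark: nothing to store
    cy, cx = int(ys.mean()), int(xs.mean())     # mark position
    half = roi_size // 2
    top, left = max(cy - half, 0), max(cx - half, 0)
    # Only the region of interest of the second image is retained in memory.
    return second_image[top:top + roi_size, left:left + roi_size]
```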
According to one feature of the present invention, the presence of indicia of a first type is detected in the image captured by the first camera system. In response to detecting indicia of the first type, the host computer begins storing the first image. Upon detecting the absence of indicia of the first type for a predetermined period, the host computer stops storing the first image. In this manner, the host computer only stores data from the first camera that corresponds to indicia of the first type.
According to another feature of the present invention, the image captured by the second camera comprises indicia of a second type. The first image is composed of indicia other than indicia of the second type, and the second image is composed of indicia other than indicia of the first type. Typically, the indicia of the first type is a predefined mark and the indicia of the second type is text, such as a destination address on a parcel. The host computer determines the position and orientation of the mark and uses this information to store an image of the text and to rotate the text for further processing by a character recognition processor.
In the preferred over-the-belt OCR system, the indicia of the first type is a fluorescent ink orientation defining fiduciary mark, and the indicia of the second type is the destination address on a parcel. The fiduciary mark and the destination address occupy the same area of the parcel. According to one feature of the preferred embodiment, the first camera is a low resolution CCD camera and the second camera is a high resolution CCD camera. According to another feature of the preferred embodiment, the orientation of the fiduciary mark is determined.
It is a further object of the present invention to provide a two camera OCR system that can read information on a parcel that does not have a well defined leading edge. That the present invention and the preferred embodiments thereof improve over the drawbacks of the prior art and accomplish the objects of the invention set forth above will become apparent from the following detailed description of the preferred embodiments, claims, and drawings. Further objects and advantages of the present invention may become apparent from a review of the following detailed description of the preferred embodiments, claims, and drawings.
Brief Description of the Drawings
FIG. 1 is a diagram of a two camera over-the-belt Optical Character Recognition (OCR) system.
FIG. 2 is a diagram of a parcel including a fluorescent ink fiduciary mark located within the destination address block of the parcel.
FIG. 3 is a diagram of a parcel including a region of interest defined around a fiduciary mark.
FIG. 4 is a functional block diagram illustrating the flow of information in a two camera OCR system.
FIG. 5 is a logical flow diagram for a two camera over-the-belt OCR system.
FIG. 6 is a logical flow diagram for a computer-implemented routine for storing a low resolution image from data captured with the first camera of a two camera OCR system.
FIG. 7 is a logical flow diagram for a computer-implemented routine for processing black/white data captured with the first camera of a two camera OCR system.
FIG. 8 is a logical flow diagram for a computer-implemented routine for creating a low resolution image from black/white data captured with the first camera of a two camera OCR system.
FIG. 9 is a logical flow diagram for a computer-implemented routine for storing a high resolution image from data captured with the second camera of a two camera OCR system.
Description of the Preferred Embodiments
Turning first to the nomenclature of the specification, the detailed description which follows is represented largely in terms of processes and symbolic representations of operations performed by conventional computer components, including a central processing unit (CPU), memory storage devices for the CPU, and connected pixel-oriented display devices.
These operations include the manipulation of data bits by the CPU and the maintenance of these bits within data structures resident in one or more of the memory storage devices. Such data structures impose a physical organization upon the collection of data bits stored within computer memory and represent specific electrical or magnetic elements. These symbolic representations are the means used by those skilled in the art of computer programming and computer construction to most effectively convey teachings and discoveries to others skilled in the art.
For the purposes of this discussion, all or a portion of a process may be a sequence of computer-executed steps leading to a desired result. These steps generally require physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, or otherwise manipulated. It is conventional for those skilled in the art to refer to these signals as bits, values, elements, symbols, characters, terms, objects, numbers, records, files, or the like. It should be kept in mind, however, that these and similar terms are associated with appropriate physical quantities for computer operations, and that these terms are merely conventional labels applied to physical quantities that exist within and during operation of the computer.
It should also be understood that manipulations within the computer are often referred to in terms such as adding, comparing, moving, etc., which are often associated with manual operations performed by a human operator. It must be understood that no such involvement of a human operator is necessary or even desirable in the present invention. The operations described herein are machine operations performed in conjunction with a human operator or user who interacts with the computer. The machines used for performing the operation of the present invention include general purpose digital computers, work stations, or other similar computing devices.
In addition, it should be understood that the programs, processes, methods, etc. described herein are not related or limited to any particular computer or apparatus. Rather, various types of general purpose machines may be used with programs constructed in accordance with the teachings described herein. Similarly, it may prove advantageous to construct specialized apparatus to perform the method steps described herein by way of dedicated computer systems with hard-wired logic or programs stored in nonvolatile memory, such as read only memory.
Referring now to the drawings, in which like numerals refer to like elements in the several views, FIG. 1 shows a two camera over-the-belt optical character recognition (OCR) system 10 including a preferred embodiment of the present invention. The OCR system 10 includes a conveyor 12 which carries parcels 14a through 14n. The conveyor 12 is preferably 18 inches wide, may carry approximately 3,600 parcels per hour, and may move at a rate of 100 feet per minute. Parcels 14a through 14n may vary in height, may have soft or irregular edges, and may be arbitrarily oriented on the conveyor 12. The preferred configuration shown in FIG. 1, having an 18 inch wide conveyor belt, is adapted for handling relatively small parcels. It will be appreciated that alternative embodiments of the present invention that are adapted for handling larger or smaller parcels may be constructed in accordance with the principles of the present invention.
Generally described, the two camera OCR system 10 is operational to ascertain the position and orientation of a fluorescent ink fiduciary mark located within the destination address block on the surface of a parcel 14, to capture an image of the text comprising the destination address in a region of interest defined with respect to the position of the fiduciary mark, to rotate the text image by an angle defined with respect to the orientation of the fiduciary mark, and to transmit the rotated text image to a display device or to a computer-implemented text reader. The preferred fiduciary mark is described with more particularity with respect to FIGS. 2 and 3. Although the preferred embodiment of the present invention is described in the context of an over-the-belt OCR system using a fluorescent ink fiduciary mark, those skilled in the art will appreciate that the principles of the present invention may be applied to virtually any type of two camera system for reading indicia on a substrate.
The conveyor 12 moves a parcel 14 through the field of view of a low resolution CCD camera 16. The low resolution camera 16 is preferably a low resolution, monochrome, 256 pixel line-scan type camera such as a Thompson TH7806A or TH7931D. An ultraviolet light source 18 in conjunction with a reflector 19 illuminates the parcel 14 as it is conveyed through the viewing area of the low resolution camera 16, which captures an image of the surface of the parcel 14. The low resolution camera 16 is fitted with a commercially available optical filter 20 that passes yellow/green light and attenuates light in other portions of the visible spectrum. The low resolution camera 16 is thus configured to be responsive to yellow/green light such as that emitted by fluorescent ink exposed to ultraviolet light. More specifically, the optical filter 20 causes the low resolution camera 16 to be responsive to the yellow/green light emitted from the commercially available National Ink No. 35-48-J (Fluorescent Yellow) in response to ultraviolet light.
The conveyor 12 then moves the parcel 14 through the field of view of a high resolution CCD camera 22. The high resolution camera 22 is preferably a monochrome, 4,096 pixel line-scan type camera such as one using a Kodak KLI-5001 CCD chip. A white light source 24 in conjunction with a reflector 25 illuminates the parcel 14 as it is conveyed through the viewing area of the second camera 22, which captures an image of the surface of a parcel 14. The high resolution camera 22 is responsive to a grey-scale light pattern such as that reflected by black ink text on the surface of the parcel 14. The high resolution camera 22 is relatively unresponsive to light reflected by fluorescent ink illuminated by white light. More specifically, the commercially available National Ink No. 35-48-J (Fluorescent Yellow) is substantially invisible to the high resolution camera 22 when illuminated by the white light source 24.
Each camera 16 and 22 may be pointed towards a first mirror (not shown), which may be pointed towards a second mirror (not shown), which may be pointed at the conveyor 12 to alter the optical path from the camera to the conveyor 12. The optical path from the low resolution camera 16 to the conveyor 12 is preferably 52 inches, and the optical path from the high resolution camera 22 to the conveyor 12 is preferably 98 inches. These parameters may be varied somewhat without unduly affecting the performance of the present invention. Those skilled in the art will appreciate that mirror systems can be used to increase the optical path length of a camera system while accommodating a smaller physical distance between the camera and the object to be imaged. See, for example, Smith et al., U.S. Patent No.
5,308,960, incorporated herein by reference.
The cameras 16 and 22 are positioned approximately 52 inches apart and 25 inches above the center of the conveyor 12 and each have an 18 inch wide field of view at the conveyor. In the preferred configuration shown in FIG. 1, the low resolution camera 16 remains focused at a height of four inches above the conveyor. The focal height of four inches above the conveyor 12 corresponds to the average expected height of parcels 14a through 14n handled by the OCR system 10. The low resolution camera 16 has a field of view that is approximately 16 inches wide at its focal height. The high resolution camera 22 uses an optical path equalizer to focus on parcels of different heights. The preferred optical path equalizer is described in the commonly owned pending and allowed U.S. Patent Application No.
08/292,400, "Optical Path Equalizer" filed August 18, 1994, inventors Johannes A. S. Bjorner and Steven L. Smith, incorporated herein by reference.
A belt encoder 26 provides a signal indicating the speed of the conveyor 12 to a video processor 28 and to the high resolution camera 22. The belt encoder 26 is a standard belt driven opto-mechanical encoder. The video processor 28 controls the operation of the low resolution camera 16 and sequentially transmits a one-bit (i.e., black/white) video signal corresponding to the image captured by the low resolution camera 16 to a host computer 30.
The high resolution camera 22 transmits an eight-bit grey-scale video signal corresponding to the image captured by the high resolution camera 22 to the host computer 30. The host computer 30 is preferably a general purpose computer comprising a standard microprocessor such as a Heurikon HKV4d 68040 CPU board and a special purpose high speed image acquisition and processing board set such as the 150/40 series manufactured by Imaging Technologies, Inc.
of Bedford, Massachusetts. The operation of the cameras 16 and 22, the video processor 28, and the host computer 30 are described with more particularity with respect to FIG. 4.
Referring to FIG. 2, a parcel 14 bears a destination address block 32 containing text 33 defining the destination address of the parcel 14. A fluorescent ink fiduciary mark 34 is located approximately in the center of the destination address block 32 in the same area as the text 33 defining the destination address. As shown in FIG. 2, the text 33 may be oriented at an angle with respect to the OCR coordinate system 41.
FIG. 3 shows the configuration of the preferred fiduciary mark 34 which comprises two fluorescent ink non-overlapping circles of different diameter. As used herein, a circle means either an annulus or the area bounded by an annulus. The fiduciary mark 34 includes a large circle 36 and a small circle 37 oriented such that a vector 38 from the center of the large circle 36 to the center of the small circle 37 is oriented approximately in the same direction as the underlying text 33. The position of the fiduciary mark 34 is defined to be the mid-point 39 of the vector 38. It will be clear to one skilled in the art that alternative embodiments might include locating the fiduciary mark elsewhere on the parcel in a known relation to text bearing area 32, or in a different known relationship to the underlying text. The fiduciary mark 34 is typically stamped on a parcel using a conventional ink stamp after the destination address 33 has been affixed to the parcel. It will be appreciated that the fiduciary mark 34 might be carried on a label, preprinted upon the parcel, or might be carried upon a transparent envelope into which an address label is placed.
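As a hedged illustration of the geometry just described (assuming the two circle centers have already been located by some detection step, which is outside this sketch), the position and orientation of the mark could be computed as follows; the names are illustrative only.

```python
import math

def mark_position_and_orientation(large_center, small_center):
    """Position is the mid-point of the vector from the large circle center
    to the small circle center; orientation is the direction of that vector."""
    lx, ly = large_center
    sx, sy = small_center
    position = ((lx + sx) / 2.0, (ly + sy) / 2.0)                 # mid-point 39
    orientation_deg = math.degrees(math.atan2(sy - ly, sx - lx))  # vector 38
    return position, orientation_deg

# Small circle directly "ahead" of the large circle: orientation 0 degrees
print(mark_position_and_orientation((10.0, 5.0), (20.0, 5.0)))
# ((15.0, 5.0), 0.0)
```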
For the preferred fiduciary mark 34, the diameter of the large circle 36 is approximately 3/4 of an inch, the diameter of the small circle 37 is approximately 7/16 of an inch, and the distance separating them is approximately 1/4 of an inch. It is noted that a limit is imposed upon the size of the fiduciary mark 34 by the resolution of the low resolution camera 16. For example, the fiduciary mark 34 may be made smaller if the low resolution camera 16 has a higher resolution, and the resolution of camera 16 may be reduced if the fiduciary mark 34 is made larger. Acceptable performance is observed for the preferred configuration shown in FIG. 1 when the above fiduciary mark parameters are used. Alternative embodiments might vary the size of the components of fiduciary mark 34 somewhat without unduly affecting the performance of the present invention.
FIG. 3 also shows a region of interest 40 defined with respect to the fiduciary mark 34. The region of interest 40 is defined in terms of the high resolution camera to be a 1k by 1k square (i.e., 1,024 pixels by 1,024 pixels, which is equivalent to four inches by four inches). The region of interest 40 is orthogonally oriented with respect to the OCR coordinate system 41 and centered on the defined position 39 of the fiduciary mark 34. The host computer 30 creates and stores a bit map image of a small area comprising the fiduciary mark 34 from the data captured by the low resolution camera 16 as a parcel 14 passes through the field of view of the low resolution camera 16. The host computer 30 then determines the position and orientation of the fiduciary mark 34 and defines the region of interest 40 with respect to the position 39 of the fiduciary mark 34. The host computer then creates and stores a high resolution text image within the region of interest 40 from the data captured by the high resolution camera 22 as the parcel 14 passes through the field of view of the high resolution camera 22. In this manner, only a relatively small portion of the data captured by the high resolution camera 22 is saved for processing by the host computer 30.
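The following sketch shows one plausible way to turn a mark position found in the low resolution image into the 1,024 by 1,024 pixel region of interest in the high resolution image. The factor of 16 follows from the 256 and 4,096 pixel exposure widths over roughly the same 16 inches; the function name and the assumption of simple integer scaling are mine, not the patent's.

```python
LOW_TO_HIGH = 4096 // 256   # 16 high resolution pixels per low resolution pixel
ROI_SIZE = 1024             # 1k by 1k pixels, roughly four inches square

def region_of_interest(mark_x_low, mark_y_low):
    """Return (left, top, right, bottom) bounds in high resolution pixel
    coordinates, axis-aligned and centered on the mark position."""
    cx, cy = mark_x_low * LOW_TO_HIGH, mark_y_low * LOW_TO_HIGH
    half = ROI_SIZE // 2
    return (cx - half, cy - half, cx + half, cy + half)

print(region_of_interest(128, 100))   # (1536, 1088, 2560, 2112)
```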
FIG. 4 shows the connections among the belt encoder 26, the low resolution camera 16, the video processor 28, the host computer 30, and the high resolution camera 22.
The belt encoder 26 supplies a signal indicating the speed of the conveyor 12 to the video processor 28 and the high resolution camera 22. The video processor 28 provides a power supply 44 and a line clock signal 45 to the low resolution camera 16. The line clock signal 45 is used to trigger cycles of the low resolution camera 16 (i.e., exposures of the line of CCD pixels comprising the low resolution camera 16). Each cycle captures a row of the image of the surface of a parcel 14 as it moves past the low resolution camera 16. The belt encoder 26 is selected to provide a pulse for each cycle of the high resolution camera 22. The high resolution camera 22 uses 4,096 pixels and has an exposure width of 16 inches. The belt encoder 26 is therefore selected to provide 4,096 pulses for every 16 inches of travel of the conveyor belt 12.
The low resolution camera includes 256 pixels and also has an exposure width of 16 inches.
The line clock signal 45 for the low resolution camera 16 therefore includes one trigger pulse for every 16 pulses of the belt encoder 26 so that the high resolution camera 22 has 16 cycles for each cycle of the low resolution camera 16. In this manner, the line images captured by the cameras 16 and 22 may be assembled by the host computer 30 into two-dimensional images with the correct aspect ratios (i.e., the ratio of the length of an image to its width).
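A short worked check of this clocking arithmetic, using only the figures stated above for the preferred configuration, is given below; it is illustrative, not part of the specification.

```python
ENCODER_PULSES_PER_16_INCHES = 4096   # one pulse per high resolution cycle
HIGH_RES_PIXELS = 4096                # across a 16 inch exposure width
LOW_RES_PIXELS = 256
LOW_RES_DIVIDER = 16                  # one low resolution trigger per 16 pulses

high_res_lines_per_inch = ENCODER_PULSES_PER_16_INCHES / 16         # 256.0
high_res_pixels_per_inch = HIGH_RES_PIXELS / 16                     # 256.0
low_res_lines_per_inch = high_res_lines_per_inch / LOW_RES_DIVIDER  # 16.0
low_res_pixels_per_inch = LOW_RES_PIXELS / 16                       # 16.0

# Lines per inch match pixels per inch for each camera, so the assembled
# two-dimensional images keep the correct aspect ratio.
assert high_res_lines_per_inch == high_res_pixels_per_inch
assert low_res_lines_per_inch == low_res_pixels_per_inch
```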
The low resolution camera 16 transmits a pixel clock signal 46 and a raw video signal 47 to the video processor 28. The raw video signal 47 comprises a sequence of analog line signals, each line signal comprising 256 CCD signals created by a cycle of the low resolution camera 16. The pixel clock signal 46 comprises 256 pixel pulses per cycle, one pixel pulse for each of the 256 CCD signals read during a cycle of the low resolution camera 16. The video processor 28, which is a standard one-bit A/D converter, includes a filter 48 that sharpens the raw video signal 47, an amplifier 49 that increases the magnitude of the raw video signal, and a comparator 50 that compares each CCD signal of the raw video signal 47 to a threshold value to produce a black/white video signal 54. As used herein, a black/white video signal is a pixelized signal with uniform foreground and background pixel values wherein each foreground pixel is "on" and each background pixel is "off". The threshold value is selected in view of the filter 20 so that foreground pixels are only produced by fluorescent indicia, such as that produced by fluorescent ink illuminated by ultraviolet light. Thus, the low resolution camera 16 captures an image of the fiduciary mark 34 without capturing an image of the underlying text 33.
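A software analogue of the comparator stage might look like the sketch below; the gain and threshold values are invented for illustration, and the real filtering and amplification are of course analog.

```python
def binarize_line(raw_line, gain=2.0, threshold=200.0):
    """Turn one 256-sample raw video line into a black/white line in which
    only bright (fluorescent) samples become foreground ("on") pixels."""
    return [1 if sample * gain > threshold else 0 for sample in raw_line]

raw = [10, 12, 150, 160, 11, 9]      # bright samples where the mark fluoresces
print(binarize_line(raw))            # [0, 0, 1, 1, 0, 0]
```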
The line clock signal 45 and the pixel clock signal 46 are transmitted from the video processor 28 to the host computer 30. It will be appreciated that these signals may be used to generate x,y coordinates in the OCR coordinate system 41 for each pixel of the black/white video signal 54. The black/white video signal 54 is transmitted from the video processor 28 to the host computer 30 where it is initially captured, one line at a time, in a 256 by one-bit FIFO low resolution line buffer 56. The CPU of the host computer 30 is interrupted each cycle of the line clock 45 to read the contents of the low resolution line buffer 56. The CPU first checks the status of a buffer empty bit 57 to determine whether the low resolution line buffer 56 contains data (i.e., whether any foreground pixels are included in the most recent line image captured by the low resolution camera 16), and reads the contents of the low resolution line buffer 56 only if the buffer empty bit 57 is not set.
The host computer 30 receives a new line image from the low resolution camera 16 each cycle of the line clock 45 and determines whether each line image contains foreground pixels. If a line image contains foreground pixels, the host computer 30 computes the x,y coordinates for the foreground pixels using the line clock signal 45 (for the x coordinate) and the pixel clock signal 46 (for the y coordinate) and places the x,y coordinates in the low resolution line buffer 56. The low resolution line buffer 56 captures and momentarily (i.e., for the duration of a cycle of the line clock 45) retains the x,y coordinates of the foreground pixels captured by the low resolution camera 16. The host computer sequentially reads the low resolution line buffer 56 and creates and stores a two-dimensional bit map image from the foreground pixel data read from the low resolution line buffer 56 in a general purpose computer memory 58. Those skilled in the art will appreciate that a signal from a belt encoder indicating the speed of a conveyor, a line-scan CCD camera, a FIFO buffer, and a general purpose computer memory can be used to produce and control the aspect ratio of a two-dimensional computer image of an object conveyed past the camera. See, for example, Shah et al., U.S. Patent No. 5,291,564, which is incorporated by reference.
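One possible, purely hypothetical rendering of the coordinate generation is sketched below: the running line-clock count supplies the x coordinate (belt travel) and the pixel-clock count within the line supplies the y coordinate (across the belt).

```python
def foreground_coordinates(black_white_line, line_clock_count):
    """Return (x, y) pairs for the "on" pixels of one black/white line."""
    return [(line_clock_count, pixel_clock_count)
            for pixel_clock_count, value in enumerate(black_white_line)
            if value]

print(foreground_coordinates([0, 1, 1, 0], line_clock_count=1234))
# [(1234, 1), (1234, 2)]
```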
The host computer 30 creates and stores a separate image for the fiduciary mark 34 on each parcel conveyed past the low resolution camera 16. Provided that a partially completed image of a fiduciary mark 34 has not been constructed, a new image is started each time foreground pixels are found in the low resolution line buffer 56. An image is considered complete when no foreground pixels are found in the low resolution line buffer 56 for 16 consecutive cycles of the line clock 45. The distance traveled by the conveyor 12 in 16 cycles of the line clock 45 is approximately four times the distance separating the large circle 36 and the small circle 37 of a nominal fiduciary mark 34. In this manner, the host computer 30 creates and stores a separate foreground pixel image for the fiduciary mark 34 on each parcel 14a through 14n. Those skilled in the art will appreciate that foreground images may be created and stored by the host computer 30 in various formats including bit maps and run-length encoded images.
The host computer 30 counts the cycles of the line clock 45 in a rolling counter 51. The rolling counter 51 is used to keep track of the positions of parcels in the x coordinate of the OCR coordinate system 41 (i.e., the direction in which the conveyor 12 travels). The host computer 30 determines the position and orientation of a fiduciary mark 34 and defines a region of interest 40 around the mark. The host computer 30 then determines the number of cycles of the line clock 45 required for the region of interest 40 to reach the field of view of the high resolution camera 22, adds this number to the value in the counter 51, and stores the result as a trigger value 52 for storing data subsequently captured by the high resolution camera 22. The host computer 30 determines that the region of interest 40 has reached the field of view of the high resolution camera 22 when the value in the counter 51 is equal to the trigger value 52. The host computer 30 then extracts from the high resolution buffer 68 the data within the region of interest 40 until the counter 51 increments a predetermined number of times (64 increments in the preferred configuration) and clears the trigger value 52. The extracted region of interest data is stored in the general purpose memory 58.
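The trigger bookkeeping can be pictured with the hedged sketch below. The 64-cycle window and the counter follow the text; the class, its methods, and the example cycle numbers are illustrative assumptions (832 cycles corresponds roughly to the 52 inch camera separation at 16 line-clock cycles per inch).

```python
ROI_CYCLES = 64   # four inch region of interest at 16 line-clock cycles per inch

class RoiTrigger:
    """Arms a trigger when a mark is located and opens a fixed capture window
    once the region of interest reaches the high resolution camera."""
    def __init__(self):
        self.trigger_value = None
        self.remaining = 0

    def arm(self, counter_now, cycles_to_camera):
        self.trigger_value = counter_now + cycles_to_camera   # trigger value

    def capture_this_cycle(self, counter_now):
        if self.trigger_value is not None and counter_now == self.trigger_value:
            self.remaining = ROI_CYCLES     # region has reached the camera
            self.trigger_value = None       # clear the trigger
        if self.remaining > 0:
            self.remaining -= 1
            return True                     # store this cycle's region data
        return False

trigger = RoiTrigger()
trigger.arm(counter_now=1000, cycles_to_camera=832)
captured = sum(trigger.capture_this_cycle(c) for c in range(1000, 2000))
print(captured)   # 64
```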
The host computer 30 provides a power supply to the high resolution camera 22, which provides a line clock signal 62, a pixel clock signal 64, and a grey-scale video signal 66 to the host computer. The line clock signal 62 and the pixel clock signal 64 for the high resolution camera 22 are analogous to the line clock signal 45 and the pixel clock signal 46 for the low resolution camera 16, except that they produce 16 cycles for each cycle of the low resolution camera. It will be appreciated that these signals may be used to generate x,y coordinates in the OCR coordinate system 41 for each pixel of the grey-scale video signal 66.
The operation of a conventional CCD camera such as the high resolution camera 22 in producing a grey-scale video signal such as the grey-scale video signal 66 is well known in the art and will not be further described herein.
The grey-scale video signal 66 is transmitted from the high resolution camera 22 to the host computer 30 where it is initially captured, one line at a time, in a 4,096 by eight-bit high resolution line buffer 68. The high resolution line buffer 68 captures and momentarily (i.e., for the duration of a cycle of the line clock 62) retains the grey-scale pixel data captured during a cycle of the high resolution camera 22. The host computer 30 sequentially reads the high resolution buffer 68, extracts data that is within the region of interest 40, and creates and stores a two-dimensional image of the region of interest 40 in the general purpose memory 58 of the host computer 30. Thus, only the high resolution data within the region of interest 40 is stored for processing by the host computer 30. A suitable method and system for producing a computer image from eight-bit grey-scale pixel data is described in the commonly owned pending U.S. Patent Application No. 08/380,732, "Method and Apparatus for Separating Foreground From Background in Images Containing Text" filed January 31, 1995, inventors Michael C. Moed and Izrail S. Gorian, which is incorporated by reference. Those skilled in the art will appreciate that grey-scale images may be created and stored by the host computer 30 in various formats including bit maps and run-length encoded images.
FIG. 5 is a logical flow diagram of the image processing method 500 performed by the two camera over-the-belt OCR system 10. In routine 510, a low resolution CCD camera 16 captures an image of a fluorescent ink fiduciary mark 34 located within the destination address block 32 on a parcel 14 (see FIG. 2). The steps associated with routine 510 will be described more particularly with reference to FIGS. 6-8. In the next step 520, the position and orientation of the fiduciary mark 34 is determined. In the next step 530, a region of interest 40 is defined about the fiduciary mark 34 (see FIG. 3). In routine 540, a high resolution CCD camera 22 captures a grey-scale image of the parcel 14 as it is conveyed through the field of view of the high resolution camera 22, and a host computer 30 extracts and stores a grey-scale image of the region of interest 40. The steps associated with routine 540 are described with more particularity with reference to FIG. 9. In step 550, the text image stored in routine 540 is rotated to coincide with the x axis of the OCR coordinate system 41. In step 560, the text image is displayed on a display device such as a computer screen and/or sent to a computer-implemented text reader.
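Step 550 can be illustrated with a small, non-authoritative sketch that rotates the stored region of interest by the mark's measured orientation so the text lines up with the x axis. It uses scipy.ndimage for brevity (not something the patent prescribes), and the sign of the angle depends on the image coordinate convention in use.

```python
import numpy as np
from scipy import ndimage

def deskew_text_image(roi_image, mark_orientation_deg):
    # Rotate by the opposite of the measured orientation; reshape=True keeps
    # the whole rotated image and cval fills the corners with white background.
    return ndimage.rotate(roi_image, -mark_orientation_deg,
                          reshape=True, order=1, cval=255)

roi = np.full((1024, 1024), 255, dtype=np.uint8)   # stand-in grey-scale ROI
straightened = deskew_text_image(roi, mark_orientation_deg=12.5)
print(straightened.shape)
```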
A suitable method and system for step 520 is described in the commonly owned pending U.S. Patent Application No. 08/419,176, "Method for Locating the Position and Orientation of a Fiduciary Mark" filed April 10, 1995, inventors James Stephen Morton and James Vincent Recktenwalt, which is incorporated by reference. A suitable method and system for step 550 is described in the commonly owned pending U.S. Patent Application No. (to be assigned), "Method and System for Fast Rotation of Run-Length Encoded Images" filed July 26, 1995, inventors Jie Zhu, Michael C. Moed, and Izrail S. Gorian, which is incorporated by reference.
FIG. 6 shows the steps associated with routine 510 (from FIG. 5) for storing a low resolution image captured with the first camera system 16. In step 602, the video processor 28 receives a signal from the belt encoder 26 indicating the speed of the conveyor belt 12. In step 604, the signal from the belt encoder 26 is used to generate a line clock signal 45. The line clock signal 45 for the low resolution camera 16 includes one pulse for each 16 pulses of the signal from the belt encoder 26. In step 606, the line clock signal 45 triggers a cycle of the low resolution camera 16. In step 608, the raw video data 47 and pixel clock signal 46 are returned from the low resolution camera 16 to the video processor 28. The raw video signal 47 is then filtered in step 610, amplified in step 612, and compared to a threshold in step 614 to produce a black/white video signal 54 in step 616. The black/white video signal 54 is then transmitted from the video processor 28 to a low resolution line buffer 56 in the host computer 30. The host computer 30 generates x,y coordinates for foreground pixel data in the black/white video signal 54 using the line clock signal 45 (for the x coordinate) and the pixel clock signal 46 (for the y coordinate). The coordinates of the foreground pixels are then captured in the low resolution line buffer 56 and ultimately stored for processing in the general purpose memory 58 of the host computer 30.
The low resolution image stored in the general purpose memory 58 is then processed by the host computer 30 in routine 618, which is described with more particularity with respect to FIGS. 6-8. The low resolution camera 16 continuously produces an image of the conveyor belt 12 and the parcels 14 carried by the conveyor belt, which is converted to the black/white video signal 54. The host computer 30 generates x,y coordinates for the foreground pixels (i.e., pixels exposed by the light produced by the fluorescent ink fiduciary mark) of the black/white video signal 54 and stores them in the general purpose memory 58 in the form of a two-dimensional bit map. A separate bit map is constructed for each fiduciary mark 34.
FIG. 7 shows the steps of routine 618 (from FIG. 6) for processing the black/white video data 54 stored in the low resolution line buffer 56. In step 702, the host computer 30 receives the black/white video data 54 from the video processor 28. In decision step 704, it is determined whether there are any foreground pixels in the black/white video signal 54. If there are no foreground pixels in the black/white video signal 54, the "no" branch is followed from step 704 to step 710 in which the CPU of the host computer 30 is interrupted.
If there are foreground pixels in the black/white video signal 54, the "yes" branch is followed from step 704 to step 706 in which the x,y coordinates of the foreground pixels are generated.
The x,y coordinates are defined with respect to the OCR coordinate system 41, with the y coordinate defined relative to an edge of the conveyor 12 and the x coordinate defined relative to the value in the counter 51. The line clock signal 45 is used to generate the x coordinate, and the pixel clock signal 46 is used to generate the y coordinate. Step 706 is followed by step 708 in which the x,y coordinates of the foreground pixels are captured and momentarily retained in the low resolution line buffer 56.
The "no" branch of decision step 704 and the "yes" branch (via steps 706 and 708) are followed by step 710 in which the CPU of the host computer 30 is interrupted. The CPU is interrupted each cycle of the line clock 45 regardless of whether foreground pixel data has been placed in the low resolution buffer 56 so that blank lines may be built into the bit map image of the fiduciary mark 34. In response to the interrupt signal generated during step 710, the low resolution line buffer 56 may be read in step 712. In routine 714, a two-dimensional bit map image of a fiduciary mark 34 is created. The steps associated with routine 714 are described with more particularity with reference to FIG. 8.
FIG. 8 shows the steps of routine 714 (from FIG. 7) for creating a low resolution image of a fiduciary mark 34. The first step of routine 714 is decision step 802 in which the empty bit 57 of the low resolution buffer 56 is checked. The empty bit 57 may be a software register, or it may be a pin that is latched to a predetermined voltage. The empty bit 57 indicates whether there is data in the low resolution buffer. If the buffer empty bit 57 is not set, the "no" branch is followed from step 802 to step 804 in which the contents of the low resolution line buffer 56 are read and the buffer is cleared. Step 804 is followed by decision step 806 in which it is determined whether there is a partial image of a fiduciary mark stored in the general purpose memory 58 of the host computer 30. If there is no partial image stored, the "no" branch is followed to step 808 in which a new fiduciary mark image is started. If there is a partial image stored, the "yes" branch is followed from step 806 to step 810 in which the data from the low resolution line buffer 56 is added to the partial image. Steps 808 and 810 are followed by step 812 in which the computer implemented routine 510 loops back to step 602 (FIG. 6) to begin another cycle.
Referring again to step 802, if the empty bit is set, the "yes" branch is followed from step 802 to decision step 814 in which it is determined whether the low resolution line buffer 56 has been empty for 16 consecutive cycles of the line clock 45 (i.e., 16 consecutive blank lines). If there have not been 16 consecutive blank lines, the "no" branch is followed to decision step 816 in which it is determined whether a partial image is stored. If a partial image is not stored, the "no" branch is followed to the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. If a partial image is stored, step 816 is followed by step 818 in which a blank line is added to the partially stored image. Step 818 is then followed by the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. Referring again to step 814, if there have been 16 consecutive blank lines read from the low resolution line buffer 56, the "yes"
branch is followed from step 814 to the decision step 820 in which it is determined whether a partial image is stored. If a partial image is not stored, the "no" branch is followed from step 820 to the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle. If a partial image is stored, the "yes" branch is followed from step 820 to step 822 in which the low resolution image is deemed to be complete. Step 822 is then followed by the continue step 812 in which the computer implemented routine 510 loops back to step 602 to begin another cycle.
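A condensed, hypothetical rendering of the image-assembly logic of FIG. 8 is shown below: foreground coordinates start or extend a partial mark image, and 16 consecutive empty cycles close the image out as complete. (Blank lines that fall inside a mark are simply skipped here rather than recorded, which is a simplification of step 818.)

```python
BLANK_LINES_TO_FINISH = 16

def assemble_mark_images(per_cycle_coords):
    """per_cycle_coords yields one list of foreground (x, y) pairs per cycle of
    the line clock (possibly empty). Yields the coordinates of each completed
    fiduciary mark image."""
    partial, blank_run = None, 0
    for coords in per_cycle_coords:
        if coords:                      # buffer not empty: start or extend image
            blank_run = 0
            partial = (partial or []) + list(coords)
        else:                           # buffer empty this cycle
            blank_run += 1
            if blank_run >= BLANK_LINES_TO_FINISH and partial is not None:
                yield partial           # low resolution image deemed complete
                partial = None

cycles = [[(0, 5)], [(1, 5)], [(1, 6)]] + [[]] * 16
print(len(list(assemble_mark_images(cycles))))   # 1
```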
Upon reaching step 822, a complete image of a fiduciary mark 34 has been stored in the general purpose memory 58 of the host computer 30. Referring to FIG. 5, steps 520 through 560, which require a complete image of a fiduciary mark 34, are then initiated. It will be appreciated that the computer-implemented routine 510 loops back to step 602 for each cycle of the line clock 45, whereas the steps 520 through 560 are only completed from time to time in response to a complete image of a fiduciary mark 34 having been constructed pursuant to routine 510. Thus step 822, which indicates that routine 510 has stored an image of a complete fiduciary mark 34 in the general purpose memory 58, is followed by steps 520 through 560, which require a complete image of a fiduciary mark 34, as well as a loop back to step 602 for another cycle of the line clock 45.
FIG. 9 shows the steps associated with routine 540 (from FIG. 5) for storing a high resolution image captured with the second camera system 22. In step 902, the high resolution camera 22 receives a signal from the belt encoder 26 indicating the speed of the conveyor belt 12. In step 904, the signal from the belt encoder 26 is used to generate a line clock signal 62 for the high resolution camera. In step 906, the line clock 62 triggers a cycle of the high resolution line camera 22 for each pulse of the signal received from the belt encoder 26. In step 908, the grey-scale video image 66 captured by a cycle of the high resolution line camera 22 is captured and momentarily (i.e., for the duration of one cycle of the line clock 62) retained in the high resolution line buffer 68.
Step 908 is followed by decision step 910 in which it is determined whether the data in the high resolution line buffer 68 corresponds to a line which is in the region of interest 40. Routine 540 determines the number of cycles of the line clock 45 required for the region of interest 40 to reach the field of view of the high resolution camera 22, adds this number to the value in the counter 51, and stores the result as a trigger value 52 for storing data subsequently captured by the high resolution camera 22. Routine 540 then determines that the region of interest 40 is within the field of view of the high resolution camera 22 when the value in the counter 51 is equal to the trigger value 52. The region of interest 40 remains in the field of view of the high resolution camera 22 for a predefined number of cycles of the line clock 45 (i.e., 64 cycles in the preferred configuration shown in FIG. 1, corresponding to a four inch wide region of interest and 16 cycles of the line clock 45 per inch of travel of the conveyor 12). The host computer 30 therefore extracts from the high resolution buffer 68 the data within the region of interest 40 until the counter 51 increments a predetermined number of times (64 increments in the preferred configuration) and clears the trigger value 52. The extracted region of interest data is stored in the general purpose memory 58.
Referring again to step 910, if the current line is not within the region of interest 40, the "no" branch is followed from step 910 back to step 902 for another cycle of the high resolution camera 22. If the line is within the region of interest 40, the "yes" branch is followed from step 910 to step 912 in which the portion of the data stored in the high resolution line buffer 68 which is within the region of interest 40 is extracted from the line buffer 68 and stored in the general purpose computer memory 58 of the host computer 30. Step 912 is followed by the decision step 914 in which it is determined whether the line stored in the general purpose computer memory 58 in step 912 was the last line of the region of interest 40. If it was not the last line, the "no" branch is followed back to step 902 for another cycle of the high resolution line camera 22. If it was the last line in the region of interest 40, the high resolution image of the region of interest 40 is deemed to be complete and the "yes" branch is followed to the continue step 916 which causes the image processing method 500 to exit routine 540. The continue step 916 is then followed by step 550 of the image processing method 500 shown on FIG. 5.
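The per-cycle decision of FIG. 9 might be summarized in code as below; the callback that says whether the current cycle falls inside the region of interest stands in for the counter/trigger comparison, and the buffer contents are simulated.

```python
ROI_LINES, ROI_COLUMNS = 64, 1024

def extract_roi(line_source, line_in_roi, left_column):
    """line_source yields 4,096-sample grey-scale lines, one per cycle of the
    high resolution line clock; line_in_roi(cycle) says whether that cycle's
    line lies within the region of interest."""
    stored = []
    for cycle, line in enumerate(line_source):
        if not line_in_roi(cycle):
            continue                                 # "no" branch: discard line
        stored.append(line[left_column:left_column + ROI_COLUMNS])
        if len(stored) == ROI_LINES:
            break                                    # last line: image complete
    return stored

lines = ([0] * 4096 for _ in range(200))
roi = extract_roi(lines, line_in_roi=lambda c: c >= 100, left_column=1536)
print(len(roi), len(roi[0]))   # 64 1024
```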
In view of the foregoing, it will be appreciated that the low resolution camera 16 captures a new line image each cycle of the line clock 45. The host computer 30 only computes the x,y coordinates for the foreground pixels in the resulting black/white video data stream 54 for ultimate storage in the general purpose memory 58. Similarly, the high resolution camera 22 captures a new grey-scale line image, which is captured in the high resolution line buffer 68, each cycle of the line clock 62. The host computer 30 then extracts from the high resolution line buffer 68, for ultimate storage in the general purpose memory 58, only the data that is within the region of interest 40. Thus, the two camera OCR system 10 limits the amount of low resolution data and the amount of high resolution data that must be stored in the general purpose memory 58 for processing by the host computer 30.
Those skilled in the art will appreciate that many different combinations of hardware will be suitable for practicing the present invention. Many commercially available substitutes, each having somewhat different cost and performance characteristics, exist for each of the physical components listed above. For example, the host computer 30 could be hard-wired logic, reconfigurable hardware, an application specific integrated circuit (ASIC), or some other equivalent means for implementing a set of instructions.
It should be understood that the foregoing relates only to the preferred embodiment of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the following claims.
Claims (21)
1. A method of capturing an image of readable indicia 33 on an object 14, comprising the steps of capturing an image of the object 14, creating a matrix defining a coordinate system corresponding to the image, and translating a set of pixel values corresponding to the image into the matrix, CHARACTERISED BY THE STEPS OF:
affixing a mark 34 to the object 14 such that the mark 34 and the readable indicia 33 are superimposed relative to each other;
capturing an image of the mark 34 by exposing a side of the object 14 to a first camera 16;
determining the position of the mark 34 with respect to the coordinate system;
defining a region 40 about the position of the mark 34; and capturing an image of the readable indicia 33 within the region 40 by exposing the same side of the object 14 to a second camera 22.
2. The method of Claim 1, wherein the mark 34 is configured to define an orientation of the mark 34, further comprising the step of:
determining the orientation of the mark 34 with respect to the coordinate system.
3. The method of Claim 2, further comprising the step of rotating the readable indicia 33 by an angle defined with respect to the orientation of the mark 34.
4. The method of Claim 2, wherein the mark 34 comprises fluorescent ink.
5. The method of Claim 4, wherein the indicia 33 comprises text.
6. The method of Claim 5, wherein the object 14 comprises a parcel to be shipped and the indicia 33 comprises the destination address for the parcel.
7. The method of Claim 2, wherein the first camera 16 comprises a low resolution CCD camera.
8. The method of Claim 7, wherein the second camera 22 comprises a high resolution CCD camera.
9. An article comprising a substrate 14 and readable indicia 33 positioned on the substrate 14, CHARACTERISED BY:
a mark 34 superimposed relative to the readable indicia 33 on the substrate;
the mark 34 being identifiable by exposing a side of the substrate 14 to scanning equipment 16; and the readable indicia 33 being separately identifiable by exposing the same side of the substrate 14 to scanning equipment 22.
10. The article of Claim 9, wherein the mark 34 is configured to define an orientation of the mark 34.
11. The article of Claim 10, wherein the mark 34 comprises fluorescent ink.
12. The article of Claim 11, wherein the indicia 33 comprises text.
13. The article of Claim 12, wherein the article comprises a parcel 14 to be shipped and the indicia 33 comprises the destination address for the parcel.
14. A system including an apparatus and an object, the apparatus for capturing an image of readable indicia 33 on the object 14, the apparatus comprising scanning equipment for capturing an image of the object 14 and a conveying system 12 for moving the object 14 adjacent to the scanning equipment, CHARACTERISED BY:
the object comprising a mark 34 superimposed relative to the readable indicia 33 on the object 14; and
the apparatus comprising,
a first camera 16 for capturing an image of the mark 34 by exposing a side of the object 14 to the first camera 16,
a second camera 22 for capturing an image of the readable indicia 33 by exposing the same side of the object 14 to the second camera 22, and
a computer 30 configured to,
process the image of the mark 34 to determine the position of the mark 34,
define a region 40 about the position of the mark 34, and
trigger the second camera 22 to capture an image of the readable indicia 33 by capturing an image of the region 40.
15. The system of Claim 14, wherein:
the mark 34 is configured to define an orientation of the mark 34; and the computer 30 is configured to process the image of the mark 34 to determine the orientation of the mark 34.
16. The system of Claim 15, wherein the computer 30 is further configured to process the image of the region 40 to rotate the image of the readable indicia 33 by an angle defined with respect to the orientation of the mark 34.
17. The system of Claim 15, wherein the mark 34 comprises fluorescent ink.
18. The system of Claim 17, wherein the indicia 33 comprises text.
19. The system of Claim 18, wherein the object 14 comprises a parcel to be shipped and the indicia 33 comprises the destination address for the parcel.
20. The system of Claim 15, wherein the first camera 16 comprises a low resolution CCD camera.
21. The system of Claim 20, wherein the second camera 22 comprises a high resolution CCD camera.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US41917695A | 1995-04-10 | 1995-04-10 | |
US53651295A | 1995-09-29 | 1995-09-29 | |
US08/536,512 | 1995-09-29 | ||
US08/419,176 | 1995-04-10
Publications (1)
Publication Number | Publication Date |
---|---|
CA2217369A1 (en) | 1996-10-17 |
Family
ID=27024375
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002217369A Abandoned CA2217369A1 (en) | 1995-04-10 | 1996-04-09 | Two-camera system for locating and storing indicia on conveyed items |
Country Status (5)
Country | Link |
---|---|
US (1) | US6236735B1 (en) |
EP (1) | EP0820618A1 (en) |
JP (1) | JPH11501572A (en) |
CA (1) | CA2217369A1 (en) |
WO (1) | WO1996032692A1 (en) |
Families Citing this family (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10334221A (en) * | 1997-05-30 | 1998-12-18 | Fanuc Ltd | Image acquisition device |
JPH1131225A (en) * | 1997-07-10 | 1999-02-02 | Toshiba Corp | Detecting device for label or the like and detection processor |
US6715686B1 (en) | 1998-04-30 | 2004-04-06 | C Technologies Ab | Device for recording information in different modes |
WO1999060515A1 (en) * | 1998-04-30 | 1999-11-25 | C Technologies Ab | Device for recording information in different modes |
US6924781B1 (en) * | 1998-09-11 | 2005-08-02 | Visible Tech-Knowledgy, Inc. | Smart electronic label employing electronic ink |
US6970182B1 (en) * | 1999-10-20 | 2005-11-29 | National Instruments Corporation | Image acquisition system and method for acquiring variable sized objects |
US6585159B1 (en) * | 1999-11-02 | 2003-07-01 | Welch Allyn Data Collection, Inc. | Indicia sensor system for optical reader |
JP2003532239A (en) * | 2000-05-03 | 2003-10-28 | レナード ライフェル | Dual mode data drawing product |
FR2812226B1 (en) * | 2000-07-25 | 2002-12-13 | Mannesmann Dematic Postal Automation Sa | PROCESS FOR PROCESSING LARGE POSTAL OBJECTS IN A SORTING INSTALLATION |
JP4095243B2 (en) * | 2000-11-28 | 2008-06-04 | キヤノン株式会社 | A storage medium storing a URL acquisition and processing system and method and a program for executing the method. |
WO2002049340A1 (en) * | 2000-12-15 | 2002-06-20 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data input product |
AUPR245301A0 (en) * | 2001-01-10 | 2001-02-01 | Silverbrook Research Pty Ltd | An apparatus (WSM06) |
US6730926B2 (en) | 2001-09-05 | 2004-05-04 | Servo-Robot Inc. | Sensing head and apparatus for determining the position and orientation of a target object |
US6817610B2 (en) * | 2001-12-03 | 2004-11-16 | Siemens Aktiengesellschaft | Multiples detect apparatus and method |
US20030137590A1 (en) * | 2002-01-18 | 2003-07-24 | Barnes Danny S. | Machine vision system with auxiliary video input |
US8146823B2 (en) * | 2002-01-18 | 2012-04-03 | Microscan Systems, Inc. | Method and apparatus for rapid image capture in an image system |
US7118042B2 (en) * | 2002-01-18 | 2006-10-10 | Microscan Systems Incorporated | Method and apparatus for rapid image capture in an image system |
US7058203B2 (en) * | 2002-05-14 | 2006-06-06 | Lockheed Martin Corporation | Region of interest identification using region of adjacent pixels analysis |
US6829937B2 (en) * | 2002-06-17 | 2004-12-14 | Vti Holding Oy | Monolithic silicon acceleration sensor |
US20040076321A1 (en) * | 2002-07-16 | 2004-04-22 | Frank Evans | Non-oriented optical character recognition of a wafer mark |
US6878896B2 (en) | 2002-07-24 | 2005-04-12 | United Parcel Service Of America, Inc. | Synchronous semi-automatic parallel sorting |
US6753899B2 (en) | 2002-09-03 | 2004-06-22 | Audisoft | Method and apparatus for telepresence |
US7121469B2 (en) * | 2002-11-26 | 2006-10-17 | International Business Machines Corporation | System and method for selective processing of digital images |
US7034317B2 (en) * | 2002-12-17 | 2006-04-25 | Dmetrix, Inc. | Method and apparatus for limiting scanning imaging array data to characteristics of interest |
US7063256B2 (en) * | 2003-03-04 | 2006-06-20 | United Parcel Service Of America | Item tracking and processing systems and methods |
WO2004079546A2 (en) * | 2003-03-04 | 2004-09-16 | United Parcel Service Of America, Inc. | System for projecting a handling instruction onto a moving item or parcel |
DE602004032172D1 (en) * | 2003-09-03 | 2011-05-19 | Visible Tech Knowledgy Inc | ELECTRONICALLY UPDATABLE LABEL AND DISPLAY |
US7406196B2 (en) * | 2004-03-19 | 2008-07-29 | Lockheed Martin Corporation | Methods and systems for automatic detection of corners of a region |
US20050276508A1 (en) * | 2004-06-15 | 2005-12-15 | Lockheed Martin Corporation | Methods and systems for reducing optical noise |
US20050281440A1 (en) * | 2004-06-18 | 2005-12-22 | Pemer Frederick A | Iris feature detection and sensor-based edge detection |
US7561717B2 (en) * | 2004-07-09 | 2009-07-14 | United Parcel Service Of America, Inc. | System and method for displaying item information |
EP1661834A1 (en) * | 2004-11-25 | 2006-05-31 | Kba-Giori S.A. | Marking for printed matter |
US7809158B2 (en) * | 2005-05-02 | 2010-10-05 | Siemens Industry, Inc. | Method and apparatus for detecting doubles in a singulated stream of flat articles |
US20080012981A1 (en) * | 2006-07-07 | 2008-01-17 | Goodwin Mark D | Mail processing system with dual camera assembly |
US20080013069A1 (en) * | 2006-07-07 | 2008-01-17 | Lockheed Martin Corporation | Synchronization of strobed illumination with line scanning of camera |
US20080049972A1 (en) * | 2006-07-07 | 2008-02-28 | Lockheed Martin Corporation | Mail imaging system with secondary illumination/imaging window |
US7855348B2 (en) * | 2006-07-07 | 2010-12-21 | Lockheed Martin Corporation | Multiple illumination sources to level spectral response for machine vision camera |
US20080035866A1 (en) * | 2006-07-07 | 2008-02-14 | Lockheed Martin Corporation | Mail imaging system with UV illumination interrupt |
US7775431B2 (en) * | 2007-01-17 | 2010-08-17 | Metrologic Instruments, Inc. | Method of and apparatus for shipping, tracking and delivering a shipment of packages employing the capture of shipping document images and recognition-processing thereof initiated from the point of shipment pickup and completed while the shipment is being transported to its first scanning point to facilitate early customs clearance processing and shorten the delivery time of packages to point of destination |
US20080302633A1 (en) * | 2007-06-05 | 2008-12-11 | Snow Gerald F | Apparatus and method for coating and inspecting objects |
US20080306841A1 (en) * | 2007-06-07 | 2008-12-11 | Caterpillar Inc. | System and method for processing orders |
FR2926899B1 (en) * | 2008-01-28 | 2010-02-12 | Solystic | DEVICE FOR ACQUIRING IMAGES OF POSTAL SHIPMENTS COMPRISING FLUORESCENT BRANDS AND BRANDS IN CONTRAST. |
EP2283472B1 (en) * | 2008-05-05 | 2021-03-03 | Iomniscient Pty Ltd | A system and method for electronic surveillance |
US20100119142A1 (en) * | 2008-11-11 | 2010-05-13 | Sean Miceli | Monitoring Multiple Similar Objects Using Image Templates |
US8780206B2 (en) * | 2008-11-25 | 2014-07-15 | De La Rue North America Inc. | Sequenced illumination |
IT1394202B1 (en) * | 2008-12-15 | 2012-06-01 | Meccanotecnica Spa | RECOGNITION OF ARTICLES BASED ON COMBINATION OF CODES AND IMAGES |
US9418496B2 (en) * | 2009-02-17 | 2016-08-16 | The Boeing Company | Automated postflight troubleshooting |
US9541505B2 (en) * | 2009-02-17 | 2017-01-10 | The Boeing Company | Automated postflight troubleshooting sensor array |
US8812154B2 (en) * | 2009-03-16 | 2014-08-19 | The Boeing Company | Autonomous inspection and maintenance |
US9046892B2 (en) * | 2009-06-05 | 2015-06-02 | The Boeing Company | Supervision and control of heterogeneous autonomous operations |
KR101140317B1 (en) * | 2009-08-25 | 2012-05-02 | 주식회사 제우스 | Camera alternating apparatus and dual line producing apparatus and method using same |
US8773289B2 (en) | 2010-03-24 | 2014-07-08 | The Boeing Company | Runway condition monitoring |
US8712634B2 (en) * | 2010-08-11 | 2014-04-29 | The Boeing Company | System and method to assess and report the health of landing gear related components |
US8599044B2 (en) | 2010-08-11 | 2013-12-03 | The Boeing Company | System and method to assess and report a health of a tire |
US20120200742A1 (en) | 2010-09-21 | 2012-08-09 | King Jim Co., Ltd. | Image Processing System and Imaging Object Used For Same |
CN103353937A (en) | 2010-09-21 | 2013-10-16 | 株式会社锦宫事务 | Imaging object |
US8982207B2 (en) * | 2010-10-04 | 2015-03-17 | The Boeing Company | Automated visual inspection system |
US9117185B2 (en) | 2012-09-19 | 2015-08-25 | The Boeing Company | Forestry management system |
US20140175289A1 (en) * | 2012-12-21 | 2014-06-26 | R. John Voorhees | Conveyer Belt with Optically Visible and Machine-Detectable Indicators |
KR20150060338A (en) * | 2013-11-26 | 2015-06-03 | 삼성전자주식회사 | Electronic device and method for recogniting character in electronic device |
US9646369B2 (en) | 2014-03-11 | 2017-05-09 | United Parcel Service Of America, Inc. | Concepts for sorting items using a display |
US9952160B2 (en) * | 2014-04-04 | 2018-04-24 | Packaging Corporation Of America | System and method for determining an impact of manufacturing processes on the caliper of a sheet material |
WO2017109801A1 (en) | 2015-12-24 | 2017-06-29 | Datalogic Ip Tech S.R.L. | Coded information reader |
US10471478B2 (en) | 2017-04-28 | 2019-11-12 | United Parcel Service Of America, Inc. | Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same |
RU2661760C1 (en) | 2017-08-25 | 2018-07-19 | Общество с ограниченной ответственностью "Аби Продакшн" | Multiple chamber using for implementation of optical character recognition |
ES2745066T3 (en) * | 2017-09-06 | 2020-02-27 | Sick Ag | Camera device and method to record a stream of objects |
Family Cites Families (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3444517A (en) | 1965-03-09 | 1969-05-13 | Control Data Corp | Optical reading machine and specially prepared documents therefor |
DE1816816A1 (en) | 1967-12-28 | 1969-08-14 | Tokyo Shibaura Electric Co | Position and direction determination device using labels or patterns |
US3663814A (en) | 1969-01-29 | 1972-05-16 | Resource Data Corp | System for delineating selective response of a material to radiation in presence of visible illumination |
US3662181A (en) | 1970-04-22 | 1972-05-09 | American Cyanamid Co | Scanning apparatus for the detection and identification of luminescing code bars on articles |
US3666946A (en) | 1970-09-29 | 1972-05-30 | Ncr Co | Automatic information reading system using photoluminescent detection means |
US3674924A (en) | 1970-10-02 | 1972-07-04 | Ncr Co | Document scanning and display system |
CA2047821A1 (en) | 1990-09-17 | 1992-03-18 | Wayne A. Buchar | Electronic filing system recognizing highlighted original to establish classification and retrieval |
US3744025A (en) | 1971-02-25 | 1973-07-03 | I Bilgutay | Optical character reading system and bar code font therefor |
USRE29104E (en) * | 1971-08-18 | 1977-01-04 | Cognitronics Corporation | Method of scanning documents to read characters thereon without interference from visible marks on the document which are not to be read by the scanner |
US3891324A (en) | 1972-05-01 | 1975-06-24 | Oliver Machinery Co | Method and apparatus for inspecting position of labels on product surfaces |
US3801775A (en) | 1972-08-07 | 1974-04-02 | Scanner | Method and apparatus for identifying objects |
US3885229A (en) | 1972-10-28 | 1975-05-20 | Nippon Electric Co | Document scanning apparatus |
US3949363A (en) | 1974-06-28 | 1976-04-06 | Recognition Equipment, Incorporated | Bar-Code/MICR/OCR merge |
DE3067771D1 (en) | 1979-10-23 | 1984-06-14 | Scantron Gmbh | Method and device for the identification of objects |
JPS56129981A (en) * | 1980-03-14 | 1981-10-12 | Toshiba Corp | Optical character reader |
US4402088A (en) | 1981-04-09 | 1983-08-30 | Recognition Equipment Incorporated | OCR And bar code reading using area array |
US4408344A (en) | 1981-04-09 | 1983-10-04 | Recognition Equipment Incorporated | OCR and Bar code reader using multi port matrix array |
US4542528A (en) | 1981-04-09 | 1985-09-17 | Recognition Equipment Incorporated | OCR and bar code reader with optimized sensor |
US4411016A (en) | 1981-06-01 | 1983-10-18 | Recognition Equipment Incorporated | Barcode width measurement system |
DE3214621A1 (en) | 1982-04-20 | 1983-10-20 | Siemens AG, 1000 Berlin und 8000 München | COMBINED OPTICAL HAND READER FOR MECHANICAL CHARACTER RECOGNITION WITH INTEGRATED OPTICAL SYSTEM |
GB2136242B (en) * | 1983-02-10 | 1986-04-09 | Atomic Energy Authority Uk | Superimposition in a video display system |
JPS6143328A (en) | 1984-08-07 | 1986-03-01 | Nippon Denki Kaigai Shijiyou Kaihatsu Kk | Optical digitizer |
DE3507569A1 (en) | 1985-03-04 | 1986-09-04 | Erhardt & Leimer GmbH, 8900 Augsburg | DEVICE FOR DETECTING LABELS ON MOVING LABEL CARRIERS |
NL8501460A (en) | 1985-05-22 | 1986-12-16 | Philips Nv | METHOD FOR IDENTIFYING OBJECTS INCLUDING A CODE FIELD WITH DOT CODE, DEVICE FOR IDENTIFYING SUCH A DOT CODE, AND PRODUCT USING SUCH A DOT CODE |
US4776464A (en) | 1985-06-17 | 1988-10-11 | Bae Automated Systems, Inc. | Automated article handling system and process |
US4760247A (en) | 1986-04-04 | 1988-07-26 | Bally Manufacturing Company | Optical card reader utilizing area image processing |
US4736109A (en) | 1986-08-13 | 1988-04-05 | Bally Manufacturing Company | Coded document and document reading system |
US4955062A (en) | 1986-12-10 | 1990-09-04 | Canon Kabushiki Kaisha | Pattern detecting method and apparatus |
US4822986A (en) | 1987-04-17 | 1989-04-18 | Recognition Equipment Incorporated | Method of detecting and reading postal bar codes |
US4924078A (en) | 1987-11-25 | 1990-05-08 | Sant Anselmo Carl | Identification symbol, system and method |
US4972494A (en) | 1988-02-26 | 1990-11-20 | R. J. Reynolds Tobacco Company | Package inspection system |
US4896029A (en) | 1988-04-08 | 1990-01-23 | United Parcel Service Of America, Inc. | Polygonal information encoding article, process and system |
US4874936A (en) | 1988-04-08 | 1989-10-17 | United Parcel Service Of America, Inc. | Hexagonal, information encoding article, process and system |
US4998010A (en) | 1988-04-08 | 1991-03-05 | United Parcel Service Of America, Inc. | Polygonal information encoding article, process and system |
US5132808A (en) | 1988-10-28 | 1992-07-21 | Canon Kabushiki Kaisha | Image recording apparatus |
US5003613A (en) * | 1988-12-21 | 1991-03-26 | Recognition Equipment Incorporated | Document processing system and method |
US5291002A (en) * | 1989-06-28 | 1994-03-01 | Z Mark International Inc. | System for generating machine readable codes to facilitate routing of correspondence using automatic mail sorting apparatus |
US5138465A (en) | 1989-09-14 | 1992-08-11 | Eastman Kodak Company | Method and apparatus for highlighting nested information areas for selective editing |
DE3942932A1 (en) * | 1989-12-23 | 1991-06-27 | Licentia Gmbh | METHOD FOR DISTRIBUTING PACKAGES O. AE. |
FR2657982B1 (en) * | 1990-02-02 | 1992-11-27 | Cga Hbs | METHOD FOR LOCATING AN ADDRESS ON SORTING ARTICLES, ADDRESSING LABEL AND DEVICE FOR IMPLEMENTING THE METHOD. |
US5155343A (en) | 1990-03-28 | 1992-10-13 | Chandler Donald G | Omnidirectional bar code reader with method and apparatus for detecting and scanning a bar code symbol |
US5065237A (en) | 1990-08-01 | 1991-11-12 | General Electric Company | Edge detection using patterned background |
US5120940A (en) | 1990-08-10 | 1992-06-09 | The Boeing Company | Detection of barcodes in binary images with arbitrary orientation |
US5138140A (en) | 1990-08-22 | 1992-08-11 | Symbol Technologies, Inc. | Signature capture using electro-optical scanning |
US5135569A (en) * | 1990-08-24 | 1992-08-04 | W. R. Grace & Co.-Conn. | Ink composition containing fluorescent component and method of tagging articles therewith |
US5189292A (en) | 1990-10-30 | 1993-02-23 | Omniplanar, Inc. | Finder pattern for optically encoded machine readable symbols |
US5153418A (en) | 1990-10-30 | 1992-10-06 | Omniplanar, Inc. | Multiple resolution machine readable symbols |
IT1246931B (en) | 1991-04-19 | 1994-11-29 | Gd Spa | METHOD FOR THE CONTROL, THROUGH CAMERA, OF PRODUCTS WRAPPED WITH TRANSLUCENT TRANSPARENT MATERIAL. |
US5199084A (en) | 1991-08-16 | 1993-03-30 | International Business Machines Corporation | Apparatus and method for locating characters on a label |
US5308960A (en) | 1992-05-26 | 1994-05-03 | United Parcel Service Of America, Inc. | Combined camera system |
US5327171A (en) | 1992-05-26 | 1994-07-05 | United Parcel Service Of America, Inc. | Camera system optics |
US5307423A (en) | 1992-06-04 | 1994-04-26 | Digicomp Research Corporation | Machine recognition of handwritten character strings such as postal zip codes or dollar amount on bank checks |
US5326959A (en) | 1992-08-04 | 1994-07-05 | Perazza Justin J | Automated customer initiated entry remittance processing system |
US5343028A (en) | 1992-08-10 | 1994-08-30 | United Parcel Service Of America, Inc. | Method and apparatus for detecting and decoding bar code symbols using two-dimensional digital pixel images |
WO1994006247A1 (en) * | 1992-09-08 | 1994-03-17 | Paul Howard Mayeaux | Machine vision camera and video preprocessing system |
US5331151A (en) | 1993-01-25 | 1994-07-19 | Pressco Technology, Inc. | Multiple envelope detector |
US5365597A (en) | 1993-06-11 | 1994-11-15 | United Parcel Service Of America, Inc. | Method and apparatus for passive autoranging using relaxation |
US5865471A (en) * | 1993-08-05 | 1999-02-02 | Kimberly-Clark Worldwide, Inc. | Photo-erasable data processing forms |
US5542971A (en) * | 1994-12-01 | 1996-08-06 | Pitney Bowes | Bar codes using luminescent invisible inks |
US5642442A (en) * | 1995-04-10 | 1997-06-24 | United Parcel Services Of America, Inc. | Method for locating the position and orientation of a fiduciary mark |
US5770841A (en) * | 1995-09-29 | 1998-06-23 | United Parcel Service Of America, Inc. | System and method for reading package information |
- 1996
- 1996-04-09 JP JP8531185A patent/JPH11501572A/en active Pending
- 1996-04-09 CA CA002217369A patent/CA2217369A1/en not_active Abandoned
- 1996-04-09 WO PCT/US1996/005019 patent/WO1996032692A1/en not_active Application Discontinuation
- 1996-04-09 EP EP96911719A patent/EP0820618A1/en not_active Withdrawn
- 1997
- 1997-11-07 US US08/967,287 patent/US6236735B1/en not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
WO1996032692A1 (en) | 1996-10-17 |
EP0820618A1 (en) | 1998-01-28 |
US6236735B1 (en) | 2001-05-22 |
JPH11501572A (en) | 1999-02-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6236735B1 (en) | Two camera system for locating and storing indicia on conveyed items | |
EP0820617B1 (en) | Location of the position and orientation of a fiduciary mark | |
EP0852520B1 (en) | System and method for reading package information | |
JP3207419B2 (en) | Parcel distribution method | |
US4516265A (en) | Optical character reader | |
JP2575539B2 (en) | How to locate and identify money fields on documents | |
US7436979B2 (en) | Method and system for image processing | |
US5325447A (en) | Handwritten digit normalization method | |
EP0241259A2 (en) | Optical character recognition by detecting geographical features | |
US8369601B2 (en) | Method of processing a check in an image-based check processing system and an apparatus therefor | |
CN1052322C (en) | Improvements in image processing | |
EP0076332B1 (en) | Optical character reader with pre-scanner | |
JP2000057250A (en) | Reading method for two-dimensional code | |
JPH10328623A (en) | Postal address recognition device | |
JPH0228897A (en) | Address area detector | |
JPS59181595A (en) | Printed pattern reading device on moving individual article | |
JP2977219B2 (en) | Mail address reading device | |
JPH08155397A (en) | Postal matter classifying device and bar code printer | |
JPS61238379A (en) | Sectionalizing apparatus | |
JPH06282676A (en) | Address recognizing device | |
JPH11207265A (en) | Information processing device and mail processing device | |
JP2001312695A (en) | Device and method for detecting label area and recording medium | |
JPH03182981A (en) | Address reading and classifying machine | |
JPH08171607A (en) | Character line direction judging device | |
JP2001516281A (en) | How to recognize symbols and letters on the face of mail parcels |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
FZDE | Discontinued |