WO2006045878A1 - Method and system for sorting a post item in a delivery bin according to identification information - Google Patents

Method and system for sorting a post item in a delivery bin according to identification information

Info

Publication number
WO2006045878A1
Authority
WO
WIPO (PCT)
Prior art keywords
sorting
data
identifier data
rack
identifier
Prior art date
Application number
PCT/FI2004/000640
Other languages
French (fr)
Inventor
Rainer Kalevi Waltzer
Pekka Ilmari Nousiainen
Mikael Vilhelm Nyberg
Lasse Allan YLÄNEVA
Juha Kalervo Salminen
Arto Tampio
Jari Kalervo Paasikivi
Jyrki Olavi Janatuinen
Kari Juhani Hiltunen
Original Assignee
Suomen Posti Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suomen Posti Oyj filed Critical Suomen Posti Oyj
Priority to PCT/FI2004/000640 priority Critical patent/WO2006045878A1/en
Priority to EP04791431A priority patent/EP1812175A1/en
Publication of WO2006045878A1 publication Critical patent/WO2006045878A1/en
Priority to NO20072725A priority patent/NO20072725L/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C3/00Sorting according to destination

Definitions

  • the invention relates to a method and a system for sorting an object provided with identifier data, preferably a postal item, the sorting comprising both sorting into a rack and collecting from a rack by identifier data.
  • a sorting rack typically comprises delivery bins, into which the postal items are directed manually on the basis of data on an address tag.
  • Each distribution path can be allocated an individual sorting rack with bins, into which postal items are sorted and from which postal items are collected for delivery.
  • Preliminary postal delivery operations typically include steps for sorting postal items in a rack in accordance with address data, with sorting into the rack being performed by street, road or number, and the postal items collected from the rack manually according to a delivery list. These steps are time-consuming and require experienced staff. The work also requires printouts. Various partly automated work steps have been developed to facilitate sorting operations.
  • Prior art systems and methods for sorting postal items in a sorting rack equipped with bins have been depicted in US 6,881,890, for instance.
  • a computer-controlled scanner reads a bar code on an envelope, and then the corresponding address is searched in the databank and information about the bin number corresponding to the address is transmitted to an input/output circuit.
  • the input/output circuit is connected as an integrated part of a circuitry including an input line from an infrared detector in each bin and an output line to a guide light provided in each bin.
  • the computer transmits the address number to the input/output circuit, it turns on the guiding light of the associated bin, allowing the sorter to put the postal item into this particular bin.
  • the guide light of this bin is switched off and identification of the next postal item can be started.
  • the system described in US 6,881,890 comprises an output line of the input-output circuit connected to a warning light provided in each bin.
  • a warning light in this bin gives an alarm, so that the sorter may remove the item from the erroneous bin and resort it into the correct bin indicated by the guide light.
  • this system does not guarantee that the postal item is sorted into the correct bin, should the bar code be incorrectly read in the identification step or had it not been read at all. If the bar code is defective or torn or otherwise destroyed in the identification step, the bar code cannot be read and the postal item will require resorting carried out completely by hand.
  • the purpose of the invention is to eliminate the problems mentioned above and to provide a method and a system for sorting an object equipped with identifier data, in which an identification unit reads partial identifier data, completes the partial identifier data into corrected identifier data and compares the identifier data thus formed with sorting data stored in a database, the object being directed to the correct delivery bin on the basis of the comparison.
  • the method and system of the invention markedly increase sorting automation, because the number of incorrectly interpreted identifier data and thus of wrongly sorted objects decreases appreciably in the sorting step. Accordingly, the step of collecting objects is enhanced, because the proportion of incorrectly sorted objects has been minimised in the sorting step.
  • the collecting step is further enhanced by the fact that, whenever possible, the collecting area of the sorting rack is formed of those bins alone into which objects have been sorted, or that empty bins are not even examined.
  • sorting quality and efficiency are improved, thus reducing the time, work steps and costs required for sorting and speeding up delivery of the object.
  • the invention provides a system for sorting at least one object equipped with identifier data
  • the system comprising a sorting rack including several delivery bins and associated guide means, which are connected with the controller unit of a computer and a database including sorting data communicating with the computer
  • the system comprising an identification unit, which is connected to the controller unit of the computer and the database including sorting data communicating with the computer, in order to read identifier data
  • the system being characterised by the fact that the identification unit comprises reading means for reading at least partial identifier data of the object, processing means for complementing the read partial identifier data into corrected identifier data, and comparative means for comparing the corrected identifier data with sorting data stored in the database, the sorting data serving as a basis for controlling the sorting of the object by means of the guide means.
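As an orientation to the claim language above, the following is a minimal Python sketch of the read, complement and compare pipeline. All names (normalise, sort_object, the sample sorting data) are hypothetical, and the fuzzy matching via difflib is only one illustrative way to complement partial identifier data, not the method prescribed by the patent.

```python
# Hypothetical sketch: read partial identifier data, complement it against the
# sorting data in the database, and return the delivery bin number.
from difflib import get_close_matches
from typing import Optional


def normalise(text: str) -> str:
    """Reduce identifier data to a comparable form (assumed convention:
    upper-case letters and digits only)."""
    return "".join(ch for ch in text.upper() if ch.isalnum())


def sort_object(partial_identifier: str, sorting_data: dict) -> Optional[int]:
    """Complement the read partial identifier data into corrected identifier
    data and return the corresponding delivery bin number, or None."""
    keys = {normalise(k): bin_no for k, bin_no in sorting_data.items()}
    match = get_close_matches(normalise(partial_identifier), keys, n=1, cutoff=0.6)
    return keys[match[0]] if match else None


# Example: a partially read, distorted label still resolves to the right bin.
bins = {"MAPLE ROAD 12": 7, "OAK STREET 3": 2}
print(sort_object("Maple R0ad 1Z", bins))   # -> 7
```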
  • the sorting system comprises a switch connected to the controller unit of the computer and the guide means for setting the operating mode of the digitalised sorting rack into a mode for collecting from the rack, the collection list comprising the identifier data of the objects stored in a database, the collection list serving as a means for generating a control signal in the controller unit of the computer for transmission to the guide means of the delivery bins, and guide means for guiding the collection of the object under said control signal.
  • the invention provides a method for sorting at least one object equipped with identifier data in a sorting rack, which comprises a plurality of delivery bins, the method comprising steps for
  • the sorting method identifies and indicates the delivery bins for controlling the sorting rack and switches the mode of the digitalised sorting rack either into a mode for sorting into the rack or into a mode for collecting from the rack.
  • the invention provides a system for sorting an object for delivery, the system comprising a digitalised sorting rack with a plurality of delivery bins and associated guide means connected to the controller unit of a computer and a database equipped with sorting data communicating with the computer with a view to controlling the sorting for delivery, the system being characterised by the feature that said guide means are adapted to indicate by said sorting data the delivery bin presently in turn and to indicate discriminatingly the delivery bin next in turn with a view to controlling the sorting of the object for delivery.
  • the invention provides a method for sorting an object for delivery into a digitalised sorting rack comprising a plurality of delivery bins, the method comprising steps for
  • Figure 1a is a block diagram of the sorting system of one embodiment comprising a digitalised sorting rack
  • Figure 1b is a block diagram of a second embodiment of a sorting system comprising a digitalised sorting rack
  • FIG. 2 is a block diagram of an identification unit of one embodiment
  • Figure 3 is a block diagram of the identification unit of a second embodiment
  • Figure 4 shows the user interface of an application of the identification unit of one embodiment
  • Figure 5 is a flow chart of a method of one embodiment for sorting an object into a rack by the identifier data of the object
  • Figure 6 is a flow chart of a method of a second embodiment for sorting an object into a rack by the identifier data of the object
  • Figure 7 is a flow chart of a method of one embodiment for identifying an object for sorting
  • Figure 8 is a flow chart of a method of a second embodiment for identifying an object for sorting.
  • FIG. 1a shows an embodiment of the system of the invention for sorting an object equipped with identifier data.
  • the sorting rack 10 comprises a plurality of delivery bins 11, into which the objects to be sorted are sorted by the desired sorting criteria, and from which the sorted objects are collected by the desired collecting criteria.
  • sorting denotes both sorting the object into a rack, i.e. input, and collection of the object from the rack, i.e. output.
  • the sorting of the object is controlled by a computer 30 and an indicator means 13 installed in each delivery bin and a recognition means 15 installed in connection with each array of delivery bins 112 consisting of a plurality of delivery bins.
  • the array of delivery bins 112 is preferably a column of bins consisting of delivery bins 11, as shown with a dashed line 112 in figure 1a.
  • sorting the object into the rack and collecting the object from the rack are mutually independent operations.
  • sorting into the rack is a separate independent function and collection from the rack is a separate independent function.
  • Identifier data implies data about the object to be sorted that are needed for carrying out the sorting, in other words, for transferring the object into the correct bin.
  • the identifier data may be any data allowing the object to be sorted into the rack.
  • the identifier data may consist of address data or a technical identifier, such as address identifier data, delivery point identifier data or any similar data for delivery control, so that the object does not require any actual address data at all.
  • the identifier data may be provided in different forms in the object, such as in the form of alphanumerical signs, bar codes, a radio-frequency identifier, preferably as an RFID, an optical identifier or an electrical identifier.
  • the identifier data may be printed out or programmed directly in the object, on a tag to be affixed to the object, a printed circuit board or any other substrate, or it may be integrated in the object in any similar manner known per se.
  • substrate will be used below to designate the various manners for connecting the identifier data to the object mentioned above.
  • a system of the invention comprising a digitalised sorting rack 10, is controlled by a computer 30 comprising at least one controller unit and a memory, and also by a switch SW 40 and an I/O controller I/O 50, which are connected both to the controller unit of the computer and to the rack.
  • the switch 40 is connected between the controller unit and the I/O controller 50 and the I/O controller is connected between the controller unit and the bins 11 of the rack.
  • the switch SW has the function of switching the digitalised sorting rack either into a mode for sorting into the rack or into a mode for collecting from the rack.
  • connection 131 is arranged from the I/O controller 50 to the indicator means 13 of each bin and a connection 151 to the recognition means 15 of each array of delivery bins 112.
  • the connections 131, 151 can be carried out as wire communication lines consisting of output lines 131 and input lines 151.
  • the lines 131, 151 preferably form a bus controlled from the controller unit of the computer.
  • the I/O controller may be in wireless communication, instead of communication lines, with the indicator means 13 provided in each bin, and e.g. communicate by radio or optical means with the recognition means 15 provided in each bin array, and then the I/O controller, the indicator and recognition means are equipped with the transmission and reception means required for the communication in question.
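The controller side described above can be modelled compactly as below. The class and field names are hypothetical; the patent does not prescribe any particular data structure, bus protocol or programming interface for the I/O controller 50, the indicator means 13 or the recognition means 15.

```python
from dataclasses import dataclass


@dataclass
class DeliveryBin:
    number: int
    indicator_on: bool = False        # signal light 13, driven over output line 131


@dataclass
class BinArray:                       # array of delivery bins 112 (column of bins)
    bins: list
    sensor_triggered: bool = False    # recognition means 15, read over input line 151


@dataclass
class IOController:                   # I/O controller 50 behind switch SW 40
    arrays: list
    mode: str = "sort"                # "sort" (into the rack) or "collect" (from the rack)

    def indicate(self, bin_no: int) -> None:
        """Turn on the signal light of one bin and turn off all others."""
        for array in self.arrays:
            for b in array.bins:
                b.indicator_on = (b.number == bin_no)

    def acknowledge(self, array_index: int) -> None:
        """Record that the recognition means of an array detected an item."""
        self.arrays[array_index].sensor_triggered = True


# Example: a rack section of two columns with three bins each.
rack = IOController([BinArray([DeliveryBin(n) for n in range(c * 3, c * 3 + 3)])
                     for c in range(2)])
rack.indicate(4)
```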
  • the sorting of the object is controlled by means of an indicator means 13, preferably a signal light or any other light-emitting component, installed in each delivery bin 11, and a recognition means 15, preferably a distance sensor, installed in connection with each array of delivery bins 112, preferably a column of bins.
  • the signal light 13 indicates to the sorter into which bin 11 the object presently in turn shall be placed, and the distance sensor 15 identifies the input of the object into the bin, and then the signal light is turned off.
  • the distance sensor 15 identifies the input of the object into a bin
  • the identifier data of the new object are read, and as a result of this, the bin into which the new object shall be placed is indicated to the sorter, and the signal light of the preceding bin is switched off.
  • the delivery bin indicated for the new object may be the same bin as the one indicated for the preceding object.
  • the signal light 13 indicates to the sorter the bin from which the object presently in turn shall be collected, and the distance sensor 15 identifies the collection of the object from the bin in the manner described in connection with figure 5.
  • the light signal of this particular bin is turned off and the bin of the object to be collected next is indicated by the related light signal being turned on.
  • the collecting step simultaneously uses two mutually different light signals in two bins: the light signal of the bin holding the object to be collected next after the presently collected object indicates that this bin is next in turn after the bin presently being collected.
  • the light signals indicate the bin presently in turn to be collected and the bin next in turn to be collected with the light signal of the former continuously turned on and the light signal of the latter blinking, for instance.
  • the distance sensor of the bin to be collected next identifies collection of the object from the bin, it transmits an acknowledge signal of successful collection to the computer 30 acting as the central processing unit.
  • the computer shifts the light arrangement by one step forward, i.e. the light signal of the preceding, i.e. collected bin goes out, the light signal of the subsequent bin, i.e. the one presently in turn to be collected, stops blinking and is turned on continuously, and the light signal of the bin subsequent to the bin presently in turn, i.e. of the bin in turn to be collected next, starts blinking.
  • Both the light signals mentioned above may also be lighted continuously, yet with a mutually different light or emitting clearly different signals.
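The two-light collection scheme described above amounts to a very small state machine. The sketch below is illustrative only (the generator name and the print-based example are not from the patent): the steady light marks the bin presently in turn and the blinking light marks the bin next in turn.

```python
def collection_lights(collection_order):
    """Yield (steady_bin, blinking_bin) pairs; the caller advances one step
    each time the distance sensor acknowledges a successful collection."""
    for i, current in enumerate(collection_order):
        nxt = collection_order[i + 1] if i + 1 < len(collection_order) else None
        yield current, nxt            # light current steadily, blink the next


# Example: only bins 4, 9 and 12 hold sorted objects; empty bins are skipped.
lights = collection_lights([4, 9, 12])
print(next(lights))   # (4, 9)   -> collect from bin 4 while bin 9 blinks
print(next(lights))   # (9, 12)  -> sensor of bin 4 acknowledged the pick-up
print(next(lights))   # (12, None)
```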
  • the controller unit 30 of the computer is connected with an identifying unit 20 for at least partial reading and identification of the identifying data and a database DB 60, in which data required for sorting are stored, such as sorting data and collection lists.
  • the data stored in the database can be retrieved from other data systems when the sorting starts and they can be changed and/or updated in real time, provided that the database is integrated in the data network.
  • the identifying unit either includes a connected memory or uses the memory of the computer as its memory for storing the data it has identified.
  • one or more applications SOV1, SOV2 70, 80 have been stored in the memory connected to the computer controller unit for performing various functions as desired, such as identification, processing, correction and/or comparison.
  • the digitalised sorting rack 10 which consists of a plurality of bin columns, can be divided into several rack sections, whose sorting functions can be selected as mutually different.
  • the rack section comprises an array of bin columns 112, illustrated with a dashed line and reference 114 in figure 1a; bin columns can be combined into independently acting rack sections 114 by programming or other connections.
  • the numbers of bin columns or bin rows included in the rack section can be freely selected.
  • sorting the object into the rack and collecting the object from the rack are mutually independent operations.
  • Each rack section of the sorting rack then comprises conductors connected from the I/O controller 50 to this rack section and over a separate switch SW 40 to the computer 30.
  • the divided sorting rack allows the sorting functions to be determined by rack sections, so that the different sorting steps may proceed in parallel.
  • Each switch SW has the function of switching this rack section either in a mode for sorting into the rack or a mode for collecting from the rack.
  • Figure 1b illustrates a system of a second embodiment of the invention. It differs from the embodiment in figure 1a only in that, besides the indicator means 13, each bin has an individual recognition means 15. This provides a communication between the I/O controller 50 and the indicator means 13 of each bin and a communication 151 to the recognition means 15 of each bin 112.
  • the communications 131, 151 can be carried out as wire communications consisting of output lines 131 and input lines 151.
  • the communications 131, 151 preferably form a bus controlled from the controller unit of the computer.
  • the I/O controller may be in wireless communication with an indicator means 13 provided in each bin, and a recognition means 15 provided in each bin over the radio or optically, for instance, and then the I/O controller, the indicator and recognition means are equipped with appropriate transmission and reception means required for communication.
  • FIG. 2 is a block diagram of an identifying unit 20 of one embodiment.
  • an identifying unit 20 communicating with the sorting rack 10 comprises reading means 22, by means of which it reads the at least partial identifying data of the object 2 to be sorted, and a memory, in which it stores the identifying data it has read.
  • the memory (not illustrated) may be a separate or integrated storage means in the reading means or a memory card or the identifying unit may use the computer memory as its memory for storage of the data it has identified.
  • the camera KAM 22 reads, i.e. images in this case, the identifying data related to the object 2, which may be provided in the object as such or connected to a substrate 5 for instance.
  • the substrate 5 may be a printout of alphanumeric signs, a bar code tape, an infrared printout or any similar substrate affixed to the object, in which the readable identifying data are generated.
  • the camera 22 is connected to the controller unit and memory of the computer 30 and over this to the applications SOV1, SOV2 as shown in figure 1.
  • the computer, for instance, can be equipped with an IEEE 1394 PCI card for connecting the camera.
  • the identifying unit may further comprise a display 32, a keyboard 34 or a combination of these connected to the computer, such as a contact display screen, or any other data feed means, by means of which the applications SOV1, SOV2 can be controlled over their user interfaces. There may be one or more applications SOV1, SOV2 in use.
  • the camera is preferably a digital camera.
  • the camera resolution may be e.g. 768 × 1024 pixels and the pixel size may be e.g. 6.25 × 6.25 μm².
  • the camera may have an imaging rate of e.g. 15 full-pixel pictures per second.
  • the imaging distance, i.e. the distance between the front surface of the camera objective and the imaging area on the identification substrate, is e.g. 370 mm.
  • the object to be sorted is placed on the identification substrate 19 in connection with the identifying unit for reading of the identifying data in a manner such that the identifying data of the object get on the identification substrate, i.e. in this case on the imaging area of the camera 21 with the identifying data facing the camera.
  • This application generally uses the term imaging area to denote the identification area.
  • the camera is equipped with a lens 221, by means of which the imaging area of the camera can be adjusted as desired.
  • a light source 24 is further mounted in the vicinity of the camera and oriented so as to efficiently illuminate the imaging area adjusted with the lens 221 of the camera 21. This is particularly useful in spaces where sorting is performed with irregular illumination.
  • the light source 24 can be mounted in the same stand as the camera, in which it can be shifted and its light incident angle can be adjusted as well.
  • a lens (not illustrated in the figure) can be mounted in front of the light source, which preferably consists of one or more LEDs, the lens allowing collection of light from the one or more LEDs and focussing it efficiently onto the imaging area.
  • the focal length of the lens in front of the LEDs of the light source may be e.g. 50 mm and its diameter 25.4 mm.
  • the LEDs have the additional advantage of allowing pulsation e.g. at a frequency of 60 Hz, the light appearing as continuously switched on, and of having long service life and low power consumption.
  • the light source 24 preferably comprises five LEDs disposed e.g. in the shape of a cross.
  • the LEDs of the light source 24 indicate the centre and the corners of the imaging area.
  • the LEDs should be switched off when an image is taken in order to prevent their emission light from interfering with the identification of the identifying data.
  • imaging takes only about 10 ms, so that the substrate 5 including the identifying data can still be correctly positioned even though the LEDs are switched off for such a short period.
  • a second light source 26 is mounted on the other side of the identification substrate with respect to the camera, perpendicularly to the camera lens, the light source illuminating the identification substrate from the side opposite to the camera and forming light points 261 on the identification substrate.
  • the light points are formed on the identification substrate e.g. by drilling one or more holes, preferably five holes 261 in the shape of a cross, for instance, at the second light source 26.
  • the second light source 26 preferably comprises five LEDs, which are embedded e.g. in the shape of a cross in the identification substrate 19, preferably in a light or transparent plastic sheet.
  • the light detector detects when an object to be imaged is in the imaging area, the object covering at least part of the light points and preferably all of the light points, whereby the light received by the light detector from the second light source decreases or ceases.
  • the camera starts imaging the identifying data of the object, which should now be located in the imaging range on the identification substrate.
  • the first and the second light source 24, 26, each preferably comprising five LEDs in cross shape indicate the centre and the corners of the imaging area on the identification substrate.
  • the first light source 24 and the second light source 26 are disposed with respect to each other such that the camera pulsation is set to be synchronised into the same phase as the second light source and into the opposite phase relative to the first light source, so that the camera is activated to image the identifying data of the first object on the identification substrate under the control of said pulsation.
  • a third light source 28 is mounted in the vicinity of the camera and is oriented so as to efficiently illuminate the imaging area adjusted by the lens 221 of the camera 21 on the identification substrate 19. This is especially useful in spaces where sorting is performed in irregular light conditions.
  • the light source 28 can also be mounted in the same stand as the camera, where it can be shifted in the horizontal plane and its incident angle can also be adjusted. This allows adjustment of the light incident angle so as to avoid mirror reflection from glossy samples.
  • a lens (not shown in the figures) can be mounted in front of the light source 28 to allow collection of light from one or more LEDs and focussing it on the imaging area on the identification substrate.
  • cross-shaped LEDs are mutually disposed so that the camera pulsation is set to be synchronised in the same phase as the third light source 28 and in the opposite phase relative to the first light source 24 and the second light source 26, so that the camera is activated to image the identifying data of the object on the identification substrate under the control of said pulsation.
  • the person who performs the sorting may carry a reading means 22, preferably a camera, as a means of automated sorting.
  • the person carrying out sorting should have both his hands free for handling the object.
  • the camera should be fixed to the sorter's shoulder, helmet or any other location leaving his hands free.
  • the moment of imaging could be selected e.g. using a push button provided on a glove that the sorter is wearing.
  • Data transmission between the camera and the computer can occur by wireless means, e.g. over the radio.
  • FIG. 3 shows a block diagram of an identifying unit 20 of a second embodiment.
  • an identifying unit 20 connected to a digitalised sorting rack 10 comprises reading means 21, by means of which it reads the at least partial identifying data of the object to be sorted 2, and a memory in which it stores the identifying data it has read.
  • the memory (not illustrated) may be a separate or integrated memory means or memory card in the reading means, or the identifying unit may use the computer memory as its data storage for storage of data it has identified.
  • the receiver RF 21 reads, in other words, receives in this case, identifying data on the substrate 5 connected to the object 2.
  • the substrate 5 comprises a transmitter means 23, which transmits receivable identifying data on e.g. radio frequency, in optical form or any similar form.
  • the receiver 21 is preferably a radio receiver RF which receives the radio signal transmitted by the radio transmitter 23 of the substrate 5, preferably an RFID signal.
  • the receiver 21 may also be an optical receiver, preferably an optical detector, which receives the optical signal transmitted by the optical transmitter 23 of the substrate 5.
  • the receiver 21 is connected to the controller unit and memory of the computer 30 and over this to applications SOV1, SOV2 as illustrated in figure 1.
  • the identifying unit may further comprise a display 32 connected to the computer, a keyboard 34 or a combination of these, such as a contact display, or any other data input means, allowing control of applications SOV1, SOV2 over their interfaces.
  • the object to be sorted is placed on the identification substrate 19 connected to the identifying unit for reading of the identifying data. There may be one or more applications SOVl, SOV2 in use.
  • the embodiments of the digital sorting rack 10 illustrated in figures 1-3 comprise applications SOV1, SOV2, 70, 80, which are intended to be stored in the memory of the computer 30 and to perform various necessary functions, such as identification, processing, correction and/or comparison. It is also possible to connect a display 32, a keyboard 34 or a combination of these, such as a contact display, or any other data input means, to the identifying unit 20, 21, 22 connected to the computer, these means allowing control of the applications SOV1, SOV2 over their interfaces.
  • the processor speed and the storage capacity affect the imaging, image manipulation and text reading speed of the apparatus.
  • a flat-panel display as such does not require much space.
  • the display can be used to indicate to the user when an image has been generated for identification and to indicate the results of the identification.
  • Figure 4 exemplifies the interface 90 of an application of an identifying unit of one embodiment.
  • the interfaces of these embodiments which are visualised on the computer display 32 and controlled by the keyboard 34, typically comprise storage selections 92, processing and identification selections 94 and windows 96a, 96b for showing the identification results.
  • the identified identifier data i.e. the data found in the database, is displayed in the window 940.
  • the interface additionally comprises signal lights 910 to guide the user and an ending push-button 920.
  • the application together with the interface is started e.g. from the start menu of Windows or any other operating system, and then the actual sorting application opens in the computer memory.
  • a first, second and third light source 24, 26, 28 are simultaneously switched on under the control of the controller unit of the computer in the identifying unit 20 of the sorting rack.
  • the first and the second light source indicate the centre and the corners of the imaging area and the pulsation of the identifying unit, preferably a digital camera, starts as described above.
  • the interface exemplified in figure 4 comprises three light signals, of which the first one 910a is switched on when the application is being started, the second light signal 910b is switched on when the initialisation is finished and the third light signal 910c is switched on when the object can be removed from the imaging area.
  • the first light signal 910a “database being initialised” is lighted if the user, when initialising the application, has clicked “yes” as a reply to the program's question whether there have been changes in the database since the previous use. As the initialisation of the database ends, the light signal 910a is switched off.
  • the second light signal 910b "identification completed, put another object in the imaging area" indicates when the identification is completed and a new object can be placed in the imaging area.
  • This light signal 910b is switched off when a new object is detected in the imaging area.
  • the third light signal 910c indicates when the object can be removed from the imaging area.
  • the light signal 910c is switched on immediately when an image of the object has been generated for processing.
  • the identification is completed and another object can be placed in the imaging area, the light signal 910c is switched off.
  • the sorting application is ended e.g. with the aid of the STOP push-button 920.
  • the application completes the identification that is going on and then stops the program. If the user wishes to continue the sorting after the program has stopped, he can do so e.g. by pressing the arrow key at the left upper corner. This restarts the application.
  • Figure 5 is a flow chart of a method for sorting at least one object equipped with identifier data into a sorting rack comprising a plurality of delivery bins.
  • the user of the digitalised sorting rack activates the rack 500 and sets the rack in the desired sorting mode 502.
  • the user is assumed to start by selecting the rack mode in step 502, and then the rack is set in the selected mode, while sorting data 504 are loaded into the database, unless they are already provided there.
  • the user of the rack grips the object to be identified and places it on the identification substrate, and then the at least partial identifier data are read and stored 506 in a memory connected to the identification unit 20.
  • the read identifier data are transferred from the identification unit to a first application stored in the computer memory, which compares the read identifier data with the sorting data 508 provided in the database. Unless the read identifier data comply with the sorting data, the second application stored in the computer memory complements the read incomplete identifier data to the correct identifier data 510, which correspond to the sorting data in the database. Then the controller unit of the computer indicates to the user the bin into which the object should be sorted 512 and the user places this object in the indicated bin 514. In step 516, the user decides whether to continue sorting into the rack or to stop sorting.
  • if the sorting into the rack is continued (NO), the user returns to step 506, where the identifier data of the new (following) object are read. If the sorting into the rack is ended (YES), the user proceeds to step 518, where he decides whether to continue the sorting at all (YES), returning to step 502, or whether to stop sorting (NO), proceeding to step 530.
  • a shift from step 516 to step 518 usually means that all the objects of one batch have been sorted into the correct bins.
  • in step 502 the user selects the mode collecting from the rack, and the rack is set into the selected mode, while the collection list is loaded into the database, unless it is already provided there.
  • the user is shown the bin from which the object is to be taken next 522 and, at the same time, in a distinguishable manner, the bin 524 next in turn, from which the object following the presently collected one is collected.
  • the object 526 is taken from the bin and the collection is continued until all the objects in the bin have been taken out through steps 528 and 526, in other words, all the objects present (sorted) in one single bin are collected from it.
  • the recognition means connected to the bin identifies this particular collecting operation in step 530, and then the collection control proceeds by one step through steps 522-528, unless the user wishes to stop the collection in step 532.
  • in step 532 the user decides whether to continue collecting from the rack or to stop. If collection from the rack is continued (NO), the user proceeds to step 522, in which a new (following) bin is indicated, from which the object is collected. If collection from the rack is stopped (YES), the user proceeds to step 518, in which he decides whether to continue sorting at all (YES), returning to step 502, or to terminate sorting (NO), proceeding to step 534.
  • a shift from step 532 to step 518 usually means that all the objects pertaining to one batch have been collected from the bins for delivery, for instance.
  • method steps 500, 502, 520-532, 518 and 534 described above constitute a completely independently operating method for collecting an object from the rack on the basis of a collecting list stored in a database.
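The sorting-into-the-rack branch of figure 5 (steps 506-516) can be summarised as a loop like the one below. This is a hedged sketch with hypothetical callables (read_identifier, indicate_bin, item_detected) standing in for the identification unit, the signal lights and the recognition means; difflib is only a stand-in for the complementing step 510.

```python
from difflib import get_close_matches


def complement_identifier(partial, sorting_data):
    """Step 510: complete partial identifier data against the sorting data."""
    match = get_close_matches(partial, list(sorting_data), n=1, cutoff=0.6)
    return sorting_data[match[0]] if match else None


def sort_into_rack(items, sorting_data, read_identifier, indicate_bin, item_detected):
    """Steps 506-516 of figure 5 for one batch of objects."""
    unresolved = []
    for item in items:
        partial = read_identifier(item)               # step 506: read and store
        bin_no = sorting_data.get(partial)            # step 508: compare with sorting data
        if bin_no is None:
            bin_no = complement_identifier(partial, sorting_data)   # step 510
        if bin_no is None:
            unresolved.append(item)                   # could not be identified
            continue
        indicate_bin(bin_no)                          # step 512: light the bin
        while not item_detected(bin_no):              # step 514: wait for the sensor
            pass
    return unresolved                                 # left over for manual handling


# Toy run with stand-in callables.
data = {"MAPLE ROAD 12": 7, "OAK STREET 3": 2}
left = sort_into_rack(["letter"], data,
                      read_identifier=lambda item: "MAPLE ROAD 1",
                      indicate_bin=lambda b: print("light bin", b),
                      item_detected=lambda b: True)
print(left)   # []
```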
  • the method is applicable to the system of the invention, which comprises a digitalised sorting rack and a sorting rack section.
  • Figure 6 is a step-by-step flow chart of a method of one embodiment when "sorting into the rack" is selected as the sorting mode in step 502 of figure 5.
  • the flow chart of figure 6 shows steps 502-518 of figure 5 in greater detail.
  • the object is preferably a postal item, to which at least partial identifier data, e.g. an address field, an address identifier or a delivery area identifier has been connected.
  • when the application has been activated 600 and the light signal "identification completed, put new object on the imaging area" has been switched on, the first object to be identified is placed on the imaging area on the identification substrate 602.
  • the light signal "identification completed, put new object on the imaging area” is switched off.
  • step 604 comprises monitoring the moment the object is immobilised: the user brings the centre of the address field onto the central LED light point and stops the movement of the object at this location.
  • Step 606 comprises detection of the immobilisation of the object, processing an image of the address field and switching on the light signal "remove object from imaging area", allowing the user to remove the object from the imaging area 608.
  • the image of the address field of the object preferably a postal item, is displayed in the first window of the display, and a corrected, e.g. straightened image of the address field is shown in the second window, as illustrated in figure 4.
  • the partial identifier data read in step 610 are pre-processed and compared with the address directory in the database in the manner explained below, and the identification results are shown on the display over the interface in the field "found address" (cf. window 940 in figure 4).
  • in step 612, the delivery bin of the object is indicated.
  • the light signal "identification completed, put a new object in the imaging area” is switched on, signalling that a new object, preferably a postal item, can be placed in the imaging area on the identification substrate 614.
  • Step 616 comprises a decision of whether to continue sorting or to stop sorting. If the sorting is continued (NO), the user proceeds to step 604, where he reads the identifier data of the new (following) object. If the sorting is stopped (YES), the user proceeds to step 618.
  • the user presses e.g. the STOP push button (figure 4), and then the image manipulation in process is completed and the application is turned off.
  • Figure 7 is a flow chart of a method of one embodiment for identifying partial identifier data of an object by comparing the read partial identifier data with sorting data stored in the database and by complementing defective identifier data to corrected identifier data.
  • the method described here relates to steps 508 and 510 illustrated in figure 5 and to steps 606, 610 and 614 illustrated in figure 6, which will be exemplified in greater detail below.
  • the flow chart of figure 7 shows an embodiment of the identification method steps, by means of which imaging (reading) and identification (pre-processing, comparison) with the aid of the camera are automated.
  • the described example assumes that the identifier data of the object, preferably a postal item, is an alphanumeric address field and that pre-processing is performed three times at the most.
  • reading is preferably text reading and identification is preferably text identification in this example.
  • the identifier data of the object are imaged and stored 702.
  • the most recent image is compared with the preceding image, and if they are identical (YES), it is stated that no new object has been introduced in the imaging area, in other words, after two successive imaging operations, the same (preceding) object is still in the imaging area.
  • the image release is automated e.g. by synchronising the camera with the light sources 24, 26 in figure 2: the camera generates successive images at given intervals and the "difference image" of these is calculated. Two successive images can be taken at 100 ms intervals, for instance.
  • the user may conclude whether the two last images were identical.
  • if a comparison in step 704 states that the two last images were not identical (NO), the user knows that a new object, preferably a postal item, has been brought into the imaging area.
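A minimal sketch of the difference-image test mentioned above follows. The threshold, frame size and function name are assumptions; the patent only states that successive images are compared to detect a newly placed object.

```python
import numpy as np


def new_object_present(prev_frame: np.ndarray, frame: np.ndarray,
                       threshold: float = 5.0) -> bool:
    """Return True when the mean absolute pixel difference between two
    successive frames suggests a new item in the imaging area."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff.mean() > threshold


# Example with two synthetic 768 x 1024 frames taken about 100 ms apart.
prev_frame = np.zeros((768, 1024), dtype=np.uint8)
frame = prev_frame.copy()
frame[300:500, 400:700] = 200            # an envelope appears in the imaging area
print(new_object_present(prev_frame, frame))   # True
```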
  • the image can be subjected to preliminary pre-processing, e.g. by modifying the image pixel values in the user interface (window 94 in figure 4) either to a lighter or a darker shade according to the selection.
  • the image is subjected to post-processing 706, 708, 710, 712 and the user is informed of completed imaging by means of the light signal 910b (figure 4) of the user interface.
  • the post-processing steps comprise image straightening 706, image pre-processing 708 and reading corrected identifier data in the image 710, and the iteration cycles needed in steps 708 and 710 for trimming of the corrected identifier data in step 712.
  • the user can interrupt the sorting in step 716 and proceed to step 702, or terminate sorting in step 718.
  • in step 714, after the corrected identifier data of the object, preferably a postal item, have been read, the address complying best with the corrected identifier data is searched for in the address database (sorting data) and the bin corresponding to this address is indicated to the user for sorting.
  • the read (partial) identifier data, the identified address and an image of the address field of the object, preferably a postal item, are stored in the directory defined as identifier data.
  • the directory in which the identification results are stored is written in the user interface (figure 4) in the field "identification result directory”.
  • by selecting "image", the user stores an unprocessed image of the object.
  • by selecting "identification results", the user stores the results of identification. By selecting "unidentified", the user stores the results and the images also when no address has been identified in the object. In this case, however, "image" and "identification results" should also be selected for the image of unidentified objects and its identification results to be stored. When the data and the image of objects whose address has not been identified are stored, the file name will read "unidentified".
  • the straightening of the image in step 706 of figure 7 depends on the angular range selected for examination in the user interface.
  • the determination of the angular range is explained in the description below. If the angular range selected for examination is e.g. in the range +30 degrees to -30 degrees, the image is turned e.g. at an angle of -30 degrees and the pixels of the turned image are summed row-wise. This yields a vertical image profile, on which a Fourier transformation is calculated. This operation is performed by steps of 7.5 degrees a total of 9 times. This will cover the entire desired angular range.
  • the Fourier transformation indicates the spatial frequencies in the profile. An examination of the suitable number of spatial frequencies in the calculated Fourier transformations allows the conclusion of the angle of the image in which the partial identifier data text is horizontal. When the text is in a straight position in the image, its profile comprises a large number of relatively high frequencies.
  • the image is turned first at an angle of -42 degrees.
  • the image pixels are summed separately both by rows and columns, yielding an image profile both in the horizontal and the vertical direction.
  • the Fourier transformations are calculated on both the profiles.
  • the angle is changed by steps of 7 degrees a total of 13 times. Since both the vertical and the horizontal profile are calculated on each image, the angular ranges examined are -42 to +42 degrees and +48 to +132 degrees.
  • the examined angular range is in the range -42 to +132 degrees.
  • the examined angular range is in fact half of all the conceivable angles. 3 degrees can be added to the end points, because this is less than half of the length of one step; in other words, with a text at an angle of 45 degrees, the algorithm interprets it at an angle of 42 degrees, i.e. a value with an error of 3 degrees. If the text is at an inclination of e.g. 3.5 degrees, the algorithm interprets it at an inclination of 0 or 7 degrees, i.e. a value with an error of 3.5 degrees. Since an addition of 3 degrees to the end points of the range does not impair the angular resolution of the algorithm, this addition can be made.
  • the text may also be upside down in these images; in other words, noting that the text is e.g. at an inclination of 15 degrees, it could equally well be at an inclination of 195 degrees.
  • the algorithm described above yields a value of the text orientation angle in the range -45 to +135 degrees. When the image is positioned at such an angle, the text may be upside down.
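The angle search described above (rotate the image, sum pixels row-wise into a profile, inspect the Fourier transform of the profile) can be sketched as follows. The scoring band of spatial frequencies and the helper name are assumptions, since the text only says that a suitable number of frequencies is examined and that horizontal text lines give many strong ones; the column-wise profile and the upside-down check discussed above would be handled analogously.

```python
import numpy as np
from scipy.ndimage import rotate


def estimate_text_angle(image: np.ndarray, angles=range(-42, 43, 7)) -> float:
    """Return the candidate angle whose row-wise profile shows the strongest
    spatial frequencies, i.e. the angle at which the text lines lie horizontal."""
    best_angle, best_score = 0.0, float("-inf")
    for angle in angles:
        turned = rotate(image, angle, reshape=False, order=1)
        profile = turned.sum(axis=1)                  # row-wise sum -> vertical profile
        spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
        score = spectrum[2:40].sum()                  # illustrative frequency band
        if score > best_score:
            best_angle, best_score = float(angle), score
    return best_angle
```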
  • OCR (optical character recognition)
  • the selection "angular range” determines the size of the angular range in which the inclination should be examined.
  • the selection "limit of reliable recognition” selects the limit determining the minimum acceptable value for recognition reliability. Values under this value are classified as unidentified.
  • the limit of reliable identification can be selected e.g. in the range 0.00 to 1.00 and the selection can be made e.g. with a 0.05 resolution.
  • the reliability value indicates the degree to which the corrected identifier data comply with the "found address" data in the database.
  • the field "bin" indicates the bin number to which the found address corresponds.
  • the field "processing duration" indicates the duration of the image pre-processing and of the comparison between the identification result and the address directory.
  • under "postal code and post office", the user fills in the postal code and post office of the mail items being sorted. This information is needed in searching the street address of a postal item and in straightening the text of a postal item, if the selection 360 degrees has been made under "angular range".
  • the selection "maximum number of image manipulations" determines the maximum number of image manipulations of one single image. The number of manipulations can be selected e.g. in the range 1 to 3.
  • the selection "darkness of image in first manipulation” determines the effect of the first pre-processing of the image. The higher the selected value, the higher the whiteness of the image.
  • the darkness of the image in the first manipulation can be selected e.g. in the range 0.00 to 2.00, and the selection can be made e.g. with a resolution of 0.1.
  • the selection "darkness of image in the third manipulation” determines the effect of the third pre-manipulation of the image in this case.
  • the higher the selected value the higher the whiteness of the image.
  • the darkness of the image in the third manipulation can be selected e.g. in the range 0.00 to 4.00, with the selection made e.g. with a resolution of 0.1.
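The interface selections listed above map naturally onto a small settings object. The sketch below is illustrative only: the field names are invented here, while the value ranges are taken from the description.

```python
from dataclasses import dataclass


@dataclass
class RecognitionSettings:
    angular_range_deg: int = 60            # size of the examined angular range
    reliability_limit: float = 0.60        # 0.00-1.00, selectable in steps of 0.05
    max_image_manipulations: int = 3       # 1-3 manipulations of one single image
    darkness_first_manipulation: float = 1.0   # 0.00-2.00, resolution 0.1
    darkness_third_manipulation: float = 2.0   # 0.00-4.00, resolution 0.1
    postal_code: str = ""                  # used when searching the street address
    post_office: str = ""
```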
  • Figure 8 is a flow chart of a method of a second embodiment for identifying partial identifier data of an object by comparing the read partial identifier data with sorting data stored in the database and by complementing defective identifier data to corrected identifier data.
  • the method described below relates to steps 508 and 510 illustrated in figure 5 and to steps 606, 610 and 612 illustrated in figure 6, which will be exemplified in greater detail below.
  • the comparison uses the identification result obtained in optical character recognition as explained in conjunction with image straightening (step 710 in figure 7).
  • this identification result is processed 802 and the identification result is compared with sorting data stored in the database, such as an address directory, for instance.
  • in step 804, the identification result is modified e.g. by changing all its characters to capital letters, all the digits 0 to the letter O and all the digits 1 and the letters L to the letter I, and by eliminating the special characters and the space characters.
  • the same operations have been performed on the address directory in the database, so that the address directory and the identification result are compatible.
  • all the rows containing 2 or fewer characters are also deleted from the identification result in step 804; a sketch of this normalisation is given below.
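The following sketch illustrates the normalisation of step 804 under the rules just stated; the function name and the sample rows are hypothetical.

```python
def normalise_rows(text: str) -> list:
    """Upper-case, map 0 -> O and 1/L -> I, drop special and space characters,
    and discard rows with two or fewer characters (step 804)."""
    table = str.maketrans({"0": "O", "1": "I", "L": "I"})
    rows = []
    for row in text.upper().splitlines():
        cleaned = "".join(ch for ch in row.translate(table) if ch.isalnum())
        if len(cleaned) > 2:
            rows.append(cleaned)
    return rows


print(normalise_rows("Mr. J. Smith\nMap1e Road 12\nFI-00100 HELSINKI"))
# ['MRJSMITH', 'MAPIEROADI2', 'FIOOIOOHEISINKI']
```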
  • the row having the postal code and post office is searched for in the read identifier data in step 806.
  • Step 808 comprises a comparison of the identification result with the database, and if the identification result is adequate (YES), step 810 calculates the correlation between the row preceding the postal code and the street names in the database.
  • Step 812 comprises a comparison to conclude whether the highest correlation is adequate, and if this is the case (YES), the user proceeds to step 816. Unless the identification result of step 808 or the highest correlation in step 812 is adequate (NO), the highest correlation between all the rows and the street names is calculated in the database in step 814.
  • Step 816 comprises calculation of the correlation between the row where the street name was found and the potential addresses. Then the correlation between the row preceding the address and the names 818 of the persons living in the same street is calculated, yielding the highest name correlation.
  • Step 820 comprises a comparison to conclude whether the highest name correlation is adequate, and if this is the case (YES), the user proceeds to step 824. Unless the highest name correlation in step 820 is adequate (NO), the correlations between all the rows and the database names of the persons living in the same street are calculated in step 822. Finally, in step 824, the correct bin is assigned for the sorted object on the basis of the highest summed correlation, the sum correlation.
  • the program adds a bin number to the address directory, the number being the same for all of those living at the same address.
  • Name 2, i.e. first names or blank (a blank in the case of a company name).
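Steps 806-824 can be condensed into a matching routine like the sketch below. The correlation() helper simply uses difflib's ratio as a stand-in for the two correlation procedures described next, and the record layout of the address directory (STREET, NAME, POSTAL, BIN) is an assumption made for the example.

```python
from difflib import SequenceMatcher


def correlation(a: str, b: str) -> float:
    return SequenceMatcher(None, a, b).ratio()


def assign_bin(rows, directory):
    """rows: normalised rows of the read identifier data.
    directory: address directory records with STREET, NAME, POSTAL and BIN."""
    # Step 806: locate the row carrying the postal code and post office.
    postal_idx = next((i for i, r in enumerate(rows)
                       if any(rec["POSTAL"] in r for rec in directory)), None)
    best, best_score = None, 0.0
    for rec in directory:
        # Steps 810-816: correlate the row preceding the postal code (or, if it
        # cannot be found, every row) with the street names in the directory.
        street_rows = [rows[postal_idx - 1]] if postal_idx else rows
        street_corr = max(correlation(r, rec["STREET"]) for r in street_rows)
        # Steps 818-822: correlate the rows with the occupants' names.
        name_corr = max(correlation(r, rec["NAME"]) for r in rows)
        score = street_corr + name_corr              # step 824: sum correlation
        if score > best_score:
            best, best_score = rec["BIN"], score
    return best


rows = ["MRJSMITH", "MAPIEROADI2", "FIOOIOOHEISINKI"]
directory = [{"STREET": "MAPIEROADI2", "NAME": "JSMITH", "POSTAL": "OOIOO", "BIN": 7}]
print(assign_bin(rows, directory))   # 7
```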
  • the following is an exemplifying description of two substantially different manners of calculating the correlation between two character strings.
  • the method in figure 8 uses both these calculating procedures in the calculation of the correlation between the two character strings, selecting the higher correlation value of the values thus calculated in two different manners.
  • Character string 2 is the result of the optic character recognition described above and character string 1 is the character string in the database with which the recognition result is compared.
  • the underline signifies a blank character in this context.
  • Character string 1 is glided over character string 2 and the number of coinciding identical characters in each situation is calculated. Initially the last character in character string 1 and the first character in character string 2 coincide, and then the character string 1 is moved by steps of one character over character string 2.
  • the number of hits out of the characters glided in the different steps is calculated.
  • the calculation of correlation coefficients should take account of the ratio of character string lengths to the number of hits.
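A sketch of this gliding comparison follows. The normalisation of the hit count by the shorter string length is an assumption, because the exact ratio used for the coefficient is not reproduced in the text above.

```python
def gliding_correlation(s1: str, s2: str) -> float:
    """Slide character string 1 over character string 2 one position at a time,
    count coinciding identical characters, and relate the best hit count to the
    string lengths (here: the shorter length, as an illustrative choice)."""
    best_hits = 0
    # Offsets from "last char of s1 over first char of s2" to the other end.
    for offset in range(-(len(s1) - 1), len(s2)):
        hits = sum(1 for i, ch in enumerate(s1)
                   if 0 <= i + offset < len(s2) and ch == s2[i + offset])
        best_hits = max(best_hits, hits)
    return best_hits / min(len(s1), len(s2))


print(gliding_correlation("MAPLE", "MAPIEROAD"))   # 0.8: four of five characters hit
```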
  • Character string 2 is the result of the optical character recognition and character string 1 is the character string in the database with which the recognition result is compared.
  • the correlation between two character strings is examined by taking two successive characters at a time from character string 1 and by examining whether an identical character pair is found in character string 2.
  • Step 2: two characters taken from character string 1: AT
  • the calculation of the correlation coefficient takes account of the number of correct character pairs found in character string 2 and the lengths of the character strings. Calculation of the correlation coefficient:
  • N is the number of found character pairs
  • MJ1 and MJ2 are character string 1 and character string 2.
  • the correlation coefficient obtained in this example is 0.67.
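The character-pair procedure can be sketched as below. The Dice-style formula 2N / (pairs in MJ1 + pairs in MJ2) is one plausible reading of "takes account of the number of found character pairs and the lengths of the character strings", not a formula quoted from the patent, and the example strings are invented (they merely happen to give 0.67, the value mentioned above).

```python
def pair_correlation(mj1: str, mj2: str) -> float:
    """Take successive character pairs from character string 1, count how many
    also occur in character string 2, and combine the count N with the string
    lengths into a coefficient (assumed Dice-style normalisation)."""
    pairs1 = [mj1[i:i + 2] for i in range(len(mj1) - 1)]
    pairs2 = [mj2[i:i + 2] for i in range(len(mj2) - 1)]
    n = sum(1 for p in pairs1 if p in pairs2)     # N, the number of found pairs
    return 2 * n / (len(pairs1) + len(pairs2))


print(round(pair_correlation("KATU", "KATI"), 2))   # 0.67 with these example strings
```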
  • the method steps illustrated in figures 5 - 8, or at least part of the method steps can be carried out by a computer program or programs, which can be stored in the memory of computer 30 for performance of various functions such as identification, processing, correction and/or comparison.
  • the computer program consists of an encoded program product.

Landscapes

  • Warehouses Or Storage Devices (AREA)
  • Sorting Of Articles (AREA)

Abstract

The invention relates to a method and a system for sorting an object (2) equipped with identifier data, the system comprising a digitalised sorting rack (10) including a plurality of delivery bins (11) having connected guide means (13, 15), which are connected to a computer controller unit (30) and a database (60) provided with sorting data communicating with the computer. The system comprises an identification unit (20), which is connected to the computer controller unit (30) and the database (60) provided with sorting data communicating with the computer for reading the identifier data. The identification unit comprises reading means (21, 22) for reading the at least partial identifier data of the object, processing means (30, 70, 80) for complementing the read partial identifier data to corrected identifier data, and comparison means (30, 70, 80) for comparing the corrected identifier data with sorting data stored in the database, on the basis of which said guide means guide the sorting of the object. In the system of the invention, sorting of the object into the rack and collection of the object from the rack are functions that can be selected independently. Sorting of the objects into the rack and collection of the objects from the rack can also operate simultaneously in the digitalised sorting rack, which is divided into sorting rack sections (114).

Description

Method and system for sorting a post item in a delivery bin according to identification information
FIELD OF TECHNOLOGY
The invention relates to a method and a system for sorting an object provided with identifier data, preferably a postal item, the sorting comprising both sorting into a rack and collecting from a rack by identifier data.
STATE OF THE ART
A sorting rack typically comprises delivery bins, into which the postal items are directed manually on the basis of data on an address tag. Each distribution path can be allocated an individual sorting rack with bins, into which postal items are sorted and from which postal items are collected for delivery. Preliminary postal delivery operations typically include steps for sorting postal items in a rack in accordance with address data, with sorting into the rack being performed by street, road or number, and the postal items collected from the rack manually according to a delivery list. These steps are time-consuming and require experienced staff. The work also requires printouts. Various partly automated work steps have been developed to facilitate sorting operations.
Prior art systems and methods for sorting postal items in a sorting rack equipped with bins have been depicted in US 6,881,890, for instance. A computer-controlled scanner reads a bar code on an envelope, and then the corresponding address is searched in the databank and information about the bin number corresponding to the address is transmitted to an input/output circuit. The input/output circuit is connected as an integrated part of a circuitry including an input line from an infrared detector in each bin and an output line to a guide light provided in each bin. When the computer transmits the address number to the input/output circuit, it turns on the guiding light of the associated bin, allowing the sorter to put the postal item into this particular bin. When a postal item has been identified in a bin, the guide light of this bin is switched off and identification of the next postal item can be started. With a view to detection of erroneous sorting, the system described in US 6,881,890 comprises an output line of the input-output circuit connected to a warning light provided in each bin. When the sorter has sorted a postal item into the wrong bin, a warning light in this bin gives an alarm, so that the sorter may remove the item from the erroneous bin and resort it into the correct bin indicated by the guide light. However, this system does not guarantee that the postal item is sorted into the correct bin, should the bar code be incorrectly read in the identification step or had it not been read at all. If the bar code is defective or torn or otherwise destroyed in the identification step, the bar code cannot be read and the postal item will require resorting carried out completely by hand.
Prior art methods and systems require that the bar code associated with a postal item can be correctly read and interpreted for the correct delivery bin in a sorting rack to be allocated for the postal item. If the bar code is defective, the postal item will probably be directed to the wrong bin or will have to be rejected and transferred to manual sorting, for instance. The situation is the same if the bar code has been torn, distorted, bent, skewed, set upside down, partly destroyed or otherwise deteriorated. Bar codes that have been erroneously or wrongly interpreted or not read at all will delay both the sorting and the delivery of postal items and also increase the time, work steps and costs required by sorting.
SUMMARY OF THE INVENTION
The purpose of the invention is to eliminate the problems mentioned above and to provide a method and a system for sorting an object equipped with identifier data, in which an identification unit reads partial identifier data, completes the partial identifier data into corrected identifier data and compares the identifier data thus formed with sorting data stored in a database, the object being directed to the correct delivery bin on the basis of the comparison. The method and system of the invention markedly increase sorting automation, because the number of incorrectly interpreted identifier data and thus of wrongly sorted objects decreases appreciably in the sorting step. Accordingly, the step of collecting objects is enhanced, because the proportion of incorrectly sorted objects has been minimised in the sorting step. The collecting step is further enhanced by the fact that, whenever possible, the collecting area of the sorting rack is formed of those bins alone into which objects have been sorted, or that empty bins are not even examined. In accordance with the invention, sorting quality and efficiency are improved, thus reducing the time, work steps and costs required for sorting and speeding up delivery of the object.

According to one aspect, the invention provides a system for sorting at least one object equipped with identifier data, the system comprising a sorting rack including several delivery bins and associated guide means, which are connected with the controller unit of a computer and a database including sorting data communicating with the computer, the system comprising an identification unit, which is connected to the controller unit of the computer and the database including sorting data communicating with the computer, in order to read identifier data, the system being characterised by the fact that the identification unit comprises reading means for reading at least partial identifier data of the object, processing means for complementing the read partial identifier data into corrected identifier data, and comparative means for comparing the corrected identifier data with sorting data stored in the database, the sorting data serving as a basis for controlling the sorting of the object by means of the guide means.
In one embodiment of the invention, the sorting system comprises a switch connected to the controller unit of the computer and the guide means for setting the operating mode of the digitalised sorting rack into a mode for collecting from the rack, a collection list comprising the identifier data of the objects being stored in the database, the collection list serving as a basis for generating a control signal in the controller unit of the computer for transmission to the guide means of the delivery bins, and guide means for guiding the collection of the object under said control signal.
According to a second aspect, the invention provides a method for sorting at least one object equipped with identifier data in a sorting rack, which comprises a plurality of delivery bins, the method comprising steps for
- connecting the delivery bin to the controller unit of a computer,
- storing the sorting data in a database communicating with the computer,
- setting the object to be sorted in an identification unit connected with the controller unit of the computer for reading the identifier data,
the method being characterised by comprising steps for
- reading the at least partial identifier data of the object,
- comparing the read identifier data with the sorting data,
- complementing the read partial identifier data into corrected identifier data, and
- directing the object to the delivery bin on the basis of the comparison.
In another embodiment of the invention, the sorting method identifies and indicates the delivery bins for controlling the sorting rack and switches the mode of the digitalised sorting rack either into a mode for sorting into the rack or into a mode for collecting from the rack.
According to a third aspect, the invention provides a system for sorting an object for delivery, the system comprising a digitalised sorting rack with a plurality of delivery bins and associated guide means connected to the controller unit of a computer and a database equipped with sorting data communicating with the computer with a view to controlling the sorting for delivery, the system being characterised by the feature that said guide means are adapted to indicate by said sorting data the delivery bin presently in turn and to indicate discriminatingly the delivery bin next in turn with a view to controlling the sorting of the object for delivery.
According to a fourth aspect, the invention provides a method for sorting an object for delivery into a digitalised sorting rack comprising a plurality of delivery bins, the method comprising steps for
- connecting a delivery bin to the controller unit of a computer,
- storing the sorting data in a database communicating with the computer,
the method being characterised by comprising steps for
- indicating by said sorting data the delivery bin presently in turn and
- indicating by said sorting data discriminatingly the delivery bin next in turn for controlling the sorting of the object with a view to delivery.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is explained in greater detail below with reference to the accompanying drawings, of which
Figure 1a is a block diagram of the sorting system of one embodiment comprising a digitalised sorting rack,
Figure 1b is a block diagram of a second embodiment of a sorting system comprising a digitalised sorting rack,
Figure 2 is a block diagram of an identification unit of one embodiment,
Figure 3 is a block diagram of the identification unit of a second embodiment,
Figure 4 shows the user interface of an application of the identification unit of one embodiment,
Figure 5 is a flow chart of a method of one embodiment for sorting an object into a rack by the identifier data of the object,
Figure 6 is a flow chart of a method of a second embodiment for sorting an object into a rack by the identifier data of the object,
Figure 7 is a flow chart of a method of one embodiment for identifying an object for sorting, and
Figure 8 is a flow chart of a method of a second embodiment for identifying an object for sorting.
DETAILED DESCRIPTION OF THE INVENTION
Figure 1a shows an embodiment of the system of the invention for sorting an object equipped with identifier data. The sorting rack 10 comprises a plurality of delivery bins 11, into which the objects to be sorted are sorted by the desired sorting criteria, and from which the sorted objects are collected by the desired collecting criteria. In this application, the term "sorting" denotes both sorting the object into a rack, i.e. input, and collection of the object from the rack, i.e. output. The sorting of the object is controlled by a computer 30 and an indicator means 13 installed in each delivery bin and a recognition means 15 installed in connection with each array of delivery bins 112 consisting of a plurality of delivery bins. The array of delivery bins 112 is preferably a column of bins consisting of delivery bins 11, as shown with a dashed line 112 in figure 1a. In the system of the invention, sorting the object into the rack and collecting the object from the rack are mutually independent operations. In other words, in the system of the invention, sorting into the rack is a separate independent function and collection from the rack is a separate independent function.
Identifier data implies data about the object to be sorted that are needed for carrying out the sorting, in other words, for transferring the object into the correct bin. The identifier data may be any data allowing the object to be sorted into the rack. Thus, for instance, during sorting of the object for delivery, the identifier data may consist of address data or a technical identifier, such as address identifier data, delivery point identifier data or any similar data for delivery control, so that the object does not require any actual address data at all. The identifier data may be provided in different forms in the object, such as in the form of alphanumerical signs, bar codes, a radio-frequency identifier, preferably as an RFID, an optical identifier or an electrical identifier. The identifier data may be printed out or programmed directly in the object, on a tag to be affixed to the object, a printed circuit board or any other substrate, or it may be integrated in the object in any similar manner known per se. The common term "substrate" will be used below to designate the various manners for connecting the identifier data to the object mentioned above.
As shown in figure 1a, a system of the invention comprising a digitalised sorting rack 10 is controlled by a computer 30 comprising at least one controller unit and a memory, and also by a switch SW 40 and an I/O controller I/O 50, which are connected both to the controller unit of the computer and to the rack. In one embodiment, the switch 40 is connected between the controller unit and the I/O controller 50 and the I/O controller is connected between the controller unit and the bins 11 of the rack. The switch SW has the function of switching the digitalised sorting rack either into a mode for sorting into the rack or into a mode for collecting from the rack. In the embodiment of figure 1a, a connection 131 is arranged from the I/O controller 50 to the indicator means 13 of each bin and a connection 151 to the recognition means 15 of each array of delivery bins 112. The connections 131, 151 can be carried out as wire communication lines consisting of output lines 131 and input lines 151. The lines 131, 151 preferably form a bus controlled from the controller unit of the computer. In a second embodiment, the I/O controller may be in wireless communication, instead of communication lines, with the indicator means 13 provided in each bin, and e.g. communicate by radio or optical means with the recognition means 15 provided in each bin array, and then the I/O controller, the indicator and recognition means are equipped with the transmission and reception means required for the communication in question.
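As a concrete illustration of the wiring just described, the following sketch models the rack configuration as data: each bin has an indicator output line and, in the per-bin variant of figure 1b, its own recognition input line, while a switch selects the operating mode of a rack or rack section. All class, attribute and value names are illustrative assumptions, not terminology taken from the system itself.

```python
# Minimal, hypothetical data model of the digitalised sorting rack wiring
# described above; names and line numbering schemes are assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Mode(Enum):
    SORT_INTO_RACK = "sort"        # switch SW set to sorting into the rack
    COLLECT_FROM_RACK = "collect"  # switch SW set to collecting from the rack

@dataclass
class Bin:
    bin_id: int
    indicator_line: int                     # output line 131 driving the signal light 13
    recognition_line: Optional[int] = None  # input line 151 from the distance sensor 15

@dataclass
class RackSection:
    section_id: int
    mode: Mode
    bins: List[Bin] = field(default_factory=list)

# Example: one rack section of six bins sharing the bus of output and input lines.
section = RackSection(
    section_id=1,
    mode=Mode.SORT_INTO_RACK,
    bins=[Bin(bin_id=i, indicator_line=100 + i, recognition_line=200 + i) for i in range(6)],
)
```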
The sorting of the object is controlled by means of an indicator means 13 installed in each delivery bin 11, preferably a signal light or any other light-emitting component, and a recognition means 15, preferably a distance sensor, installed in connection with each array of delivery bins 112, preferably a column of bins. The signal light 13 indicates to the sorter into which bin 11 the object presently in turn shall be placed, and the distance sensor 15 identifies the input of the object into the bin, and then the signal light is turned off. Optionally, as the distance sensor 15 identifies the input of the object into a bin, the identifier data of the new object are read, and as a result of this, the bin into which the new object shall be placed is indicated to the sorter, and the signal light of the preceding bin is switched off. Of course, the delivery bin indicated for the new object may be the same bin as the one indicated for the preceding object.
During the collecting step pertaining to the sorting, the signal light 13 indicates to the sorter the bin from which the object presently in turn shall be collected, and the distance sensor 15 identifies the collection of the object from the bin in the manner described in connection with figure 5. After this, the light signal of this particular bin is turned off and the bin of the object to be collected next is indicated by the related light signal being turned on. In one embodiment, the collecting step uses simultaneously two mutually different light signals of two bins: besides the bin presently in turn for collection, the light signal of the bin to be collected next indicates that this bin is in turn immediately after the bin presently being collected. In other words, the light signals indicate the bin presently in turn to be collected and the bin next in turn to be collected with the light signal of the former continuously turned on and the light signal of the latter blinking, for instance. When the distance sensor of the bin to be collected next identifies collection of the object from the bin, it transmits an acknowledge signal of successful collection to the computer 30 acting as the central processing unit. Then the computer shifts the light arrangement by one step forward, i.e. the light signal of the preceding, i.e. collected bin goes out, the light signal of the subsequent bin, i.e. the one presently in turn to be collected, stops blinking and is turned on continuously, and the light signal of the bin subsequent to the bin presently in turn, i.e. of the bin in turn to be collected next, starts blinking. Both the light signals mentioned above may also be lighted continuously, yet with a mutually different light or emitting clearly different signals.
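A minimal sketch of this two-light collection guidance is given below, assuming a simple programming interface for the signal lights and a stream of acknowledgement events from the distance sensors; the function and type names are assumptions rather than parts of the described system.

```python
# Hypothetical sketch: the bin presently in turn shows a steady light, the bin
# next in turn blinks, and the lights shift one step forward when the distance
# sensor of the next bin reports that collection from it has started.
from typing import Dict, Iterable, List, Protocol

class Light(Protocol):
    def steady(self) -> None: ...
    def blink(self) -> None: ...
    def off(self) -> None: ...

def guide_collection(order: List[int], lights: Dict[int, Light], sensor_events: Iterable[int]) -> None:
    """order: bin ids in collection order; sensor_events: ids of bins whose
    distance sensor has detected a collecting operation."""
    current = 0
    lights[order[current]].steady()                   # bin presently in turn
    if current + 1 < len(order):
        lights[order[current + 1]].blink()            # bin next in turn
    for bin_id in sensor_events:
        if current + 1 < len(order) and bin_id == order[current + 1]:
            lights[order[current]].off()              # collected bin goes out
            lights[order[current + 1]].steady()       # next bin becomes current
            if current + 2 < len(order):
                lights[order[current + 2]].blink()    # new next-in-turn bin blinks
            current += 1
```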
In accordance with figure 1a, the controller unit 30 of the computer is connected with an identifying unit 20 for at least partial reading and identification of the identifying data and a database DB 60, in which data required for sorting are stored, such as sorting data and collection lists. The data stored in the database can be retrieved from other data systems when the sorting starts and they can be changed and/or updated in real time, provided that the database is integrated in the data network. The identifying unit either includes a memory of its own or uses the memory of the computer for storing the data it has identified. In addition, one or more applications SOV1, SOV2 (70, 80) have been stored in the memory connected to the computer controller unit for performing various functions as desired, such as identification, processing, correction and/or comparison.
The digitalised sorting rack 10, which consists of a plurality of bin columns, can be divided into several rack sections, whose sorting functions can be selected as mutually different. The rack section comprises an array of bin columns 112, which is illustrated with a dashed line in figure 1a and with reference 114, and bin columns can be combined into independently acting rack sections 114 by programming or any other connections. The numbers of bin columns or bin rows included in the rack section can be freely selected. In the system of the invention, sorting the object into the rack and collecting the object from the rack are mutually independent operations. Each rack section of the sorting rack then comprises conductors connected over a separate switch SW 40 to the computer 30 from the I/O controller 50 to this rack section. The divided sorting rack allows the sorting functions to be determined by rack sections, so that the different sorting steps may proceed in parallel. Each switch SW has the function of switching this rack section either in a mode for sorting into the rack or a mode for collecting from the rack.
Figure 1b illustrates a system of a second embodiment of the invention. It differs from the embodiment in figure 1a only in that, besides the indicator means 13, each bin has an individual recognition means 15. This provides a communication 131 between the I/O controller 50 and the indicator means 13 of each bin and a communication 151 to the recognition means 15 of each bin. The communications 131, 151 can be carried out as wire communications consisting of output lines 131 and input lines 151. The communications 131, 151 preferably form a bus controlled from the controller unit of the computer. In a second embodiment, instead of lines, the I/O controller may be in wireless communication with an indicator means 13 provided in each bin, and a recognition means 15 provided in each bin over the radio or optically, for instance, and then the I/O controller, the indicator and recognition means are equipped with appropriate transmission and reception means required for communication.
Figure 2 is a block diagram of an identifying unit 20 of one embodiment. In accordance with the invention, an identifying unit 20 communicating with the sorting rack 10 comprises reading means 22, by means of which it reads the at least partial identifying data of the object 2 to be sorted, and a memory, in which it stores the identifying data it has read. The memory (not illustrated) may be a separate or integrated storage means in the reading means or a memory card, or the identifying unit may use the computer memory as its memory for storage of the data it has identified. In the identifying unit of figure 2 the camera KAM 22 reads, i.e. images in this case, the identifying data related to the object 2, which may be provided in the object as such or connected to a substrate 5, for instance. The substrate 5 may be a printout of alphanumeric signs, a bar code tape, an infrared printout or any similar substrate affixed to the object, in which the readable identifying data are generated. The camera 22 is connected to the controller unit and memory of the computer 30 and over this to the applications SOV1, SOV2 as shown in figure 1. The computer, for instance, can be equipped with an IEEE 1394 PCI card for connecting the camera. The identifying unit may further comprise a display 32, a keyboard 34 or a combination of these connected to the computer, such as a contact display screen, or any other data feed means, by means of which the applications SOV1, SOV2 can be controlled over their user interfaces. There may be one or more applications SOV1, SOV2 in use. The camera is preferably a digital camera. The camera resolution may be e.g. 768 x 1024 pixels and the pixel size may be e.g. 6.25 x 6.25 μm². The camera may have an imaging rate of e.g. 15 full-pixel pictures per second. The imaging distance, i.e. the distance between the front surface of the camera objective and the imaging area on the identification substrate, is e.g. 370 mm.
The object to be sorted is placed on the identification substrate 19 in connection with the identifying unit for reading of the identifying data in a manner such that the identifying data of the object get on the identification substrate, i.e. in this case on the imaging area of the camera 21 with the identifying data facing the camera. This application generally uses the term imaging area to denote the identification area. In figure 2, the camera is equipped with a lens 221, by means of which the imaging area of the camera can be adjusted as desired. A light source 24 is further mounted in the vicinity of the camera and oriented so as to efficiently illuminate the imaging area adjusted with the lens 221 of the camera 21. This is particularly useful in spaces where sorting is performed with irregular illumination. The light source 24 can be mounted in the same stand as the camera, in which it can be shifted and its light incident angle can be adjusted as well. A lens (not illustrated in the figure) can be mounted in front of the light source of preferably one or more LEDs, the lens allowing collection of light from one or more LEDs and focussing it efficiently to the imaging area. The focal length of the lens in front of the LEDs of the light source may be e.g. 50 mm and its diameter 25.4 mm. The LEDs have the additional advantage of allowing pulsation e.g. at a frequency of 60 Hz, the light appearing as continuously switched on, and of having long service life and low power consumption. The light source 24 preferably comprises five LEDs disposed e.g. as a cross and adjusted so as to distinctly indicate the imaging area formed on the identification substrate, on which the camera and the lens are focused. In other words, the camera lens is focused on the identification substrate with the central LED out of five LEDs being directed to indicate the centre of the imaging area. In this manner, the LEDs of the light source 24 indicate the centre and the corners of the imaging area. However, the LEDs should be switched off when an image is taken in order to prevent their emission light from interfering with the identification of the identifying data. However, imaging takes only about 10 ms, so that the substrate 5 including the identifying data can still be correctly positioned even though the LEDs are switched off for such a short period.
In one embodiment, a second light source 26 is mounted on the other side of the identification substrate with respect to the camera, perpendicularly to the camera lens, the light source illuminating the identification substrate from the side opposite to the camera and forming light points 261 on the identification substrate. The light points are formed on the identification substrate e.g. by drilling one or more holes, preferably five holes 261 in the shape of a cross, for instance, at the second light source 26. The second light source 26 preferably comprises five LEDs, which are embedded e.g. in the shape of a cross in the identification substrate 19, preferably in a light or transparent plastic sheet. By means of the light points pointed at the camera, the light detector (not illustrated in the figure) detects when an object to be imaged is in the imaging area, the object covering at least part of the light points and preferably all of the light points, whereby the light received by the light detector from the second light source decreases or ceases. When the light detector of the camera no longer receives light from the second light source, the camera starts imaging the identifying data of the object, which should now be located in the imaging range on the identification substrate. In a second embodiment, the first and the second light source 24, 26, each preferably comprising five LEDs in cross shape, indicate the centre and the corners of the imaging area on the identification substrate. Then the first light source 24 and the second light source 26 are disposed with respect to each other such that the camera pulsation is set to be synchronised into the same phase as the second light source and into the opposite phase relative to the first light source, so that the camera is activated to image the identifying data of the first object on the identification substrate under the control of said pulsation.
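The capture trigger described above can be sketched as a simple polling loop; the detector-reading and image-capture callables and the threshold value are assumptions introduced only for illustration.

```python
# Hypothetical trigger loop: when the object covers the light points of the
# second light source, the level seen by the camera's light detector drops and
# an image of the identifier data is taken.
import time

LIGHT_THRESHOLD = 0.2  # assumed normalised detector level meaning "light points covered"

def wait_and_capture(read_detector, capture_image, poll_interval_s: float = 0.01):
    """Poll the light detector and trigger imaging once the light points are covered."""
    while read_detector() >= LIGHT_THRESHOLD:
        time.sleep(poll_interval_s)   # light points still visible: no object in place
    return capture_image()            # object in the imaging area: take the image
```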
A third light source 28 is mounted in the vicinity of the camera and is oriented so as to efficiently illuminate the imaging area adjusted by the lens 221 of the camera 21 on the identification substrate 19. This is especially useful in spaces where sorting is performed in irregular light conditions. The light source 28 can also be mounted in the same stand as the camera, where it can be shifted in the horizontal plane and its incident angle can also be adjusted. This allows adjustment of the light incident angle so as to avoid mirror reflection from glossy samples. A lens (not shown in the figures) can be mounted in front of the light source 28 to allow collection of light from one or more LEDs and focussing it on the imaging area on the identification substrate. According to a third embodiment, the first light source 24, the second light source 26 and the third light source 28, each preferably comprising five e.g. cross-shaped LEDs, are mutually disposed so that the camera pulsation is set to be synchronised in the same phase as the third light source 28 and in the opposite phase relative to the first light source 24 and the second light source 26, so that the camera is activated to image the identifying data of the object on the identification substrate under the control of said pulsation.
In one embodiment, the person who performs the sorting may carry a reading means 22, preferably a camera, as a means of automated sorting. However, the person carrying out sorting should have both his hands free for handling the object. For this reason, the camera should be fixed to the sorter's shoulder, helmet or any other location leaving his hands free. The moment of imaging could be selected e.g. using a push button provided on a glove that the sorter is wearing. Data transmission between the camera and the computer can occur by wireless means, e.g. over the radio.
Figure 3 shows a block diagram of an identifying unit 20 of a second embodiment. In accordance with the invention, an identifying unit 20 connected to a digitalised sorting rack 10 comprises reading means 21, by means of which it reads the at least partial identifying data of the object to be sorted 2, and a memory in which it stores the identifying data it has read. The memory (not illustrated) may be a separate or integrated memory means or memory card in the reading means, or the identifying unit may use the computer memory as its data storage for storage of data it has identified. In the identifying unit of figure 3, the receiver RF 21 reads, in other words, receives in this case, identifying data on the substrate 5 connected to the object 2. In this case, the substrate 5 comprises a transmitter means 23, which transmits receivable identifying data on e.g. radio frequency, in optical form or any similar form. The receiver 21 is preferably a radio receiver RF which receives the radio signal transmitted by the radio transmitter 23 of the substrate 5, preferably an RFID signal. The receiver 21 may also be an optical receiver, preferably an optical detector, which receives the optical signal transmitted by the optical transmitter 23 of the substrate 5. The receiver 21 is connected to the controller unit and memory of the computer 30 and over this to applications SOV1, SOV2 as illustrated in figure 1. The identifying unit may further comprise a display 32 connected to the computer, a keyboard 34 or a combination of these, such as a contact display, or any other data input means, allowing control of applications SOV1, SOV2 over their interfaces. The object to be sorted is placed on the identification substrate 19 connected to the identifying unit for reading of the identifying data. There may be one or more applications SOV1, SOV2 in use.
The embodiments of the digital sorting rack 10 illustrated in figures 1-3 comprise applications SOV1, SOV2 (70, 80), which are intended to be stored in the memory of the computer 30 and to perform various necessary functions, such as identification, processing, correction and/or comparison. It is also possible to connect a display 32 connected to a computer, a keyboard 34 or a combination of these, such as a contact display or any other data input means, to the identifying unit 20, 21, 22 connected to the computer, these means allowing control of the applications SOV1, SOV2 over their interfaces. The processor speed and the storage capacity affect the imaging, image manipulation and text reading speed of the apparatus. A flat-panel display as such does not require much space. In addition to indication of the interfaces of the applications, the display can be used to indicate to the user when an image has been generated for identification and to indicate the results of the identification.
Figure 4 exemplifies the interface 90 of an application of an identifying unit of one embodiment. The interfaces of these embodiments, which are visualised on the computer display 32 and controlled by the keyboard 34, typically comprise storage selections 92, processing and identification selections 94 and windows 96a, 96b for showing the identification results. The identified identifier data, i.e. the data found in the database, is displayed in the window 940. The interface additionally comprises signal lights 910 to guide the user and an ending push-button 920. The application together with the interface are started e.g. from the start menu of windows or any other operating system, and then the actual sorting application opens in the computer memory. When the sorting application of one embodiment starts, a first, second and third light source 24, 26, 28 are simultaneously switched on under the control of the controller unit of the computer in the identifying unit 20 of the sorting rack. The first and the second light source indicate the centre and the corners of the imaging area and the pulsation of the identifying unit, preferably a digital camera, starts as described above.
The interface exemplified in figure 4 comprises three light signals, of which the first one 910a is switched on when the application is being started, the second light signal 910b is switched on when the initialisation is finished and the third light signal 910c is switched on when the object can be removed from the imaging area. The first light signal 910a "database being initialised" is lighted if the user, when initialising the application, has clicked "yes" as a reply to the program's question whether there have been changes in the database since the previous use. As the initialisation of the database ends, the light signal 910a is switched off. The second light signal 910b "identification completed, put another object in the imaging area" indicates when the identification is completed and a new object can be placed in the imaging area. This light signal 910b is switched off when a new object is detected in the imaging area. The third light signal 910c indicates when the object can be removed from the imaging area. The light signal 910c is switched on immediately when an image of the object has been generated for processing. When the identification is completed and another object can be placed in the imaging area, the light signal 910c is switched off. The sorting application is ended e.g. with the aid of the STOP push-button 920. The application completes the identification that is going on and then stops the program. If the user wishes to continue the sorting after the program has stopped, he can do so e.g. by pressing the arrow key at the left upper corner. This restarts the application.
Figure 5 is a flow chart of a method for sorting at least one object equipped with identifier data into a sorting rack comprising a plurality of delivery bins. First, the user of the digitalised sorting rack activates the rack 500 and sets the rack in the desired sorting mode 502. Assume first that the user selects the mode for sorting into the rack in step 502; the rack is then set in the selected mode, while sorting data 504 are loaded in the database, unless they are already provided in the database. Then the user of the rack grips the object to be identified and places it on the identification substrate, and then the at least partial identifier data are read and stored 506 in a memory connected to the identification unit 20. Next, the read identifier data are transferred from the identification unit to a first application stored in the computer memory, which compares the read identifier data with the sorting data 508 provided in the database. If the read identifier data do not comply with the sorting data, the second application stored in the computer memory complements the read incomplete identifier data to the correct identifier data 510, which correspond to the sorting data in the database. Then the controller unit of the computer indicates to the user the bin into which the object should be sorted 512 and the user places this object in the indicated bin 514. In step 516, the user decides whether to continue sorting into the rack or to stop sorting. If the sorting into the rack is continued (NO), the user proceeds to step 506, where the identifier data of the new (following) object are read. If the sorting into the rack is ended (YES), the user proceeds to step 518, where he decides whether to continue the sorting at all (YES), returning to step 502, or whether to stop sorting (NO), proceeding to step 534. A shift from step 516 to step 518 usually means that all the objects of one batch have been sorted into the correct bins.
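The sorting-into-the-rack branch of figure 5 (steps 506-516) can be summarised by the loop sketched below; the callables and the representation of the sorting data as a mapping from identifier data to bin numbers are assumptions made for illustration only.

```python
# Hypothetical loop over figure 5, steps 506-516: read identifier data, compare
# them with the sorting data, complement them if necessary, indicate the bin.
from typing import Callable, Dict

def sort_into_rack(
    read_identifier: Callable[[], str],                 # step 506: read (partial) identifier data
    complement: Callable[[str, Dict[str, int]], str],   # step 510: complement to corrected data
    sorting_data: Dict[str, int],                       # database: identifier data -> bin number
    indicate_bin: Callable[[int], None],                # step 512: light the indicated bin
    more_objects: Callable[[], bool],                   # step 516: continue sorting?
) -> None:
    while more_objects():
        partial = read_identifier()
        # Steps 508-510: use the read data directly if found, otherwise complement them.
        identifier = partial if partial in sorting_data else complement(partial, sorting_data)
        indicate_bin(sorting_data[identifier])
```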
As further illustrated in figure 5, when the user selects the mode for collecting from the rack in step 502, the rack is set into the selected mode, while the collection list 520 is loaded into the database, unless it is already provided in the database. Then, on the basis of the collection list, the user is shown the bin from which the next object is taken 522 and, at the same time, discriminatingly the bin 524 next in turn, i.e. the bin from which the object following the presently collected object will be collected. The object 526 is taken from the bin and the collection is continued until all the objects in the bin have been taken out through steps 528 and 526; in other words, all the objects present (sorted) in one single bin are collected from it. When the first object is taken from the following bin, the recognition means connected to that bin identifies this particular collecting operation in step 530, and the collection control then proceeds by one step through steps 522-528, unless the user wishes to stop the collection in step 532. At the same time as the recognition means connected to the bin identifies collection from the following bin in step 530, the mode of this particular bin is changed to a bin to be collected. In step 532, the user decides whether to continue collecting from the rack or to stop. If collection from the rack is continued (NO), the user proceeds to step 522, in which a new (following) bin is indicated, from which the object is collected. If collection from the rack is stopped (YES), the user proceeds to step 518, in which he decides whether to continue sorting (YES) at all, returning to step 502, or to terminate sorting (NO), proceeding to step 534. A shift from step 532 to step 518 usually means that all the objects pertaining to one batch have been collected from the bins for delivery, for instance.
In the method of the invention for sorting an object into a rack and collecting it from the rack, these operations are mutually independent operations. Method steps 500, 502, 520-532, 518 and 534 described above constitute a completely independently operating method for collecting an object from the rack on the basis of a collecting list stored in a database. The method is applicable to the system of the invention, which comprises a digitalised sorting rack and a sorting rack section.
Figure 6 is a flow chart of a method of one embodiment by steps, when step 502 of figure 5 selects as sorting mode "sorting into the rack". In other words, the flow chart of figure 6 shows steps 502-518 of figure 5 in greater detail. In this embodiment, the object is preferably a postal item, to which at least partial identifier data, e.g. an address field, an address identifier or a delivery area identifier has been connected. When the application has been activated 600 and the light signal "identification completed, put new object on the imaging area" has been switched on, the first object to be identified is placed on the imaging area on the identification substrate 602. When the arrival of the object in the imaging area has been detected, the light signal "identification completed, put new object on the imaging area" is switched off. The following step 604 comprises monitoring of the moment the object is immobilised: the user brings the centre of the address field onto the central LED light point and immobilises the movement of the object at this location. Step 606 comprises detection of the immobilisation of the object, processing an image of the address field and switching on the light signal "remove object from imaging area", allowing the user to remove the object from the imaging area 608. The image of the address field of the object, preferably a postal item, is displayed in the first window of the display, and a corrected, e.g. straightened image of the address field is shown in the second window, as illustrated in figure 4. The partial identifier data read in step 610 is pre-processed and compared with the address directory in the database in the manner explained below, and the identification results are shown on the display over the interface in the field "found address" (cf. window 940 in figure 4). In step 612, the delivery bin of the object is indicated. After this, the light signal "identification completed, put a new object in the imaging area" is switched on, signalling that a new object, preferably a postal item, can be placed in the imaging area on the identification substrate 614. Step 616 comprises a decision of whether to continue sorting or to stop sorting. If the sorting is continued (NO), the user proceeds to step 604, where he reads the identifier data of the new (following) object. If the sorting is stopped (YES), the user proceeds to step 618. When wishing to stop sorting, the user presses e.g. the STOP push button (figure 4), and then the image manipulation in process is completed and the application is turned off.
Figure 7 is a flow chart of a method of one embodiment for identifying partial identifier data of an object by comparing the read partial identifier data with sorting data stored in the database and by complementing defective identifier data to corrected identifier data. The method described here relates to steps 508 and 510 illustrated in figure 5 and to steps 606, 610 and 612 illustrated in figure 6, which will be exemplified in greater detail below. The flow chart of figure 7 shows an embodiment of the identification method steps, by means of which imaging (reading) and identification (pre-processing, comparison) with the aid of the camera are automated. The described example assumes that the identifier data of the object, preferably a postal item, is an alphanumeric address field and that pre-processing is performed three times at the most. Further, reading is preferably text reading and identification is preferably text identification in this example.
When sorting is started 700 and the object, preferably a postal item, has been appropriately placed on the imaging area on the identification substrate, the identifier data of the object are imaged and stored 702. In step 704, the most recent image is compared with the preceding image, and if they are identical (YES), it is stated that no new object has been introduced in the imaging area, in other words, after two successive imaging operations, the same (preceding) object is still in the imaging area. When the image release is automated, e.g. by synchronising the camera with the light sources 24, 26 in figure 2, the camera generates successive images at given intervals, and the "difference image" of these is calculated. Two successive images can be taken at 100 ms intervals, for instance. By criteria determined on the difference image, the user may conclude whether the two last images were identical. When a comparison in step 704 states that the two last images were not identical (NO), the user knows that a new object, preferably a postal item, has been brought in the imaging area. After automatic imaging, the image can be subjected to preliminary pre-processing, e.g. by modifying the image pixel values in the user interface (window 94 in figure 4) either to a lighter or a darker shade according to the selection. Then the image is subjected to post-processing 706, 708, 710, 712 and the user is informed of completed imaging by means of the light signal 910b (figure 4) of the user interface. The post-processing steps, which are explained in greater detail below, comprise image straightening 706, image pre-processing 708 and reading corrected identifier data in the image 710, and the iteration cycles needed in steps 708 and 710 for trimming of the corrected identifier data in step 712. If one does not wish to continue the image post-processing for one reason or another, for instance when the correct address data are not found in the database, the user can interrupt the sorting in step 716 and proceed to step 702, or terminate sorting in step 718. As an end result of step 714, after the corrected identifier data of the object, preferably a postal item, have been read, the address complying best with the corrected identifier data is searched in the address database (sorting data) and the bin corresponding to this address is indicated to the user for sorting. At the same time, the read (partial) identifier data, the identified address and an image of the address field of the object, preferably a postal item, are stored in the directory defined as identifier data.
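The difference-image check used for automatic image release can be sketched as follows; the grey-level threshold and the numpy representation of the frames are assumptions.

```python
# Hypothetical difference-image test: two successive frames (taken e.g. at
# 100 ms intervals) are compared, and a new object is assumed to be present
# only if the mean absolute difference exceeds a threshold.
import numpy as np

DIFF_THRESHOLD = 5.0  # assumed mean absolute grey-level difference for "new object"

def new_object_present(previous: np.ndarray, current: np.ndarray) -> bool:
    """Return True if the difference image indicates a new object in the imaging area."""
    difference = np.abs(current.astype(np.int16) - previous.astype(np.int16))
    return float(difference.mean()) > DIFF_THRESHOLD
```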
The directory in which the identification results are stored is written in the user interface (figure 4) in the field "identification result directory". There are three options under the text "store" for selecting what to store. The options are e.g. "image", "unidentified" and "identification results". By selecting "image", the user stores an image of the object that has not been processed. By selecting "identification results", the user stores the results of identification. By selecting "unidentified", the user stores the results and the images also when no address has been identified in the object. In this case, however, "image" and "identification results" should be selected for the image of unidentified objects and its identification results to be stored. In the storage of the data and the image of objects whose address has not been identified, the file name will read "unidentified".
The straightening of the image in step 706 of figure 7 depends on the angular range selected for examination in the user interface. The determination of the angular range is explained in the description below. If the angular range selected for examination is e.g. in the range +30 degrees to -30 degrees, the image is turned e.g. at an angle of -30 degrees and the pixels of the turned image are summed row-wise. This yields a vertical image profile, on which a Fourier transformation is calculated. This operation is performed by steps of 7.5 degrees a total of 9 times. This will cover the entire desired angular range. The Fourier transformation indicates the spatial frequencies in the profile. An examination of the suitable number of spatial frequencies in the calculated Fourier transformations allows the conclusion of the angle of the image in which the partial identifier data text is horizontal. When the text is in a straight position in the image, its profile comprises a large number of relatively high frequencies.
The following is a description of an example of performing image straightening in accordance with the read text in step 706. If there has been a selection in the user interface for examining 360 degrees, i.e. all the angles, the image is turned first at an angle of -42 degrees. The image pixels are summed separately both by rows and columns, yielding an image profile both in the horizontal and the vertical direction. The Fourier transformations are calculated on both the profiles. The angle is changed by steps of 7 degrees totally 13 times. Since both the vertical and the horizontal profile is calculated on each image, the angular ranges will be examined in the range -42 to +42 degrees and +48 to +132 degrees. Since the difference between these angular ranges is smaller than the length of an individual step (48 - 42 = 6<7), one can conclude that the examined angular range is in the range -42 to +132 degrees. Considering also that 3 degrees can be added to the end points of the examined angular range, the examined angular range is in fact half of all the conceivable angles. 3 degrees can be added to the end points, because this is less than half of the length of one step, in other words, with a text at an angle of 45 degrees, the algorithm interprets it at an angle of 42 degrees, which is consequently a value with an error of 3 degrees. If the text is at an inclination of e.g. 3.5 degrees, the algorithm interprets it at an inclination of 0 or 7 degrees, i.e. a value with an error of 3.5 degrees. Since an addition of 3 degrees to the end points of the range does not impair the angular resolution of the algorithm, this addition can be made. One may further note that the text may be also upside down in these images, in other words, noting that the text is e.g. at an inclination of 15 degrees, it could equally well be at an inclination of 195 degrees. However, the algorithm described above yields a value of the text orientation angle in the range -45 to +135 degrees. When the image is positioned at such an angle, the text may be upside down. To eliminate this error, optical character recognition (OCR) is performed on the image; the same is done on an image turned upside down. After this the user examines which of the images comprises the postal code and post office. If the image is turned upside down, the optical character recognition will not find any text that makes sense. This allows the conclusion of the angle of the image in which the text is straight.
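The projection-profile straightening described in the two preceding paragraphs can be sketched as follows; the use of scipy.ndimage.rotate, the examined frequency band and the energy criterion are assumptions, not the exact implementation.

```python
# Hypothetical skew estimation: rotate the image over the selected angular range
# in fixed steps, sum the pixels of each rotated image row-wise into a vertical
# profile, and pick the angle whose profile spectrum carries the most energy at
# the text-line spatial frequencies.
import numpy as np
from scipy.ndimage import rotate

def estimate_text_angle(image: np.ndarray, start: float = -30.0, stop: float = 30.0,
                        step: float = 7.5) -> float:
    best_angle, best_energy = start, -1.0
    for angle in np.arange(start, stop + step / 2, step):   # e.g. 9 angles for -30..+30 by 7.5
        rotated = rotate(image, angle, reshape=False, order=1)
        profile = rotated.sum(axis=1)                        # row-wise pixel sums
        spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
        energy = spectrum[2:40].sum()                        # assumed band of spatial frequencies
        if energy > best_energy:
            best_angle, best_energy = angle, energy
    return best_angle   # rotating the image by this angle makes the text horizontal
```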
The following is an exemplifying description of the control of the post-processing steps 706, 708, 710 and 712 from the user interface of the application, with reference to window 94 of the user interface in figure 4. The selection "angular range" determines the size of the angular range in which the inclination should be examined. The selection "limit of reliable recognition" selects the limit determining the minimum acceptable value for recognition reliability. Values under this value are classified as unidentified. The limit of reliable identification can be selected e.g. in the range 0.00 to 1.00 and the selection can be made e.g. with a 0.05 resolution. The reliability value indicates the degree to which the corrected identifier data comply with the "found address" data in the database. If the correlation is total, the reliability value is 1, and the smaller the reliability value, the lower the correlation. The bin indicates the bin number to which the address corresponds. The processing duration indicates the duration of the image pre-processing and the comparison between the identification result and the address directory. In the field "postal code and post office" the user fills in the postal code and post office of the mail items being sorted. This information is needed in searching the street address of a postal item and in the straightening of the text of a postal item, if the selection 360 degrees has been made under "angular range". The selection "maximum number of image manipulations" determines the maximum number of image manipulations of one single image. The number of manipulations can be selected e.g. in the range 1 to 3. The selection "darkness of image in first manipulation" determines the effect of the first pre-processing of the image. The higher the selected value, the higher the whiteness of the image. The darkness of the image in the first manipulation can be selected e.g. in the range 0.00 to 2.00, and the selection can be made e.g. with a resolution of 0.1. Accordingly, the selection "darkness of image in the third manipulation" determines the effect of the third manipulation of the image in this case. The higher the selected value, the higher the whiteness of the image. The darkness of the image in the third manipulation can be selected e.g. in the range 0.00 to 4.00, with the selection made e.g. with a resolution of 0.1.
Figure 8 is a flow chart of a method of a second embodiment for identifying partial identifier data of an object by comparing the read partial identifier data with sorting data stored in the database and by complementing defective identifier data to corrected identifier data. The method described below relates to steps 508 and 510 illustrated in figure 5 and to steps 606, 610 and 612 illustrated in figure 6, which will be exemplified in greater detail below. In the method of figure 8, the comparison uses the identification result obtained in optical character recognition as explained in conjunction with image straightening (step 710 in figure 7). When the comparison is started 800, this identification result is processed 802 and the identification result is compared with sorting data stored in the database, such as an address directory, for instance. Then, in step 804, the identification result is modified by e.g. changing all its characters to capital characters, all the 0 numbers to the letter O and all the 1 numbers and L letters to the letter I, by eliminating the special characters and the space characters. The same operations have been performed for the address directory in the database for the address directory and the identification result to be compatible. In addition, all the rows containing 2 or fewer characters 804 are deleted from the identification result. Next, the row having the postal code and postal office 806 is searched in the read identifier data. Step 808 comprises a comparison of the identification result with the database, and if the identification result is adequate (YES), step 810 calculates the correlation between the row preceding the postal code and the street names in the database. The following is an exemplifying description of two different manners of calculating the correlation between two strings of characters. Step 812 comprises a comparison to conclude whether the highest correlation is adequate, and if this is the case (YES), the user proceeds to step 816. Unless the identification result of step 808 or the highest correlation in step 812 is adequate (NO), the highest correlation between all the rows and the street names is calculated in the database in step 814. Step 816 comprises calculation of the correlation between the row where the street name was found and the potential addresses. Then the correlation between the row preceding the address and the names 818 of the persons living in the same street is calculated, yielding the highest name correlation. Step 820 comprises a comparison to conclude whether the highest name correlation is adequate, and if this is the case (YES), the user proceeds to step 824. Unless the highest name correlation in step 820 is adequate (NO), the correlations between all the rows and the database names of the persons living in the same street are calculated in step 822. Finally, in step 824, the correct bin is assigned for the sorted object on the basis of the highest summed correlation, the sum correlation.
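A sketch of the normalisation applied to the OCR result (and, identically, to the address directory) is given below; keeping the digits other than 0 and 1, and the exact set of characters removed, are assumptions.

```python
# Hypothetical normalisation per figure 8, step 804: upper-case everything, map
# 0 -> O and 1/L -> I, strip special and space characters, and drop rows of two
# or fewer characters.
import re
from typing import List

def normalise_rows(rows: List[str]) -> List[str]:
    cleaned = []
    for row in rows:
        row = row.upper().replace("0", "O").replace("1", "I").replace("L", "I")
        row = re.sub(r"[^A-Z2-9ÅÄÖ]", "", row)  # remove special and space characters
        if len(row) > 2:                        # delete rows with 2 or fewer characters
            cleaned.append(row)
    return cleaned
```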
The following is an example of the structure of the address directory stored as a file in the database. The program adds a bin number to the address directory, the number being the same for all of those living at the same address.
1 ; 2 ; 3 ; 4 ; 5
00330; Perustie;30 B 25;Meikalainen;Liisa Riitta Maria
00330; Perustie;30 B 25;Meikalainen; Matti Sakari
00330; Perustie;30 B 25;Perusfirma Oy;
in which the numbered address fields denote:
1 = Postal code
2 = Street name
3 = The remaining address components
4 = Name 1, i.e. family name or company name
5 = Name 2, i.e. first names or blank (a blank in the case of a company name).
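Loading the address directory shown above could proceed as in the sketch below, which assumes the file layout of five semicolon-separated fields per line and simply assigns one bin number per distinct address; the field handling beyond the listed example is an assumption.

```python
# Hypothetical loader for the semicolon-separated address directory: everyone
# living at the same address (postal code, street, remaining components) gets
# the same bin number, as described above.
from collections import namedtuple
from typing import Dict, List, Tuple

Entry = namedtuple("Entry", "postal_code street rest name1 name2 bin_number")

def load_directory(path: str) -> List[Entry]:
    entries: List[Entry] = []
    bins: Dict[Tuple[str, str, str], int] = {}
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            fields = [f.strip() for f in line.rstrip("\n").split(";")]
            fields += [""] * (5 - len(fields))                    # blank name 2 e.g. for companies
            postal_code, street, rest, name1, name2 = fields[:5]
            address = (postal_code, street, rest)
            bin_number = bins.setdefault(address, len(bins) + 1)  # same bin for the same address
            entries.append(Entry(postal_code, street, rest, name1, name2, bin_number))
    return entries
```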
The following is an exemplifying description of two substantially different manners of calculating the correlation between two character strings. The method in figure 8 uses both these calculating procedures in the calculation of the correlation between the two character strings, selecting the higher correlation value of the values thus calculated in two different manners.
Manner 1:
Character strings to be compared:
Character string 1: MATTI_MEIKALAINEN
Character string 2: MATTI_MAKELAINEN
Character string 2 is the result of the optic character recognition described above and character string 1 is the character string in the database with which the recognition result is compared. The underline signifies a blank character in this context.
The method (1) for calculating the correlation between two character strings:
Character string 1 is glided over character string 2 and the number of coinciding identical characters in each situation is calculated. Initially the last character in character string 1 and the first character in character string 2 coincide, and then the character string 1 is moved by steps of one character over character string 2.
1st step:
MATTI_MEIKALAINEN
MATTI_MAKELAINEN 0 identical characters coincide
2nd step:
MATTI_MEIKALAINEN
MATTI_MAKELAINEN 8 identical characters coincide (characters T, K, L, A, I, N, E, and N)
Etc.
Results: The number of hits obtained at the different steps of the gliding is calculated. The calculation of the correlation coefficient should take account of the ratio of the character string lengths to the number of hits.
Calculation of the correlation coefficient:

R = Nmax/(MJ1 + 0.5|MJ1 - MJ2|),

in which Nmax is the highest number of hits obtained, and MJ1 and MJ2 are the lengths of character string 1 and character string 2, respectively. In this exemplifying case, the correlation coefficient obtained is 0.46.
Manner 2:
Character strings to be compared:
Character string 1: MATTI_MEIKALAINEN
Character string 2: MATTI_MAKELAINEN
Character string 2 is the result of the optical character recognition and character string 1 is the character string in the database with which the recognition result is compared.
Method (2) for calculating the correlation between two character strings:
The correlation between two character strings is examined by taking two successive characters at a time from character string 1 and by examining whether an identical character pair is found in character string 2.
Step 1
Two characters taken from the character string: MA
Is this character pair found in character string 2: Yes
Step 2
Two characters taken from character string 1: AT
Is this character pair found in character string 2: Yes
Etc.
Results:
The calculation of the correlation coefficient takes account of the number of correct character pairs found in character string 2 and the lengths of the character strings. Calculation of the correlation coefficient:

r = N/(MJ1 - 1 + 0.5|MJ1 - MJ2|),

in which N is the number of character pairs found, and MJ1 and MJ2 are the lengths of character string 1 and character string 2, respectively. The correlation coefficient obtained in this example is 0.67.
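Both correlation measures can be written down directly from the formulas above; the sketch below reproduces the worked example, giving approximately 0.46 for manner 1 and 0.67 for manner 2. Function names are illustrative only.

```python
# Sketch of the two string-correlation measures described above.

def correlation_glide(s1: str, s2: str) -> float:
    """Manner 1: glide s1 over s2 and take the highest number of coinciding characters."""
    n_max = 0
    # Offsets from "last character of s1 over first character of s2" onwards.
    for offset in range(-(len(s1) - 1), len(s2)):
        hits = sum(1 for i, ch in enumerate(s1)
                   if 0 <= i + offset < len(s2) and ch == s2[i + offset])
        n_max = max(n_max, hits)
    return n_max / (len(s1) + 0.5 * abs(len(s1) - len(s2)))

def correlation_pairs(s1: str, s2: str) -> float:
    """Manner 2: count successive character pairs of s1 that also occur in s2."""
    hits = sum(1 for i in range(len(s1) - 1) if s1[i:i + 2] in s2)
    return hits / (len(s1) - 1 + 0.5 * abs(len(s1) - len(s2)))

if __name__ == "__main__":
    database_string = "MATTI_MEIKALAINEN"  # character string 1 (from the database)
    ocr_string = "MATTI_MAKELAINEN"        # character string 2 (recognition result)
    print(round(correlation_glide(database_string, ocr_string), 2))  # 0.46
    print(round(correlation_pairs(database_string, ocr_string), 2))  # 0.67
```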
The method steps illustrated in figures 5 - 8, or at least part of the method steps can be carried out by a computer program or programs, which can be stored in the memory of computer 30 for performance of various functions such as identification, processing, correction and/or comparison. The computer program consists of an encoded programmable product.

Claims
1. A system for sorting at least one object (2) equipped with identifier data, the system comprising a digitalised sorting rack (10) including a plurality of delivery bins (11) and related guide means (13, 15), which are connected to a controller unit (30) of a computer and a database (60) equipped with sorting data communicating with the computer, the system comprising an identification unit (20), which is connected to the controller unit (30) of the computer and the database (60) equipped with sorting data communicating with the computer for reading identifier data, characterised in that the identification unit (20) comprises reading means (21, 22) for reading the at least partial identifier data of the object, processing means (30, 70, 80) for complementing the read partial identifier data to corrected identifier data, and comparison means (30, 70, 80) for comparing the corrected identifier data with sorting data stored in the database, on the basis of which said guide means guide the sorting of the object.
2. A system as defined in claim 1, wherein the identification unit (20) comprises a camera (22) and a memory for imaging the partial identifier data of the object and for storing the image, for complementing the correction algorithm stored in the controller unit (30) of the computer and the image stored in the memory of the user interface (90) operating on the computer display (32) to corrected identifier data and for comparing the corrected identifier data of the comparison algorithm stored in the computer controller unit (30) with sorting data in the database (60) and for generating reference data, the computer controller unit transmitting the reference data to the guide means (13, 15) of the delivery bins.
3. A system as defined in claim 1, wherein the identification unit (20) comprises a radio-frequency receiver (21) and a memory for receiving the partial identifier data transmitted by a radio-frequency transmitter (23) connected to the object and for storing the radio transmission, for complementing the correction algorithm stored in the computer controller unit (30) and the radio transmission stored in the memory of the user interface (90) operating on the computer display (32) to corrected identifier data and for comparing the corrected identifier data of the comparison algorithm stored in the computer controller unit (30) with the sorting data in the database (60) and for generating reference data, the computer controller unit transmitting a control signal generated on the basis of the reference data to the guide means (13, 15) of the delivery bins.
4. A system as defined in claim 2 or 3, wherein, on the basis of the reference data it has generated, the comparison algorithm generates the correlation data of the identifier data of the object, the computer controller unit transmitting a control signal generated on the basis of the correlation data to the guide means of the delivery bins.
5. A system as defined in claim 2, wherein the corrected identifier data generated by the correction algorithm is a straightened image of the read partial identifier data of the object.
6. A system as defined in claim 2, wherein the identification unit (20) comprises a first light source (24), which is disposed at a suitable distance in connection with the camera (22) and is equipped with an adjustable incident angle, the first light source being oriented at a suitable incident angle to the imaging area defined by the camera lens (221), the identifier data in the object being guided to the imaging area, with a view to enhanced identifier data detectability.
7. A system as defined in claim 6, wherein the identification unit (20) comprises a second light source (26) on the opposite side of the object relative to the camera lens (221), the second light source being directed to the imaging area defined by the camera lens and perpendicularly to the camera lens, whereby, when the object with its identifier data is placed in the imaging area, the light supply from the second light source directed to the lens and received by the camera light detector stops, the camera being thus disposed to image the identifier data of the object.
8. A system as defined in claim 7, wherein the camera, the first light source (24) and the second light source (26) have been mutually disposed such that the camera pulsation is synchronised in the same phase as the second light source and in the opposite phase relative to the first light source, the camera being set to image the identifier data of the object under the control of said pulsation.
9. A system as defined in any of claims 6-8, wherein the first light source (24) and the second light source (26) each comprise five light diodes, preferably LEDs, which are disposed in the shape of a cross with one of the light diodes in the centre of the cross.
10. A system as defined in any of claims 6-9, wherein the camera (22) can be freely mounted either in a camera stand or on the shoulder, head, or helmet or at any other suitable location of the person performing the sorting.
11. A system as defined in any of the preceding claims, wherein the identifier data of the object are stored as an alphanumeric text, a bar code, a radio-frequency identifier, an electric identifier, an optic identifier or any similar identifier, which is disposed in the object (2) or a substrate (5) affixed to the object.
12. A system as defined in claim 1, 2, or 3, wherein said guide means comprise indicator means (13) and recognition means (15) connected to the delivery bins, an I/O controller (50) connected to the computer controller unit (30) for controlling said indicator means on the basis of the control signal and for conveying an acknowledgement signal of said recognition means to the computer controller unit, and a switch (40) for switching the operating mode of the digitalised sorting rack from the mode sorting into the rack to the mode collecting from the rack, and vice versa.
13. A system as defined in any of the preceding claims, wherein the indicator means (13) disposed in connection with the delivery bin is disposed to indicate the delivery bin (11) into or from which the object is sorted on the basis of a control signal it has received from the computer controller unit, and, having recognised a sorting operation of the object, the recognition means (15) is disposed to transmit an acknowledgement signal to the computer controller unit.
14. A system as defined in claim 13, wherein the light-emitting component (13) in connection with the delivery bin is disposed to be activated on the basis of a control signal it has received from the computer controller unit (30) and to indicate the delivery bin (11) into or from which the object is sorted.
15. A system as defined in claim 12, characterised in that the digitalised sorting rack (10) comprises
- a collection list comprising the identifier data of the objects stored in the database (60), the computer controller unit generating a control signal on the basis of the collection list for transmission to the guide means of the delivery bins, and
- guide means (13, 15) for guiding the collection of the object on the basis of the control signal.
16. A system as defined in claim 15, wherein the digitalised sorting rack comprises first guide means, comprising a first indicator means (13), and second guide means, comprising a second indicator means (13), the first indicator means being disposed to indicate, on the basis of a control signal it has received, the delivery bin (11) next in turn according to the collection list and the second indicator means being disposed to indicate discriminatingly the delivery bin subsequent to the delivery bin in turn according to the collection list.
17. A system as defined in claim 15, wherein the digitalised sorting rack comprises first guide means, comprising a first recognition means (15) and a first indicator means (13), second guide means comprising a second recognition means (15) and a second indicator means (13) and third guide means comprising a third recognition means (15) and a third indicator means (13), the second recognition means, having recognised that a collection operation has been performed, being disposed to transmit an acknowledgement signal to the computer controller unit (30), which generates a new control signal on the basis of the collection list, the first indicator means being set to stop indicating at the reception of this signal, the second indicator means being set to stop discriminating indication and to start indicating the delivery bin in turn according to the collection list, and the third indicator means being set to indicate discriminatingly the delivery bin subsequent to the delivery bin in turn according to the collection list, and the third recognition means being set to recognise the collection operation.
18. A system as defined in any of the preceding claims, wherein said guide means (13, 15) comprise an indicator means (13) provided in each delivery bin (11) and a recognition means (15) in each delivery bin array (112) comprising a plurality of delivery bins.
19. A system as defined in any of the preceding claims, wherein said guide means (13, 15) comprise a distance sensor (15) and/or a light-emitting component (13).
20. A system as defined in any of the preceding claims, wherein the digitalised sorting rack (10) comprises at least two digitalised sorting rack sections (114) connected to the computer controller unit, each of the rack sections being connected to a switch (40) connected to the computer controller unit for switching the mode of the digitalised sorting rack section from the mode sorting into the rack to the mode collecting from the rack and vice versa, each of the digitalised sorting rack sections being apt to be freely set either into the mode sorting into the rack or the mode collecting from the rack.
21. A system as defined in any of the preceding claims, wherein the object is a postal item (2) equipped with an address, a technical identifier or any similar identifier data affixed to the postal item or a substrate (5) affixed to the postal item.
22. A system as defined in claim 21, wherein said technical identifier or other identifying data contain information about the delivery point, address, changed address, pick-up, pick-up and delivery or any similar service operation received by the consignee.
23. A system as defined in any of the preceding claims, wherein the identifier data are address data comprising the postal code, street, building, stair or apartment data in alphabetic and/or numeric order.
24. A system as defined in any of the preceding claims, wherein the identifier data are address data based on data in the postman's route map.
25. A system as defined in claim 23 or 24, wherein said collection list is based at least partly on said identifier data.
26. A system as defined in any of the preceding claims, wherein the sorting data and the collection list are stored in a database, where they can be updated.
27. A system as defined in any of the preceding claims, wherein the sorting data and the collection list are stored in a database (60), where they can be updated in real time over a data communication network.
28. A system as defined in any of the preceding claims, wherein the guide means, including the indicator means (13) and the recognition means (15), are disposed for wireless control over the I/O controller (50) connected to the computer controller unit (30).
29. A method for sorting at least one object equipped with identifier data in a sorting rack comprising a plurality of delivery bins, the method comprising steps for
- connecting a delivery bin to the computer controller unit (500),
- storing the sorting data in a database (504) communicating with the computer,
- placing the object to be sorted in the identification unit connected to the computer controller unit for reading (504) the identifier data,
characterised in that the method comprises steps for
- reading the at least partial identifier data (506) of the object,
- comparing the read identifier data with the sorting data (508),
- complementing the read partial identifier data to corrected identifier data (510), and
- directing the object to the delivery bin (512) on the basis of the comparison.
30. A method as defined in claim 29, wherein
- imaging the at least partial identifier data of the object with a camera (506, 606, 702),
- storing the image obtained in a memory (506, 606) in connection with the camera,
- comparing the imaged identifier data with the sorting data in the database by means of a comparison algorithm (508, 712, 800),
- complementing the stored image to corrected identifier data by means of a correction algorithm (510, 706, 708) controllable on the computer display,
- generating reference data with a comparison algorithm (510, 610) and
- directing the object to the delivery bin on the basis of the reference data (512, 612, 714).
31. A method as defined in claim 29, wherein
- receiving the partial identifier data of the object with a radio-frequency receiver (506),
- storing the received radio transmission in a memory (506) in connection with the receiver,
- comparing the corrected identifier data with the sorting data stored in the database by means of a comparison algorithm (508),
- complementing the stored radio transmission to corrected identifier data by means of a correction algorithm (510) controllable on the computer display,
- generating reference data by the comparison algorithm (510) and
- directing the object to the delivery bin on the basis (512) of the reference data.
32. A method as defined in claim 30 or 31, wherein generating the correlation data (810, 814, 818, 822) of the identifier data of the object on the basis of the reference data generated by the comparison algorithm, the object being directed to the delivery bin (824) on the basis of the correlation data.
33. A method as defined in claim 30, wherein the corrected identifier data generated by the correction algorithm is a straightened image of the read partial identifier data (710) of the object.
34. A method as defined in claim 30, wherein the detectability of the identifier data is improved by orienting a first light source, which is at a suitable distance from the camera and has an adjustable light incident angle, at a suitable incident angle to the imaging area defined by the camera lens, the identifier data in the object being directed to the imaging area.
35. A method as defined in claim 34, wherein a second light source is oriented to the imaging area defined by the camera lens, the second light source being disposed perpendicularly to the camera lens on the side of the object opposite to the camera lens, and that the object with its identifier data is placed in the imaging area, the light supply from the second light source to the lens received by the camera being prevented, the identifier data of the object being imaged.
36. A method as defined in claim 35, wherein the camera pulsation is synchronised in the same phase as the second light source and in the opposite phase relative to the first light source, the identifier data of the object being imaged under the control of said pulsation.
37. A method as defined in any of claims 34-36, wherein the first light source and the second light source, comprising five light sources each, are placed in cross shape with one of the light sources in the centre of the cross.
38. A method as defined in any of claims 34-37, wherein the camera is placed freely either in a stand or on the shoulder, head, helmet or any other suitable location on the person who is performing the sorting.
39. A method as defined in any of claims 29-38, wherein the identifier data of the object are stored as an alphanumeric text, a bar code, a radio-frequency identifier, an electric identifier, an optic identifier or any similar identifier, which is disposed on the object or on a substrate affixed to the object.
40. A method as defined in any of claims 29-31, wherein the delivery bins are recognised and assigned for guiding the sorting rack and that the operating mode of the digitalised sorting rack is switched either into a mode for sorting into the rack or into a mode for collecting from the rack (502).
41. A method as defined in any of claims 29-40, wherein the delivery bin into or from which the object is sorted is assigned on the basis of a control signal received from the computer controller unit and that an acknowledgement signal is transmitted to the computer controller unit when a performed sorting operation (530, 606) is identified.
42. A method as defined in claim 41, wherein a light-emitting component in connection with the delivery bin is activated on the basis of the control signal received from the computer controller unit to indicate the delivery bin into or from which the object is sorted (512, 522, 612).
43. A method as defined in claim 40, wherein
- a control signal is generated for guiding the delivery bins on the basis (520) of a collection list comprising the identifier data of the objects stored in a database, and
- the collection of the object is guided on the basis (522) of said control signal.
44. A method as defined in claim 43, wherein the delivery bin (522) in turn to be collected according to the collection list is indicated with the first indicator means of the first guide means on the basis of the control signal, and the delivery bin (524) following the delivery bin in turn to be collected according to the collection list is indicated discriminately with the second indicator means of the second guide means.
45. A method as defined in claim 43, wherein an acknowledgement signal is transmitted from the second recognition means of the second guide means for recognition (530) of a collection operation, a new control signal is generated on the basis of the collection list (520), the indication of the first indicator means of the first guide means is ended, the discriminate indication of the second indicator means of the second guide means is ended, and the delivery bin (522) in turn to be collected is indicated with the second indicator means of the second guide means, the delivery bin (524) following the delivery bin in turn to be collected according to the collection list is indicated discriminately with the third indicator means of the third guide means, and the third recognition means of the third guide means is set to recognise the collecting operation.
46. A method as defined in any of claims 44-45, wherein the delivery bin is indicated by emission of continuous light (522) and that the delivery bin is discriminately indicated by emission of light (524) that is blinking or of different colour.
47. A method as defined in any of claims 29-46, wherein said guide means indicate a delivery bin with the indicator means in said delivery bin and recognise a sorting operation with a recognition means in the delivery bin array comprising a plurality of delivery bins.
48. A method as defined in any of claims 29-46, wherein the guide means detect a distance and/or emit light.
49. A method as defined in any of claims 29-48, wherein the digitalised sorting rack is divided into at least two digitalised sorting rack sections and that each rack section is freely set in either a mode for sorting into the rack or a mode (502) for collecting from the rack.
50. A method as defined in any of claims 29-49, wherein the object is a postal item, which is equipped with an address, a technical identifier or any similar identifier data, which are affixed to the postal item or to a substrate fixed to the postal item.
51. A method as defined in claim 50, wherein said technical identifier or other identifier data contain information about the delivery point, address, changed address, pick-up, pick-up and delivery or any other service operation received by the consignee.
52. A method as defined in any of claims 29-51, wherein the identifier data are address data comprising the postal code, street, building, stair and apartment data in alphabetic and/or numeric order.
53. A method as defined in any of claims 29-52, wherein the identifier data are address data based on the data in the postman's route map.
54. A method as defined in claim 52 or 53, wherein said collection list is based at least partly on said identifier data.
55. A method as defined in any of claims 29-54, wherein the sorting data and the collection list are stored in a database, where they are updated.
56. A method as defined in any of claims 29-55, wherein the sorting data and the collection list are stored in a database, where they are updated in real time over a data communication network.
57. A method as defined in any of claims 29-56, wherein the guide means, which include recognition means and indicator means, are controlled by wireless means over an I/O controller connected to the computer controller unit.
58. A computer program comprising an encoded computer programmable product, characterised in that it performs the method steps of claim 29.
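As a further illustration only, the collecting mode described in claims 15-17 and 43-46 can be sketched as a walk through the collection list in which the delivery bin in turn is indicated with continuous light, the following bin is indicated discriminatingly (for example by blinking), and an acknowledgement from the recognition means advances the list. The sketch below simplifies the arrangement to one recognition means per bin, and the names STEADY, BLINK, OFF, set_light and wait_for_acknowledgement are assumptions of the sketch, not features of the claims.

STEADY, BLINK, OFF = "steady", "blink", "off"

def run_collection(collection_list, set_light, wait_for_acknowledgement):
    # collection_list: delivery bin identifiers in collecting order
    # set_light(bin_id, state): drives the indicator means of a delivery bin
    # wait_for_acknowledgement(bin_id): blocks until the recognition means reports a collection operation
    for i, bin_in_turn in enumerate(collection_list):
        set_light(bin_in_turn, STEADY)                    # bin in turn: continuous light
        if i + 1 < len(collection_list):
            set_light(collection_list[i + 1], BLINK)      # next bin: discriminating indication
        wait_for_acknowledgement(bin_in_turn)             # acknowledgement signal of the recognition means
        set_light(bin_in_turn, OFF)                       # stop indicating the collected bin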

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/FI2004/000640 WO2006045878A1 (en) 2004-10-29 2004-10-29 Method and system for sorting a post item in a delivery bin according to identification information
EP04791431A EP1812175A1 (en) 2004-10-29 2004-10-29 Method and system for sorting a post item in a delivery bin according to identification information
NO20072725A NO20072725L (en) 2004-10-29 2007-05-29 Method and System for Sorting a Mail Item in a Delivery Container in accordance with Identification Form

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2004/000640 WO2006045878A1 (en) 2004-10-29 2004-10-29 Method and system for sorting a post item in a delivery bin according to identification information

Publications (1)

Publication Number Publication Date
WO2006045878A1 true WO2006045878A1 (en) 2006-05-04

Family

ID=36227504

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2004/000640 WO2006045878A1 (en) 2004-10-29 2004-10-29 Method and system for sorting a post item in a delivery bin according to identification information

Country Status (3)

Country Link
EP (1) EP1812175A1 (en)
NO (1) NO20072725L (en)
WO (1) WO2006045878A1 (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08101879A (en) * 1994-09-30 1996-04-16 Toshiba Corp Mail processor
JPH08243503A (en) * 1995-03-14 1996-09-24 Hitachi Ltd Postal matter reading and classifying device
JPH11235555A (en) * 1998-02-20 1999-08-31 Toshiba Corp Postal item processing apparatus and its control method
JPH11253891A (en) * 1998-03-11 1999-09-21 Hitachi Ltd Mail sorting method and device
US6246925B1 (en) * 1998-04-01 2001-06-12 Forest Robinson Computerized manual mail distribution method and apparatus
JP2002042057A (en) * 2000-07-21 2002-02-08 Toshiba Corp Reading device, reading method, sorting device and sorting method
US20020113365A1 (en) * 2001-01-09 2002-08-22 Britton David Thomas Sorting system
US20030191651A1 (en) * 2001-01-24 2003-10-09 Hungerpiller Ralph Mitchell System and method for processing returned mail
US20030038065A1 (en) * 2001-08-01 2003-02-27 Pippin James M. Apparatus and method for mail sorting
WO2003035282A2 (en) * 2001-10-15 2003-05-01 Deutsche Post Ag Method and device for processing mail
WO2003086664A2 (en) * 2002-04-12 2003-10-23 Tritek Technologies, Inc. Mail sorting processes and systems

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1996, no. 08 30 August 1996 (1996-08-30) *
PATENT ABSTRACTS OF JAPAN vol. 1997, no. 01 31 January 1997 (1997-01-31) *
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 13 13 November 1999 (1999-11-13) *
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 14 22 December 1999 (1999-12-22) *
PATENT ABSTRACTS OF JAPAN vol. 2002, no. 06 4 June 2002 (2002-06-04) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018077011A1 (en) * 2016-10-25 2018-05-03 北京京东尚科信息技术有限公司 Visual identification system and method thereof, and classifying and sorting system and method thereof
US11049278B2 (en) 2016-10-25 2021-06-29 Beijing Jingdong Qianshi Technology Co., Ltd. System and method for visual identification, and system and method for classifying and sorting
CN106670111A (en) * 2016-12-08 2017-05-17 顺丰科技有限公司 Sorting prompt device and application method thereof
EP3427847A1 (en) * 2017-07-13 2019-01-16 Bpost NV van publiek recht Sorting station for mail items and method for sorting mail items
CN111855658A (en) * 2020-07-28 2020-10-30 山东科技大学 Coal petrography discernment appearance
CN113457996A (en) * 2021-07-19 2021-10-01 山东宏葵医学检验实验室股份有限公司 Automatic sorting and classifying storage device for medical inspection samples

Also Published As

Publication number Publication date
EP1812175A1 (en) 2007-08-01
NO20072725L (en) 2007-07-30

Similar Documents

Publication Publication Date Title
US10592715B2 (en) System and method for reading patterns using multiple image frames
EP2422294B1 (en) A multiple barcode detection system and method
US6398112B1 (en) Apparatus and method for reading indicia using charge coupled device and scanning laser beam technology
US5621457A (en) Sighting direction detecting device for vehicle
US7044378B2 (en) System and method for imaging and decoding optical codes using at least two different imaging settings
EP2341461B1 (en) Adaptive multi-sensor handheld computing device
EP1531949B1 (en) Apparatus and method for sorting articles by an operator with a detached indicator
US7604174B2 (en) Method and apparatus for providing omnidirectional lighting in a scanning device
US20120063643A1 (en) Methods, Systems, and Products for Gesture-Activation
US6637662B2 (en) Data code image reading apparatus
EP1870170A2 (en) Capturing a non-singulated image of a plurality of forms travelling on a moving conveyor belt
US5881890A (en) Mail sorting system and process
MX2007001504A (en) Systems and methods for using radio frequency identification tags to communicating sorting information.
WO1999064980A1 (en) Imaging engine and method for code readers
US7900840B2 (en) Methods and apparatus for directing bar code positioning for imaging scanning
US12039400B2 (en) Optical information reading device
EP1812175A1 (en) Method and system for sorting a post item in a delivery bin according to identification information
US7106896B2 (en) ID recognition apparatus and ID recognition sorter system for semiconductor wafer
CN101681510A (en) Registering device, checking device, program, and data structure
JP2019071018A (en) Optical information reader and optical information reading method
JPH10105873A (en) Device for recognizing number plate of vehicle
JPH11312210A (en) Symbol reader
JPH11281754A (en) Inventory management device
CN112822411A (en) Information processing apparatus, system and method thereof, lighting apparatus, and recording medium
JPH11278615A (en) Inventory management device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NA NI NO NZ PG PH PL PT RO RU SC SD SE SG SK SY TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2004791431

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2004791431

Country of ref document: EP