NZ556253A - Object tracking method and apparatus - Google Patents

Object tracking method and apparatus

Info

Publication number
NZ556253A
Authority
NZ
New Zealand
Prior art keywords
identification information
visual identification
location
captured
baggage
Prior art date
Application number
NZ55625307A
Inventor
Michael John Coates
Mark Watt
Guy Kristoffer Kloss
Martin Johnson
Original Assignee
Baggage Sortation Management Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baggage Sortation Management Ltd
Priority to NZ55625307A
Publication of NZ556253A


Abstract

A method of tracking objects within an object transportation system comprises a) providing an object identification code for an object; b) capturing visual identification information for the object at a first location within the system; c) relating the visual identification information captured at the first location with the object identification code; d) capturing visual identification information at a second location within the system; e) comparing the visual identification information captured at the second location to the visual identification information captured at the first location; and f) relating the visual identification information captured at the second location to the object identification code if the visual identification information captured at the first and second locations meet predefined comparison criteria. A baggage tracking apparatus comprises a conveyor to convey the object; a first camera (16a) to capture visual identification information for the object at a first location of the conveyor; a second camera (16b) to capture visual identification information for the object at a second location of the conveyor; a scanner (21) to provide an object identification code for the object; and a computer server adapted to carry out the above-described method of tracking objects.

Description

"S'S Q. S. "S> PATENTS FORM NO. 5 Fee No. 4: $250.00 After Provisional No: 556253 Dated: 29 June 2007 OBJECT TRACKING METHOD AND APPARATUS We Baggage Sortation Management Limited, a New Zealand company of Level 2, 3 Margot Street, Newmarket, Auckland, New Zealand hereby declare the invention for which we pray that a patent may be granted to us, and the method by which it is to be performed to be particularly described in and by the following statement: 1 PATENTS ACT 1953 COMPLETE SPECIFICATION James & Wells Ref: 232069 OBJECT TRACKING METHOD AND APPARATUS technical field This invention relates to a method and apparatus for tracking objects. In particular it relates to a method and apparatus for tracking baggage in a baggage handling 5 and/or sorting system.
Background Art

A variety of systems are known in which objects are transported, handled and sorted. A baggage handling or sortation system is one example. Typically, baggage sortation systems have a check-in where an item of baggage is tagged for identification and where information is assigned to that item of baggage for identification, sorting and handling purposes. For example, information may be assigned to the item of baggage identifying its destination within the system. This information may relate to a given loading bay at an airport for example. Information relating to security checks may also be assigned to the item of baggage. For example, the item may be identified as a candidate for specific security checks.
From the check-in the baggage may be transported by an automated conveyor to a predefined destination within the system. The conveyor may have sortation gates where items of baggage are moved onto alternative conveyors depending on the information assigned to the given item of baggage. These alternative conveyors may transport the item of baggage to given destinations or through given security checks.
As is appreciated by those skilled in the art, many baggage handling and sortation systems can become relatively complex, particularly in applications such as large airports.

In applications such as airports, it is also important that the items of baggage are accurately and securely tracked through the baggage handling and sortation system to reliably transport, handle and sort items of baggage. Therefore, a reliable system of tracking baggage is necessary to ensure that bags arrive at the correct destination and that suitable handling, such as security checks, is carried out within the system. Currently, high standards for reliability necessitate a relatively high level of manual checking and identification of baggage by human operators.
Known baggage tracking systems typically employ Automatic Tag Readers (ATRs) at given points within the baggage handling and sortation system. These automatic tag readers attempt to read the tags attached to the items of baggage as those items pass. They do this by reading barcodes associated with the tag. These barcodes must be read in a variety of orientations and positions as the bags pass the tag reader. To overcome the obvious difficulties in reading barcodes in a variety of orientations and positions, automatic tag readers are relatively costly devices to manufacture, install and maintain. Also, given the obvious difficulty in reading barcodes at random orientations and positions, automatic tag readers have limited reliability. Therefore, automatic tag readers need to be backed up by barcode readings taken with a hand-held reader. This operation requires human involvement.
Known baggage tracking systems also typically employ 'photo eyes' which detect whether a bag is in their proximity. These 'photo eyes' allow the approximate length of an item of baggage to be measured. The orientation of a bag on the conveyor will affect how this measurement of length correlates to the actual length of an item of baggage. Also, trailing straps can cause the 'photo eyes' to see merged baggage or 'ghost' baggage. Therefore, identifying an item of baggage with 'photo eyes' also has limited reliability.

Known baggage tracking systems also typically locate baggage by predicting the location of an item of baggage on a conveyor system at given times. These predictions are based on the tracking speed of the conveyor. These predictions also have limited reliability.
Conventional baggage tracking systems achieve reasonable reliability by combining ATRs, 'photo-eyes', conveyor speed based predictions, and human operation of barcode scanners.
However, the applicant has observed that some industries, such as the air travel industry in particular, would still benefit from greater reliability in baggage tracking.
The applicant has also observed that baggage handling and baggage tracking systems would benefit from cost reductions effected by the elimination of costly devices such as automatic tag readers and by the reduction or elimination of the need for human operators.
The applicant has also observed that baggage tracking systems could be made more reliable by positively identifying items of baggage by means other than directly reading the identification tag attached to the baggage. This would lead to improved reliability and reduced cost of baggage tracking systems.
Accordingly, it is an object of the present invention to provide an object tracking method and apparatus which involves identifying items of baggage without needing to directly read a barcode associated with the item of baggage, or at least to provide the public with a useful choice in baggage tracking systems.
As used herein the term 'information packet' is intended to broadly refer to any means of combining different sets of information for processing, transmission, and such like. As an example, which is not intended to be limiting, an information packet may be an object oriented encapsulation of information relating to a given object to be tracked.
As used herein the term 'visual identification information' is intended to broadly refer to any item, or set, of information which is capable of identifying an object visually.
As used herein the term 'embedded' is intended to have a standard meaning as understood in the field of hardware and software engineering and is intended to include, but not be limited to, firmware, programming for FPGAs, ASIC designs and such like.
As used herein the term 'computer' is intended to refer broadly to any form of computing device including personal computers, microprocessors, microcontrollers, digital signal processing devices, field programmable gate arrays, application specific integrated circuits, programmable logic arrays and such like.
As used herein the term 'comparison criteria' is intended to broadly refer to any criteria which might be used to compare given items, or sets, of information.
All references, including any patents or patent applications cited in this specification are hereby incorporated by reference. No admission is made that any reference constitutes prior art. The discussion of the references states what their authors assert, and the applicants reserve the right to challenge the accuracy and pertinency of the cited documents. It will be clearly understood that, although a number of prior art publications are referred to herein, this reference does not constitute an admission that any of these documents form part of the common general knowledge in the art, in New Zealand or in any other country.
It is acknowledged that the term 'comprise' may, under varying jurisdictions, be attributed with either an exclusive or an inclusive meaning. For the purpose of this specification, and unless otherwise noted, the term 'comprise' shall have an inclusive meaning - i.e. that it will be taken to mean an inclusion of not only the listed components it directly references, but also other non-specified components or elements. This rationale will also be used when the term 'comprised' or 'comprising' is used in relation to one or more steps in a method or process.
Further aspects and advantages of the present invention will become apparent from the ensuing description which is given by way of example only.
Disclosure of Invention

In one aspect the present invention provides a method of tracking objects within an object transportation system, the method including the steps of:

a) providing an object identification code for an object;

b) capturing visual identification information for the object at a first location within the system;

c) relating the visual identification information captured at the first location with the object identification code;

d) capturing visual identification information at a second location within the system;

e) comparing the visual identification information captured at the second location to the visual identification information captured at the first location; and

f) relating the visual identification information captured at the second location to the object identification code if the visual identification information captured at the first and second locations meet predefined comparison criteria.

This invention provides a means for positively linking the visual identification information captured at the second location with the object identification code which was related to the object at the first location. Therefore, this invention provides a means to positively identify the object.
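By way of illustration only, steps a) to f) may be arranged as in the following minimal sketch. The function names, the record structure and the threshold parameter are assumptions made for the purpose of illustration and do not form part of the specification:

```python
from typing import Callable, Optional

def track_object(bsm_code: str,
                 capture_first: Callable[[], dict],
                 capture_second: Callable[[], dict],
                 compare: Callable[[dict, dict], float],
                 threshold: float) -> Optional[str]:
    # a) the object identification code is provided by the caller
    # b) + c) capture at the first location and relate it to the code
    record = {"bsm": bsm_code, "visual_id": capture_first()}
    # d) capture visual identification information at the second location
    second_info = capture_second()
    # e) + f) compare, and relate the second capture to the code if the
    # predefined comparison criteria are met
    if compare(record["visual_id"], second_info) >= threshold:
        return record["bsm"]
    return None
```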
Preferably, the method also includes the step of relating the object identification code to an object sorting code.
Preferably, the first and second locations correspond to points along an object conveyor.
More preferably, the object conveyor may be a baggage conveyor.
Preferably, the method also includes the step of capturing a weight measured for the object at the first location within the system.
More preferably, the method also includes relating said captured weight measurement to the visual identification information.
Preferably, the method also includes the step of relating the visual identification information to the object sorting code if the visual identification criteria captured at the first and second locations meet said predefined comparison criteria.
Preferably, the method includes the step of comparing the visual identification information captured at the second location with visual identification information captured for a plurality of objects at the first location. This allows a number of objects passing the second location to be assessed to identify the object identification code of a given object at the second location.
Preferably, the method includes capturing a visual image with a digital camera.
Preferably, the visual identification information is provided by a machine vision algorithm.
More preferably, the method includes conducting the machine vision algorithm on a processor located within a camera housing.
Preferably, the visual identification information includes colour information of the object.
Preferably, the visual identification information includes texture information of the surface of the object.
Preferably, the visual identification information includes size and shape information of the object.
More preferably, the size and shape information is adjusted for given orientations and positions of the object.
Preferably, the visual identification information includes a record of distinguishing features of the object.
Preferably, the visual identification information may comprise less than two kilobytes of data.
Preferably, the method involves relating a weight measurement of the object to the object identifier code.
More preferably, the method involves capturing said weight measurement at the first location in the system.
Preferably, the method includes the step of capturing a weight measurement of an object at a second location in the system.
Preferably, the method includes capturing at the first location in the system a code provided on a label which is adapted to be physically associated with the object. Alternatively, the method includes printing a code onto a label at the first location in the system.
More preferably, said label includes a barcode.
More preferably, the method also includes the step of generating an object information packet by combining the visual identification information with the object identification code.
More preferably, the object information packet includes the weight measurement.
More preferably, the method includes generating a first object information packet with information captured at the at least one first location in the system.
More preferably, the method includes generating at least one second information packet with information captured at the at least one second location in the system.
More preferably, the method includes the step of storing at least one object information packet at a central storage facility.
More preferably, the central storage facility is a server.
More preferably, the method includes the step of verifying the object information packets captured at first and second points on the system.
More preferably, said verifying of the object information packets includes comparison of the visual identification information of at least two object information packets.
More preferably, verifying object information packets includes the step of updating visual identification information captured at the at least one first location in the system with visual identification information captured at the second location in the system, if said visual identification information meet said predefined comparison criteria.
Preferably, the visual identification information may include:

- colour histograms;
- size and shape determinations;
- pattern structures;
- key features extracted through localisation of high gradients in orthogonal directions in a captured image within a given colour space using the derivatives of the image in full dimensions; or
- any combination of the above.
More preferably, the colour histograms are hue histograms or saturation/chroma histograms.
More preferably, the size and shape determination is a boundary box.
More preferably, the pattern structures are extremes of solid colour or fine detailed regular patterns to be determined via a frequency distribution obtained from a Fourier transform.
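A non-limiting sketch of how such visual identification information might be computed with common image-processing primitives is given below. The OpenCV-based implementation, the bin counts and the frequency cut-off are illustrative assumptions; the sketch assumes the mask marks the bag region:

```python
import cv2
import numpy as np

def visual_fingerprint(image_bgr: np.ndarray, mask: np.ndarray) -> dict:
    """Sketch of the preferred fingerprint components (illustrative only)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)

    # Colour histograms: hue and saturation, normalised to be size-invariant
    hue_hist = cv2.calcHist([hsv], [0], mask, [32], [0, 180]).flatten()
    sat_hist = cv2.calcHist([hsv], [1], mask, [32], [0, 256]).flatten()
    hue_hist /= max(hue_hist.sum(), 1.0)
    sat_hist /= max(sat_hist.sum(), 1.0)

    # Basic size and shape: the bounding box of the masked bag region
    ys, xs = np.nonzero(mask)
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

    # Pattern structure: share of spectral energy away from low frequencies;
    # near 0 for solid colour, higher for fine detailed regular patterns
    grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(float)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(grey)))
    cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    low = spectrum[cy - 8:cy + 8, cx - 8:cx + 8].sum()
    pattern = 1.0 - low / spectrum.sum()

    return {"hue": hue_hist, "sat": sat_hist, "bbox": bbox, "pattern": pattern}
```

A fingerprint of this form (two 32-bin histograms, a box and a scalar) comfortably fits within the preferred two-kilobyte budget.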
More preferably, the step of verifying the object information packets includes the step of comparing the weight measurements stored in said two or more object information packets.
Preferably, the method includes capturing visual identification information for a plurality of objects at the first location in the system.
Preferably, the step of capturing visual identification information includes the step of capturing a visual image.
More preferably, the step of capturing visual identification information also includes the step of processing said visual identification information.
More preferably, said processing of visual images is defined as a set of embedded processor executable instructions.
More preferably, said processor executable instructions are adapted to be executable on a computer network.
More preferably, said processor executable instructions include instructions executable by a digital signal processing chip.
More preferably, said processor executable instructions include instructions executable on a microprocessor.
In another aspect of the invention there is provided a set of executable instructions stored on a computer readable medium, adapted to carry out the aforesaid method.
In another aspect of the invention there is provided a baggage tracking apparatus adapted to carry out the aforesaid method.
Brief Description of Drawings

Further aspects of the present invention will become apparent from the following description which is given by way of example only and with reference to the accompanying drawings in which:

Figure 1 depicts a simple conventional baggage handling and sorting system;

Figure 2 depicts a simple baggage handling and sorting system incorporating a baggage tracking system according to a preferred embodiment of the present invention;

Figure 3 depicts part of a system architecture for a baggage tracking system according to the same preferred embodiment of the present invention as figure 2;

Figure 4 depicts the process carried out at a check-in of a baggage tracking system according to the same preferred embodiment as figures 2 and 3;

Figure 5 depicts an algorithm defining the operation depicted in figure 4;

Figure 6 depicts the process carried out at a second location within a baggage tracking system according to the same preferred embodiment of the present invention as figures 2 to 5;

Figure 7 depicts the process carried out at a second location of a baggage tracking system according to the same preferred embodiment as Figure 6, in this case the process is combined with a weight measurement comparison;

Figure 8 depicts an algorithm followed by the operation depicted in figure 7;

Figure 9 depicts an overview of the algorithm carried out by a baggage tracking system according to the same preferred embodiment of the present invention as figures 2 to 8; and

Figure 10 depicts a UML activity diagram illustrating visual identification information capture and comparison processes executed in accordance with further embodiments of the invention.

Best Modes for Carrying Out the Invention

Figure 1 depicts a simple conventional baggage sorting and handling system (1). Baggage (not shown) enters the system (1) at a check-in desk (2) where it is placed on a conveyor (3). The conveyor (3) transports the baggage to another location in the system (1). The conventional baggage sorting and handling system (1) has a sorting gate (4) which moves baggage from the conveyor (3) onto another conveyor (5) leading to a baggage handling area. This conveyor or baggage handling area (5) may be used to transport baggage to a specific loading bay, for aircraft for example. The conveyor (5) may alternatively transport the baggage through a security check, such as an x-ray facility.
Figure 2 depicts a simple baggage handling and sorting system which incorporates a baggage tracking system (11) according to a preferred embodiment of the present invention. Those skilled in the art will understand that the baggage tracking system depicted in figure 2 is intended only to disclose the preferred embodiment of the present invention in a simple form.
Those skilled in the art will also appreciate that the application of the preferred embodiment is not limited to tracking baggage, and may be applied to tracking virtually any objects within a variety of object transportation systems.
The baggage tracking system (11) has two cameras (16a, 22) which view different locations along the conveyor (13). Camera (22) views a first location along the conveyor (13), near the check-in (12). Another camera (16a) views a second location along the conveyor (13) where it is necessary to identify items of baggage. For example, camera (16a) could be positioned immediately before a sorting gate (14). Some preferred embodiments of the present invention have the two cameras combined into one unit to allow them to share processors (not shown).

Figure 3 depicts a system architecture for the baggage tracking system (11) of figure 2.
Figure 3 shows two cameras (16a and 16b) connected to a controller (115). Also connected to controller (115) is a weight sensing conveyor section (23). This conveyor section (23) will typically be located at, or near, the check-in (12 as shown in figure 2).
The controller (115) communicates with a central server (not shown). Other controllers (not shown) communicate with the same central server (not shown).
In the preferred embodiment, each location, such as at the check-in (12) and immediately before the sorting gate (14), has a dedicated camera (16a), weight sensing conveyor (23) and controller (115). The controller (115) may optionally control the weight sensing conveyor (23).
Figure 3 also shows a barcode scanner (21) connected to the controller (115). The barcode scanner (21) may take the form of either a handheld barcode scanner or an automatic tag reader. Suitable examples of these and alternatives for these are known to those skilled in the art. The barcode scanner reads a code from a label which is physically associated with an item of baggage. Alternative embodiments might have a barcode printer at the check-in (12 as shown in figure 2). These would apply an allocated code to a label that is then attached to the baggage.
Figure 4 depicts the operation of the preferred embodiment of the present invention. The process depicted in Figure 4 is carried out at the check-in (12 as shown in figure 2).
The operation of the preferred embodiment, as depicted in Figure 4, is also depicted as an algorithm in Figure 5.

With reference to Figure 4, the check-in (12 as shown in figure 2) has a barcode scanning facility (21). It also has a camera (22) which is directed to view an item of baggage placed on a weight sensing conveyor (23). The weight of the item of baggage is measured indirectly through tension between two rollers (24a, 24b). Suitable weight sensors combined with conveyors are known to those skilled in the art. A weight measurement (26) is captured by the controller (115 in Figure 3).
The check-in (12) also has a data entry terminal (not shown) at which information relating to the item of baggage (20) can be entered. The data entry facility (not shown) is connected to the controller (115 in Figure 3). The same data entry terminal (not shown) may be provided with a software facility which assigns predefined information to given items of baggage (20) or to given barcode labels (18). Assigned information can include sorting codes, or laterals (17), which are used to define a destination for the item of baggage.
The camera (22) typically includes processors (not shown) such as a digital signal processing (DSP) chip and a microprocessor. These processors (not shown) are adapted to take the image of the item of baggage (20) captured by the camera (22) and to provide visual identification information (25), depicted here by a set of spectra.
The software facility associated with the data entry terminal (not shown) assigns a Baggage Sortation Management code (BSM) (19) to the barcode label (18). This BSM (19) is an item identification code for the given item of baggage.
The controller (115 in Figure 3) has a software facility which combines the baggage sortation management code (19) and the visual identification information (25) into an object information packet (27) representing a given item of baggage (20). Optionally, the controller (115) may include the sorting code (17) or the weight measurement (26) of the item of baggage (20).
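By way of example only, the object information packet (27) may be represented as a simple record; the field names and types below are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObjectInformationPacket:
    """Sketch of the object information packet (27); illustrative only."""
    bsm_code: str                       # BSM / object identification code (19)
    visual_id: dict                     # visual identification information (25)
    sorting_code: Optional[str] = None  # sorting lateral (17), optional
    weight_kg: Optional[float] = None   # weight measurement (26), optional
```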
Figure 5 depicts an algorithm carried out at the check-in (12) by the baggage tracking system (11) according to the preferred embodiment. At step (30) an item of baggage (20) is placed on the conveyor (23), which is optionally a weight sensing conveyor.
At step (31) the item of baggage (20) is provided with a barcode label (18) which has a predefined BSM (19).
At step (32) the barcode label and associated baggage sortation management code (19) is read into the controller (115) by a hand-held barcode scanner (21).
At step (33) a sorting code (17) is optionally assigned to the item of baggage (20), or its associated BSM (19), by a facility (not shown) associated with a controller (115). This involves relating a sorting code (17) to a BSM (19).
At step (34) a visual image of the item of baggage (20) is captured with the camera (22).
At step (35) visual identification information (25) is generated by processors associated with the camera (22) by applying machine vision algorithms to the visual image captured with the camera (22). These processors are optionally provided at the camera circuit board or at 'board level'.
At step (36) the weight of the item of baggage is captured by the controller (115) with the weight sensor (23).
At step (37) the controller (115) generates a baggage information packet, or object information packet (27), for the item of baggage (20). The item information packet according to the preferred embodiment relates, or combines, a Baggage Sortation Management code (BSM) (19) with visual identification information (25) and, optionally, a sorting lateral (17) and a weight measurement (26). The weight measurement (26) provides a means of identifying the item of baggage in addition to the visual identification information (25). As will be understood by those skilled in the art, the probability of an incorrect reading with the visual identification information (25) combined with the weight measurement (26) is considerably lower than the corresponding probability where only one of these parameters is captured.
At step (38), the item information packet (27) is transmitted by the controller (115) to a central server (not shown).
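The check-in steps (31) to (38) may be summarised, for illustration only, as the following sketch. The device objects and their method names are assumptions, and the packet class is the sketch given above:

```python
def check_in(controller, scanner, camera, weight_sensor) -> ObjectInformationPacket:
    """Illustrative sketch of the Figure 5 check-in algorithm."""
    bsm = scanner.read_barcode()                    # steps (31)-(32)
    lateral = controller.assign_sorting_code(bsm)   # step (33), optional
    image = camera.capture()                        # step (34)
    visual_id = camera.machine_vision(image)        # step (35), at board level
    weight = weight_sensor.read_kg()                # step (36)
    packet = ObjectInformationPacket(bsm, visual_id, lateral, weight)  # (37)
    controller.send_to_server(packet)               # step (38)
    return packet
```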
Figure 6 depicts the operation of the baggage tracking system (11) at a second location within the baggage handling and sorting system (11) where the second camera (16a) is positioned. This location might be immediately before a sorting gate (14), for example.
The camera (16a) captures an image of the item of baggage (40) while it is on a section of conveyor (43) which may incorporate a weight sensor. At this point the identity of the item of baggage (40) is unknown.
The camera (16a), similar to the camera (22), has board-level processors (not shown) which generate visual identification information (45) from a visual image. The central controller (115) compares the visual identification information (45) with visual identification information, such as (25), stored on a central server (not shown) to determine whether they meet predefined comparison criteria, or whether they match. Various algorithms, methods and criteria for matching will be known to those skilled in the art. Some embodiments may carry out this comparison at a central processor (not shown). For example, the visual identification information (25 and 45) may be parameterised and compared using database queries.
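One possible form of the predefined comparison criteria, sketched under the assumption that the visual identification information contains the normalised histograms from the earlier fingerprint sketch, is a histogram-correlation test; the threshold value is an illustrative assumption:

```python
import cv2
import numpy as np

def fingerprints_match(vid_a: dict, vid_b: dict,
                       hist_thresh: float = 0.9) -> bool:
    """Illustrative comparison criterion: correlation of hue and
    saturation histograms must both exceed the threshold."""
    hue = cv2.compareHist(np.float32(vid_a["hue"]), np.float32(vid_b["hue"]),
                          cv2.HISTCMP_CORREL)
    sat = cv2.compareHist(np.float32(vid_a["sat"]), np.float32(vid_b["sat"]),
                          cv2.HISTCMP_CORREL)
    return min(hue, sat) >= hist_thresh
```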
As shown in figure 6, the visual identification information (45) relating to the item (40) matches that of the item information packet (27). This indicates that the BSM (19) relates to the visual identification information (45), or that item (40) is item (20).
Figure 7 depicts the operation of the same part of the baggage tracking system (11) as Figure 6. In this case, a comparison is made using weight measurements (46) and (26) in addition to the comparison of the visual identification information (45) and (25). The weight measurement (46) is compared to a similar weight measurement (26) captured at the check-in (12).
As will be apparent to those skilled in the art, matching the item of baggage (40) with the item of baggage (20) will require comparison of the visual identification information (45), and possibly the weight measurement (46), with potentially all of the baggage information packets stored at the central server. This is done until a match is found. Typically, this is done in a timeframe in which a given item of baggage (40) is still on the conveyor (43).
Figure 8 depicts the operation shown in figures 6 and 7 as an algorithm.
At step (50) an item of baggage (40) arrives at a second section of the conveyor (43).
At step (51) an image of the item of baggage (40) is captured by the camera (16a).
At step (52) processors associated with the camera (16a) generate visual identification information (45) from the image captured at step (51).
At step (53) the visual identification information (45) is compared to visual identification information contained in baggage information packets, such as (27), stored by the central server (not shown). Eventually a match is found between the visual identification information (45) and (25), for example. This indicates that the item of baggage (40) is the same item of baggage checked in at the check-in (12). This indicates a relationship between the visual identification information (45) and the baggage sortation management code (19), which identifies that the item (40) is the item (20). It will be appreciated by those skilled in the art that machine vision algorithms will be able to provide visual identification information that allows for bags to reorient or fall over and still be represented by the same or matching visual identification information.
At step (54) a weight measurement (46) captured at the second location is compared with the weight measurement (26) of the item information packet (27) which has been identified as having visual identification information (25) that is a likely match for the visual identification information (45). The comparison of the weight measurement (46) with the weight measurement (26) confirms the likely match between visual identification information (45) and visual identification information (25). As will be understood by those skilled in the art the probability of a mismatch when comparing both the visual identification information (45) and (25) and the weight measurements (46) and (26) is negligible. Therefore, the preferred embodiment of the present invention provides reliable means for identifying items of baggage, such as (40) and (20), and identifying the BSM or identification code which corresponds to the visual identification information (45) captured at the second location along the conveyor.
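For illustration, steps (53) to (55) may combine the visual and weight comparisons as follows; the weight tolerance is an assumed value and fingerprints_match is the sketch given earlier:

```python
def identify(packet_candidates, vid_45: dict, weight_46: float,
             weight_tol_kg: float = 0.5):
    """Illustrative sketch: find the packet whose visual identification
    information matches (45) and whose weight (26) confirms it against (46)."""
    for packet in packet_candidates:
        if fingerprints_match(packet.visual_id, vid_45) and \
           packet.weight_kg is not None and \
           abs(packet.weight_kg - weight_46) <= weight_tol_kg:
            return packet.bsm_code   # step (55): relates BSM (19) to (45)
    return None
```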
At step (55) the BSM (19) assigned at the check-in (12) is related to the visual identification information (45) captured at the second location, the sorting gate (14 of Figure 2), for example. In essence this identifies which item of baggage is at the sorting gate (14).

Figure 9 depicts how an item of baggage (20) is actually sorted using a sorting gate (14) and a baggage tracking system (11) according to a preferred embodiment of the present invention.
At step (60) the item of baggage is checked in at the check-in (12) and a baggage information packet, or item information packet, (27) is generated and transmitted to a central server as previously described. The baggage information packet (27) optionally includes a sorting lateral (17) which has been assigned to the item of baggage (20).
At step (61) visual identification information (45) and, optionally, weight measurements (46) are captured for each item of baggage (40) passing camera (16a) at a second location. These are used to identify the baggage (40) as being the item of baggage (20) checked in at the first point in the baggage handling and sorting system (11). In this embodiment camera (16a) is located above a section of the conveyor (13) immediately before the sorting gate (14) but other locations will be obvious to those skilled in the art.
At step (62) the visual identification information (45) is compared to that of each of the item information packets (27) stored at a central server (not shown) until a match is found. The same process might be repeated using a comparison of weights (46) and (26).
Once the item of baggage (40) has been identified as the item of baggage (20), the sorting code (17), if present, of the baggage information packet (27) is used to determine whether the sorting gate (14) is to be actuated to move the item of baggage (40/20) onto another conveyor or handling area. If the baggage information packet (27) does not include a sorting code (17), some other action may be taken. For example, the location of the item of baggage (20) may be recorded.
At step (63), if a match is found, the sorting gate (14) is activated as appropriate for the sorting lateral (17) and, if appropriate, the item of baggage (40/20) is moved to the handling area (15).
Typically, this process is repeated at multiple points along the conveyor (13) where additional cameras and sorting gates are located.
As the baggage is reliably identified by use of both visual identification information and weight measurements, the sorting of the items of baggage will be virtually completely reliable.
Figure 10 depicts a UML activity diagram illustrating visual identification information capture and comparison processes executed in accordance with further embodiments of the invention.
In particular figure 10 illustrates three distinct operational modules, being normalisation and colour adaptation, 'visual fingerprint' determination, and 'visual fingerprint' matching. In the application of the present invention described below, bags are present within images acquired by the system, and visual fingerprints are determined for each bag present on a conveyor. The operation of each of these modules is discussed further below.
Image Normalization

Images acquired by cameras in distributed locations within the system are deteriorated due to various influences. These influences are:

• Motion (resulting in motion blur),

• Background (different visual characteristics of the baggage belt and surrounding environment in different locations),

• Varying illumination conditions (natural, artificial, changed due to the time of the day and/or external weather conditions, etc.) and

• Differences in image capturing devices' perception (differences in the cameras due to model variation, batch production variation and ageing effects).
To perform a robust determination of specific metrics when processing the captured image stream, pre-processing steps have to be performed to compensate for these predictable deteriorations. Thus the following preventive means will need to be undertaken (a sketch of the first two steps follows the list below):

• Motion blur on overhead imaging (perpendicular to the conveyor belt) can be approximated as a fairly homogeneous blur, which can be compensated for using Wiener filtering with an appropriate deterministically constructed, fixed deterioration function.
• Background elimination will be performed on the basis of an analysis within an area of interest coupled with masking out areas with little change of information within a suitable time range.
• Illumination condition changes and variations in the devices' capturing characteristics can be dealt with in a single, but more complex, step. Using ICC profiles the images will be transformed into a normalized and device independent colour space (International Color Consortium specified process). This profile is specific for each individual camera, and is dependent on the illumination conditions. As the illumination conditions (time of day, weather conditions, natural/artificial light) as well as the operation conditions of the camera may vary (temperature dependence) the profile will be modified adaptively employing Artificial Intelligence (AI) means suitable for the purpose (using Neural Networks and/or Artificial Immune Systems). The AI will perform the profile adaptation using detectable drifts of fixed and known image portions (belt, environment) and previously determined colour histogram metrics of the bag on the belt relative to current detection. Due to the nature of the AI, the process is capable of adapting to slow (time of day, weather and temperature) and fast (natural/artificial illumination) changes.
The background mask is also used for the ICC profile adaptation as it separates the non-changeable fragments of the image from the changing fragments (i.e. a bag).
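A minimal sketch of the first two pre-processing steps (motion-blur compensation and background elimination) is given below, assuming scikit-image's Wiener deconvolution and a simple frame-difference mask. The PSF length, balance term and threshold are illustrative assumptions, and the adaptive ICC profile step is not shown:

```python
import cv2
import numpy as np
from skimage.restoration import wiener

def normalise_frame(frame_bgr, background_bgr, blur_len=9, diff_thresh=25):
    """Illustrative sketch of motion-blur compensation and background
    elimination; parameter values are assumptions."""
    # Motion blur compensation: Wiener deconvolution with a fixed,
    # deterministically constructed horizontal (belt-direction) blur PSF
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY) / 255.0
    psf = np.zeros((blur_len, blur_len))
    psf[blur_len // 2, :] = 1.0 / blur_len
    deblurred = wiener(grey, psf, balance=0.1)

    # Background elimination: mask out pixels that change little relative
    # to the known belt/environment within the area of interest
    diff = cv2.absdiff(frame_bgr, background_bgr).max(axis=2)
    mask = (diff > diff_thresh).astype(np.uint8)

    return deblurred, mask
```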
Determination of Visual Fingerprint

On the basis of the normalized image the visual metrics composing the visual fingerprint can be determined. A set of computable and robust properties (robust to multiple detection in changing locations and to limited change in spatial orientation) usable for the fingerprint is determined, being:

• Colour histograms (hue histogram and saturation/chroma histogram)

• Basic size and shape (bounding box)

• Pattern structure (extremes as solid colour or fine detailed regular pattern, can be determined analysing the frequency distribution after a Discrete Fourier Transformation)

• Key Feature Extraction (these areas are determined through localization of high gradients in orthogonal directions of the image within a given colour space using the derivatives of the image in all dimensions).

Matching Algorithm

The bags are moved on a conveyor belt in a normally undisturbed order. Thus the corresponding fingerprint in the sequence of recorded fingerprints within the system is in the worst case approximately known. For matching the fingerprint of a given bag against the recorded fingerprints for identification, only a subset of records around the approximated location will need to be searched. The match will be positive for a record with a high correlation certainty in proximity to records with significantly lower correlation certainties.
In the case of an insufficiently distinct positive correlation the algorithm extends the search domain to locate the fingerprint within a wider range in case the order of the bags on the belt has been changed/disturbed.
Due to the nature of inevitable margins of error of measurements extracted from the image analysis, and the fact that details of bags may look different in different locations (e.g. due to a moved bag handle, etc.), the matching algorithm must provide sufficient capabilities to withstand these errors. For the determination of a correlation certainty a combination of Artificial Intelligence (AI) means suitable for the purpose (Fuzzy Logic, Neural Networks and/or Artificial Immune Systems) is employed. These (hybrid) AI systems are well known for providing robust matching capabilities in the presence of disturbing influences.
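The windowed search and its widening behaviour may be sketched as follows. A simple histogram-correlation score stands in for the (hybrid) AI correlation certainty described above, and all thresholds and window sizes are illustrative assumptions:

```python
import cv2
import numpy as np

def correlation_certainty(stored_vid: dict, query_vid: dict) -> float:
    # Stand-in for the (hybrid) AI certainty: the weaker of the hue and
    # saturation histogram correlations
    hue = cv2.compareHist(np.float32(stored_vid["hue"]),
                          np.float32(query_vid["hue"]), cv2.HISTCMP_CORREL)
    sat = cv2.compareHist(np.float32(stored_vid["sat"]),
                          np.float32(query_vid["sat"]), cv2.HISTCMP_CORREL)
    return min(hue, sat)

def match_fingerprint(query_vid, records, expected_idx,
                      window=3, max_window=24, margin=0.2, floor=0.8):
    """Search a subset of records around the approximated belt position,
    widening the domain if the order has been disturbed (illustrative)."""
    while window <= max_window:
        lo = max(0, expected_idx - window)
        hi = min(len(records), expected_idx + window + 1)
        scores = [(correlation_certainty(r.visual_id, query_vid), r)
                  for r in records[lo:hi]]
        scores.sort(key=lambda t: t[0], reverse=True)
        best_score, best = scores[0]
        runner_up = scores[1][0] if len(scores) > 1 else -1.0
        # positive only for a high certainty in proximity of clearly lower ones
        if best_score >= floor and best_score - runner_up >= margin:
            return best
        window *= 2   # insufficiently distinct: extend the search domain
    return None
```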
To further reduce the potential margin of error in the fingerprint matching, for a more robust approach, the fingerprint attached to a piece of baggage needs to be updated. Slight changes in the fingerprint metrics will be updated through averaging of the metrics to decrease the statistical variation. In the case of significant changes in metrics (e.g. due to a moved bag handle, etc.) the fingerprint will be replaced with the fingerprint of the current perception of the bag.

As a basis for adaptation of the ICC profile the colour information (colour histograms) of a positive match is used as a means of relative quality measurement of the perceived colour in the capturing device.
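The fingerprint update rule described above may be sketched, for illustration only, as an averaging step with outright replacement on significant change; alpha and the replacement threshold are assumed values:

```python
import numpy as np

def update_fingerprint(stored: dict, observed: dict,
                       alpha=0.3, replace_thresh=0.5):
    """Illustrative sketch of the fingerprint update rule."""
    drift = np.abs(stored["hue"] - observed["hue"]).sum()
    if drift > replace_thresh:
        # significant change (e.g. a moved bag handle): replace outright
        return dict(observed)
    # slight change: average the metrics to decrease statistical variation
    merged = dict(stored)
    for key in ("hue", "sat"):
        merged[key] = (1 - alpha) * stored[key] + alpha * observed[key]
    return merged
```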
Aspects of the present invention have been described by way of example only and it should be appreciated that modifications and additions may be made thereto without departing from the scope thereof.

Claims (50)

WHAT WE CLAIM IS:
1. A method of tracking objects within an object transportation system, the method including the steps of:

a) providing an object identification code for an object;

b) capturing visual identification information for the object at a first location within the system;

c) relating the visual identification information captured at the first location with the object identification code;

d) capturing visual identification information at a second location within the system;

e) comparing the visual identification information captured at the second location to the visual identification information captured at the first location; and

f) relating the visual identification information captured at the second location to the object identification code if the visual identification information captured at the first and second locations meet predefined comparison criteria.
2. A method as claimed in claim 1 wherein the method also includes the step of relating the object identification code to an object sorting code.
3. A method as claimed in any one of claims 1 to 2 wherein the first and second locations correspond to points along an object conveyor.
4. A method as claimed in claim 3 wherein the object conveyor is a baggage conveyor.
5. A method as claimed in any one of claims 1 to 4 wherein the method also includes the step of capturing a weight measured for the object at the first location within the system.
6. A method as claimed in claim 5 wherein the method also includes the step of relating said captured weight measurement to the visual identification information.
7. A method as claimed in claim 5 or claim 6 wherein the method includes the step of capturing a weight measurement of an object at a second location in the system.
8. A method as claimed in claim 7 wherein the method also includes the step of comparing the weight measurement captured at the second location to the visual identification information captured at the first location.
9. A method as claimed in claim 8 wherein the method also includes the step of relating the weight measurement captured at the second location to the object identification code if the weight measurement captured at the first and second locations meet predefined comparison criteria.
10. A method as claimed in any one of claims 2 to 9 wherein the method also includes the step of relating the visual identification information to the object sorting code if the visual identification criteria captured at the first and second locations meet said predefined comparison criteria.
11. A method as claimed in any one of claims 1 to 10 wherein the method includes the step of comparing visual identification information captured at the second location with visual identification information captured for a plurality of objects at the first location.
12. A method as claimed in any one of claims 1 to 11 wherein, the method includes capturing a visual image with a digital camera.
13. A method as claimed in any one of claims 1 to 12 wherein the visual identification information is provided by a machine vision algorithm.
14. A method as claimed in claim 13 wherein the method includes conducting the machine vision algorithm on a processor located within a camera housing.
15. A method as claimed in any one of claims 1 to 14 wherein the visual identification information includes colour information for the object.
16. A method as claimed in any one of claims 1 to 15 wherein, the visual identification information includes texture information of the object's surface.
17. A method as claimed in any one of claims 1 to 16 wherein the visual identification information includes size and shape information of the object.
18. A method as claimed in claim 17 wherein the size and shape information is adjusted for given orientations and positions of the object.
19. A method as claimed in any one of claims 1 to 18 wherein the visual identification information includes a record of distinguishing features of the object.
20. A method as claimed in any one of claims 1 to 19 wherein the visual identification information may comprise less than two kilobytes of data.
21. A method as claimed in any one of claims 1 to 4 and 6 to 20 wherein the method involves relating a weight measurement of the object to the object identifier code.
22. A method as claimed in claim 21 wherein the method involves capturing said weight measurement at the first location in the system.
23. A method as claimed in any one of claims 1 to 22 wherein the method includes capturing at the first location in the system a code provided on a label which is adapted to be physically associated with the object.
24. A method as claimed in any one of claims 1 to 23 wherein the method includes printing a code onto a label at the first location in the system.
25. A method as claimed in claim 24 wherein the label includes a barcode.
26. A method as claimed in any one of claims 1 to 25 wherein the method also includes the step of generating an object information packet by combining at least the visual identification information with the object identification code.
27. A method as claimed in claim 26 wherein the object information packet includes a weight measurement.
28. A method as claimed in claim 26 or claim 27 wherein the method includes generating a first object information packet with the visual identification information captured at a first location within the system.
29. A method as claimed in claim 28 wherein the method includes generating a second information packet with the visual identification information captured at the second location within the system.
30. A method as claimed in any one of claims 26 to 29 wherein the method includes the step of storing at least one object information packet at a central storage facility.
31. A method as claimed in claim 30 wherein the central storage facility is a server.
32. A method as claimed in any one of claims 29 to 31 wherein the method includes the step of verifying the object information packets captured at the first and the second locations within the system.
33. A method as claimed in claim 32 wherein the step of verifying the object information packets includes comparison of the visual identification information of at least two object information packets.
34. A method as claimed in any one of claims 32 to 33 wherein the verifying of the object information packets includes the step of updating visual identification information captured at the first location within the system with visual identification information captured at the second location within the system, if said visual identification information meet said predefined comparison criteria.
35. A method as claimed in any one of claims 1 to 34 wherein the visual identification information may include:

colour histograms;

size and shape determinations;

pattern structures;

key features extracted through localisation of high gradients in orthogonal directions in a captured image within a given colour space using the derivatives of the image in full dimensions; or

any combination of the above.
36. A method as claimed in claim 35 wherein the colour histograms are hue histograms or saturation/chroma histograms.
37. A method as claimed in claim 35 wherein the size and shape determination is a boundary box.
38. A method as claimed in claim 35 wherein the pattern structures are extremes of solid colour or fine detailed regular patterns determined via a frequency distribution obtained from a Fourier transform.
39. A method as claimed in any one of claims 32 to 38 wherein verifying the object information packets includes the step of comparing the weight measurements stored in said two or more object information packets.
40. A method as claimed in any one of claims 1 to 39 wherein the method includes capturing visual identification information for a plurality of objects at a first location in the system.
41. A method as claimed in any one of claims 1 to 11 and 13 to 40 wherein the step of capturing visual identification information includes the step of capturing a visual image.
42. A method as claimed in any one of claims 1 to 41 wherein the step of capturing visual identification information also includes the step of processing said visual identification information.
43. A method as claimed in claim 42 wherein the processing of visual identification information is defined as a set of embedded processor executable instructions.
44. A method as claimed in claim 43 wherein the processor executable instructions are adapted to be executable on a computer network.
45. A method as claimed in any one of claims 43 to 44 wherein the processor executable instructions include instructions executable by a digital signal processing chip.
46. A method as claimed in any one of claims 43 to 45 wherein the processor executable instructions include instructions executable on a microprocessor.
47. A set of executable instructions stored on a computer readable medium adapted to carry out the method as claimed in claims 1 to 46.
48. A baggage tracking apparatus adapted to carry out a method of tracking objects within an object transportation system, the baggage tracking apparatus comprising:

• a conveyor to convey the object;

• a first camera to capture visual identification information for the object at a first location of the conveyor;

• a second camera to capture visual identification information for the object at a second location of the conveyor;

• a scanner to provide an object identification code for the object; and

• a computer server adapted to carry out the steps of:

o relating the visual identification information captured at the first location with the object identification code;

o comparing the visual identification information captured at the second location to the visual identification information captured at the first location; and

o relating the visual identification information captured at the second location to the object identification code if the visual identification information captured at the first and second locations meet predefined comparison criteria.
49. A method substantially as herein described and illustrated with reference to any one of Figures 2 to 10.
50. A baggage tracking apparatus substantially as herein described and illustrated with reference to any one of Figures 2 to 10.

Baggage Sortation Management Limited by its Attorneys James & Wells Intellectual Property
NZ55625307A 2007-06-29 2007-06-29 Object tracking method and apparatus NZ556253A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NZ55625307A NZ556253A (en) 2007-06-29 2007-06-29 Object tracking method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NZ55625307A NZ556253A (en) 2007-06-29 2007-06-29 Object tracking method and apparatus

Publications (1)

Publication Number Publication Date
NZ556253A true NZ556253A (en) 2010-01-29

Family

ID=41717474

Family Applications (1)

Application Number Title Priority Date Filing Date
NZ55625307A NZ556253A (en) 2007-06-29 2007-06-29 Object tracking method and apparatus

Country Status (1)

Country Link
NZ (1) NZ556253A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11321548B2 2017-05-18 2022-05-03 Ssi Schäfer Automation Gmbh (At) Device and method for controlling a material flow at a material flow node point
EP3625740B1 (en) * 2017-05-18 2023-08-09 SSI Schäfer Automation GmbH (AT) System and method for controlling a material flow at an intersection

Similar Documents

Publication Publication Date Title
US11741205B2 (en) Authentication-based tracking
US11049278B2 (en) System and method for visual identification, and system and method for classifying and sorting
US10062008B2 (en) Image based object classification
US20210104008A1 (en) Event-driven authentication of physical objects
CA2231450C (en) System and method for reading package information
US20210319582A1 (en) Method(s) and System(s) for Vehicular Cargo Management
EP3270342A1 (en) Database records and processes to identify and track physical objects during transportation
US9836635B2 (en) Systems and methods for tracking optical codes
US20230186509A1 (en) Article identification and tracking
EP3786836A1 (en) Article identification and tracking
WO2018103486A1 (en) Label for positioning item subjected to security inspection and method for positioning item subjected to security inspection
US11810064B2 (en) Method(s) and system(s) for vehicular cargo management
CN116363691A (en) Security information binding method and device, electronic equipment and readable storage medium
NZ556253A (en) Object tracking method and apparatus
Li et al. Computer vision based conveyor belt congestion recognition in logistics industrial parks
CN112069841B (en) X-ray contraband parcel tracking method and device
CN112149475B (en) Luggage case verification method, device, system and storage medium
US10587821B2 (en) High speed image registration system and methods of use
JP2020087172A (en) Image processing program, image processing method, and image processing device
Dering et al. A computer vision approach for automatically mining and classifying end of life products and components
US11734831B2 (en) Method for supporting X-RAY image reading using image transform model and system performing the same
Wang et al. Product quality inspection combining with structure light system, data mining and RFID technology
Ghosh et al. Automated framework for unsupervised counterfeit integrated circuit detection by physical inspection
Blateyron Automatic detection of manipulated packages by image comparison
Nakthanom et al. A 2d barcode inspection using template matching

Legal Events

Date Code Title Description
ASS Change of ownership

Owner name: BAGGAGE SORTATION MANAGEMENT (AUSTRALIA) PTY L, AU

Free format text: OLD OWNER(S): BAGGAGE SORTATION MANAGEMENT LIMITED

PSEA Patent sealed
RENW Renewal (renewal fees accepted)
LAPS Patent lapsed