WO2011087454A1 - Apparatus and methods for manipulating and acquiring images of a pallet load - Google Patents
Apparatus and methods for manipulating and acquiring images of a pallet load
- Publication number
- WO2011087454A1 (application PCT/SG2010/000011)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- colour
- scale
- camera
- pallet load
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1439—Methods for optical code recognition including a method step for retrieval of the optical code
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/146—Methods for optical code recognition the method including quality enhancement steps
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
Definitions
- the apparatus can operate to perform edge detection with the monochrome image, and assign the detected edge to a location, say an approximate location, in the colour image.
- polarising filters may still be necessary; however, since the colour image is used to detect colour rather than to read text or barcodes, light from a normal LED source with a polarising filter is generally sufficient. It will be appreciated that the invention has been described by way of example only. Various modifications may be made to the techniques described herein without departing from the spirit and scope of the appended claims.
- the disclosed techniques comprise techniques which may be provided in a stand-alone manner, or in combination with one another. Therefore, features described with respect to one technique may also be used and presented in combination with another technique.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
An apparatus for manipulating an image of a pallet load operates to map colour information from a colour image of the pallet load to a monochrome image of the pallet load, the colour image having a first scale and the monochrome image having a second scale, the first scale being different from the second scale. An apparatus for acquiring images of a pallet load comprises a linescan camera for acquiring a monochrome image of the pallet load and a colour camera for acquiring a colour image of the pallet load.
Description
APPARATUS AND METHODS FOR MANIPULATING AND ACQUIRING IMAGES OF A
PALLET LOAD
The invention relates to an apparatus and a method for manipulating images of a pallet load. The invention also extends to a machine- (computer-) readable medium having stored thereon machine-readable instructions for executing, in a machine, the aforementioned method. The invention also relates to an apparatus and a method for acquiring images of a pallet load. The invention has particular, but not exclusive, application in implementing "machine vision" techniques for capturing images of a pallet of goods cartons or goods packages.
In logistics, inbound and outbound cargo control is typically an error-prone, expensive and time-consuming process requiring a substantial amount of work maintaining consistent data in WMS (Warehouse Management Systems) and ERPs (Enterprise Resource Planning Systems). The results of these cargo control processes are often hard to evaluate and contain far too little data to be of any great assistance to the warehouse management process.
Machine vision is commonly implemented in the modern logistics industry. However, existing applications generally address highly-specific tasks with narrow fields of application such as:
• Detecting barcodes on cartons moving through a conveyor
• Reading labels of a pre-determined size from pallets
• Creating images of a pallet for post-shipment audit and survey report
There are two main methods implemented to address these tasks:
1. Installing linescan cameras near a conveyor belt, scanning the barcodes (and in some cases related text/graphical information) from the cartons moving beside the camera.
2. Installing matrix cameras with a relatively large matrix (up to 16 Mpixel in existing top-level mainstream models) for capturing photographs of goods while cartons move past the camera lens. However, these approaches suffer substantial limitations.
Approaches '1' and '2' mostly apply to conveyor-oriented logistics facilities like sorting hubs, or to the production lines of the factories themselves. These two approaches are not applicable in some methods of cargo control. For instance, they cannot be used with palletised cargo because of technical limitations: for approach 1, the movement speed is not accurately controlled and is subject to fluctuations, thereby introducing uncertainty into the accuracy of the data captured. For approach 2, current techniques do not allow for image acquisition of sufficient resolution for the recognition of features on the cargo, such as small barcodes. When matrix cameras are proposed, the cost quickly becomes too high for this to be a practical solution.
Techniques to mitigate these issues have been proposed by the Applicants in commonly-owned International Patent Application No. PCT/SG2009/000157, the contents of which are hereby incorporated by reference in their entirety. Further techniques to mitigate these issues are set forth herein.
Accordingly, the invention is defined in the independent claims. Some optional features of the invention are defined in the dependent claims. The techniques disclosed herein develop and build upon those disclosed in
PCT/SG2009/000157. For instance, the ability to acquire a colour image enhances the previous techniques significantly. A black & white/greyscale (i.e. monochrome) image obtained from a linescan camera clearly does not include colour information, which can be very useful in analysing the goods packages. For example, with monochrome images, it is possible a tracking system will not be able to discern
between packing strips, packing tape, etc. and gaps between the packages disposed upon a pallet. Thus, information relating to one goods package may mistakenly be ascribed to two (or more) "phantom" packages, one of which is mistakenly analysed and the other(s) of which is/are nonexistent. Such errors might lead to the tracking system recognising excessive goods in the shipment and generating an error.
Another error may arise when packages are dark-coloured. In a
greyscale/monochrome image of two adjacent dark-coloured packages, any gap between the packages is undetectable, and thus instead of two packages being detected, only one large package is detected, again causing an error.
Another problem solved by the techniques disclosed herein is that, when packages are wrapped in clear plastic film or the like, the film causes unwanted reflections from the illuminating light sources required for the camera to "see" the packages and acquire an image. In terms of using a light source to mitigate problems caused by these reflections, for example, to obviate their occurrence, the only realistic sort of illumination is laser light. Any LED solution capable of generating enough light for practical purposes would be prohibitively expensive, bulky and expensive to power due to high energy costs, owing to the extremely high number of LEDs required. But lasers emit at only one major wavelength, so they are not suitable when it is desired to capture colour information from the goods packages.
Use of polarising filters does not solve the problem either, since these tend to exhibit very high losses, thus requiring high-intensity (and expensive) illumination. For instance, a double-digit percentage of illumination intensity can be lost in a single polarising filter. If the two polarising filters which are necessary (one for the light source and one for the camera) are both used, up to 60-80% of illumination intensity can be lost. So, if a camera requires, say, 10-20k lux illumination, 100k lux is required from the lighting source. However, lighting systems readily available at this time are limited to about 25-30k lux, and these are already very expensive and bulky items to purchase.
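As a quick check of those figures (an illustrative calculation only; the 60-80% combined loss and the 10-20k lux camera requirement are the values quoted above):

```python
# Illustrative check of the polarising-filter loss figures quoted above.
required_at_sensor_lux = 20_000      # upper end of the 10-20k lux camera requirement
combined_filter_loss = 0.80          # worst case of the 60-80% loss through two filters

required_from_source_lux = required_at_sensor_lux / (1 - combined_filter_loss)
print(required_from_source_lux)      # 100000.0 -> the ~100k lux figure in the text
```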
Thus, the techniques disclosed herein allow for an automated asset tracking system to identify cartons more reliably using colour information together with
monochrome information. The packages in question can be any type of goods package, including goods cartons made of cardboard (or similar) or plastic, metal containers, wooden boxes/crates, paper/textile bags, packages of or wrapped in plastic film - whether clear (transparent) plastic film, or opaque/partially opaque film - or trays for placing goods in or on, with or without wrapping.
High-resolution colour cameras are also extremely expensive. Thus, in an apparatus implementing the disclosed techniques, the colour camera can be of lower resolution than the monochrome/linescan camera, and the subsequent mapping of the colour information from the colour image to the monochrome image allows higher-resolution colour information to be created at significantly lower cost. Such colour information is therefore of sufficiently high resolution to read even small barcodes and text on cartons and their labels. As noted above, problems caused by reflections from stretch-film and shrink-wrap film, or glossy substrates, are mitigated, allowing the automated asset tracking system to read information therethrough.
The invention will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1 is a perspective diagram illustrating an apparatus for acquiring images of a pallet load;
Figure 2 is an elevational diagram illustrating fields of view and fields of illumination of the apparatus of Figure 1;
Figure 3 is an architecture diagram for an apparatus for manipulating an image of a pallet load;
Figure 4 is a process flow diagram illustrating a process for manipulating an image of a pallet load in the apparatus of Figure 3;
Figure 5 is a series of images showing elevational views of a stack of goods cartons for edge detection analysis with the apparatus of Figure 3; and
Figure 6 is a series of images showing goods packages represented in the images of Figure 5 after initial edge detection techniques have been applied to the images of Figure 5.
Referring first to Figure 1, an apparatus for acquiring images of a pallet load will now be described. The apparatus 100 comprises a linescan camera 102 for acquiring a monochrome image of pallet load 112 and a colour camera 103 for acquiring a colour image of the pallet load 112. As will be understood herein, a monochrome image is an image in one colour or shades of colour, and includes greyscale or "black & white" images. Monochromatic light is light of a single wavelength, though in practice it can refer to light of a wavelength range, usually a narrow range. A monochromatic image is one whose range of colours consists of shades of a single colour or hue. In the example of Figure 1, linescan camera 102 and colour camera 103 are arranged for movement in a movement plane 104 to acquire the monochrome image and the colour image of pallet load 112 disposed upon a pallet 110 which has been placed in position with, say, assistance from a fork lift truck (not shown). In the example of Figure 1, pallet load 112 comprises a plurality of goods cartons 112a stacked upon pallet 110 in an orderly manner. In this case, the goods cartons 112a are stacked in an orderly 3 x 3 x 3 matrix. Linescan camera 102 and colour camera 103 are arranged to move vertically in plane 104. In the example, linescan camera 102 and colour camera 103 are mounted for linear movement on supports 106a, 106b which may comprise linear actuators, having suitable drivers as will be known to the skilled person, and these are arranged upon a support base 108. Linescan camera 102 and colour camera 103 are moved, in the example of Figure 1, from top to bottom. The field of view 114 (optical plane) of linescan camera 102 is incident on pallet load 112 at line 114a and moves in the direction 116. Additionally, the field of view 115 (optical plane) of colour camera 103 is incident on pallet load 112 at line 115a and also moves in the direction 116. So, while travelling, linescan camera 102 acquires a
monochrome image of the pallet load one line at a time. The result is a complete high-resolution (for example, approximately 160 Mpixel) image. A suitable linescan camera is one which is set to grab about 6-10 lines per mm with a field of view 114 of a line being one pixel thick and 8000 pixels wide, but other, for example higher, resolutions are contemplated.
For the sake of convenience, colour camera 103 is also shown as having a single-line field of view 115. However, in practice, the colour camera pixel array will likely be a two-dimensional array, comprising, in one dimension, three pixels, for the associated R, G and B channels (8-bit per channel) of the colour image. So, while travelling, such a colour camera 103 acquires an image of the pallet load one line at a time, with three pixels for the R, G and B channels acquiring information for each line. The result is a complete image. A suitable colour camera is one which is set to grab about 6-10 lines per mm with a field of view 115 of a line being one pixel thick (but with three pixels acquiring information for the one-line thick field of view) and 2000 pixels wide, but other resolutions, whether higher or lower, are contemplated.
Thus, linescan camera 102 has a row of pixels having a first number of pixels and the colour camera 103 has a row of pixels having a second number of pixels, the first number of pixels being higher than the second number of pixels. In the example given above, the first number of pixels is an integer multiple of the second number of pixels. The significance of these features will become apparent from a discussion below of Figures 3 and 4. It will be appreciated that in Figure 1, the linescan camera is arranged for linear movement in the vertical plane 104, but other arrangements - e.g. other directions of movement - are not excluded. For example, linescan camera could be arranged for (e.g. linear) movement in the horizontal plane as an alternative or in addition to movement in the vertical plane.
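To make the line-by-line acquisition and the resolution arithmetic above concrete, the following sketch assembles full images from successive line reads. The read functions and the scan-height figure are hypothetical stand-ins (the real interface would be the frame grabber discussed below); the pixel widths of 8000 and 2000 and the 6-10 lines per mm figure are taken from the text.

```python
import numpy as np

# Stand-ins for the real camera/frame-grabber interface (hypothetical names);
# here they simply return blank rows so the sketch runs end to end.
def read_mono_line() -> np.ndarray:
    return np.zeros(8000, dtype=np.uint8)        # one greyscale row, 8000 px wide

def read_colour_line() -> np.ndarray:
    return np.zeros((2000, 3), dtype=np.uint8)   # one RGB row, 2000 px wide

LINES_PER_MM = 10          # upper end of the 6-10 lines/mm figure in the text
SCAN_HEIGHT_MM = 2000      # assumed ~2 m of vertical travel for a full pallet

n_lines = LINES_PER_MM * SCAN_HEIGHT_MM          # 20,000 scan lines
mono = np.stack([read_mono_line() for _ in range(n_lines)])      # (20000, 8000)
colour = np.stack([read_colour_line() for _ in range(n_lines)])  # (20000, 2000, 3)

print(mono.size)           # 160,000,000 pixels, i.e. the ~160 Mpixel figure above
```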
The apparatus 100 of Figure 1 also comprises a first light source 118 having a first field of illumination (not shown in Figure 1 for the sake of clarity) for illuminating the field of view 114 of the linescan camera 102 and a second light source 120 having a second field of illumination (not shown in Figure 1 for the sake of clarity) for illuminating a second field of view 115 of the colour camera 103. In the example of Figure 1, first light source 118 is a coherent light source such as a line laser, arranged to generate a laser line, but other types of light sources, including other types of coherent light sources, may also be used. When using a line laser as the light source 118, such a line laser 118 emits a laser line having an optical plane (not shown for the sake of clarity) and is set up so that the optical plane of the line laser 118 is aligned with an optical plane (e.g. field of view) 114 of the linescan camera 102; e.g.
therefore it is aligned with the CCD matrix of linescan camera 102. It is believed that the effective brightness of the line laser backlight can, typically, be more than 10 times the brightness of the ambient light. A 50mW line laser can be considered to have a brightness equivalent to that of a 500W halogen light bulb, or even better.
Similar considerations also apply to the colour camera 103. A suitable colour camera which might be used is a DALSA Spyder3 2K colour linescan camera, although any similar camera can be used instead. As the source of illumination, an LED source emitting natural (daylight-like) light may be used.
Preferably, either or both of the first and second light sources 118, 120 are powered by one or more DC power sources 122 (connections between the power source 122 and the light sources omitted for the sake of clarity). At first glance, this may not seem to be a significant advance. However, subtle changes to lighting schemes in typical linescan camera installations are not usually an issue. Linescan cameras are typically installed in very well-illuminated facilities and powered by AC power sources.
However, the present use of a linescan camera is more sensitive to subtle changes in the lighting scheme, and the inventors have found that use of AC power sources for the light source causes substantial flicker in the image acquired from a linescan
camera, an issue mitigated with the use of a DC power source. AC power supplies, clearly, provide sinusoidal current to the light source, whether LEDs, halogen bulbs or other types. At the peak of the sinusoid, the lamp emits light at maximum brightness, which then reduces until the sinusoid crosses the zero point. This causes "flicker" in the image: dark lines in the captured image corresponding to areas where the linescan camera captured the image with minimum illumination intensity. DC power is relatively ripple-free (typically around 5% fluctuation), which can be considered insignificant due to LED inertia, giving, effectively, even or relatively constant illumination.
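A minimal simulation of the effect just described, assuming a 50 Hz mains supply and a hypothetical line-scan rate (both figures are illustrative, not from the text); the 5% DC ripple figure is the one quoted above.

```python
import numpy as np

# Illustrative only: per-scan-line brightness when the lamp is driven by AC vs DC.
MAINS_HZ = 50            # assumed mains frequency
LINE_RATE_HZ = 5000      # hypothetical line-scan rate (lines per second)

t = np.arange(10_000) / LINE_RATE_HZ                       # capture time of each line
ac_brightness = np.abs(np.sin(2 * np.pi * MAINS_HZ * t))   # rectified sinusoidal output
dc_brightness = 1.0 - 0.05 * np.random.rand(t.size)        # ~5% ripple, as quoted above

# Lines captured near the zero crossings of the AC waveform come out almost dark,
# producing the periodic dark stripes ("flicker") in the assembled image.
print(ac_brightness.min(), dc_brightness.min())            # ~0.0 vs ~0.95
```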
Although Figure 1 shows colour camera 103 being placed above monochrome camera 102 in the same vertical axis, this is not essential, and other arrangements, such as the colour camera 103 being below monochrome camera 102 are also contemplated.
Monochrome and colour cameras 102, 103 can even be installed at different distances from the pallet load, but subsequent processing of the images must take this into account. In the example of Figure 1, field of view 114 is displaced in the movement plane 104 from the second field of view 115; that is, there is a gap between the two lines 114a, 115a. This is not essential, and it is also possible that the two fields of view are coincident on pallet load 112. However, when the fields of view 114, 115 are displaced, this provides a further beneficial arrangement, as will now be described with respect to Figure 2.
In one implementation, as illustrated in Figure 2, the first light source 118 (not shown in Figure 2) has a first field of illumination 202 defined by the boundary lines 204a, 204b, on either side of the line 114a representing the line where the field of view 114 of monochrome camera 102 is incident upon the pallet load 112. It will be
appreciated that, practically speaking, the boundaries 204a, 204b of the first field of illumination 202 may not be so well-defined, and there may be a gradual drop in illumination intensity moving away from the line 114a. It will also be appreciated that the relative distances of the boundary lines 204a, 204b from the line 114a are representative only for the sake of illustration and that, relatively speaking, the field of illumination may be focussed more narrowly in the region of line 114a. Likewise, second light source 120 (not shown in Figure 2) has a second field of illumination 206 defined by the boundary lines 208a, 208b, on either side of the line 115a
representing the line where the field of view 115 of colour camera 103 is incident upon the pallet load 112. It will be appreciated that, practically speaking, the boundaries 208a, 208b of the second field of illumination 206 may not be so well- defined, and there may be a gradual drop in illumination intensity moving away from the line 115a. It will also be appreciated that the relative distances of the boundary lines 208a, 208b from the line 115a are representative only for the sake of illustration and that, relatively speaking, the field of illumination may be focussed more narrowly in the region of line 115a.
As illustrated in the example of Figure 2, the first field of view 114 and the second field of view 115 are displaced in the movement plane; that is, there is a gap between the two lines 114a, 115a. In one implementation, a suitable gap has been found to be about 10cm. Further in this example, the first field of illumination 202 is arranged for illumination outside of the second field of view 115, 115a. This means that the first light source 118 is arranged not to illuminate the area of the pallet load 112 being scanned by the colour camera 103; in this instance, the light source 118 does not illuminate in the region of line 115a. This helps to avoid interference and unwanted reflection for the colour camera 103. It will be appreciated that it is possible some stray light from the first light source 118 might illuminate the second field of view 115, but practically speaking, it is expected to be so minimal that it can be ignored. However, when using a first light source 118 with a well-defined beam profile, such as a line laser, there should be no stray illumination from the first light source.
As also illustrated in the example of Figure 2, the second field of illumination 206 is arranged for illumination outside of the first field of view 114, 114a. This means that the second light source 120 is arranged not to illuminate the area of the pallet load 112 being scanned by the monochrome camera 102; in this instance, the second light source 120 does not illuminate in the region of line 114a. It will be appreciated that it is possible some stray light from the second light source 120 might illuminate the first field of view, but practically speaking, it is expected to be so minimal that it can be ignored. However, when using a second light source with a relatively well-defined beam profile, such as a LED cluster or array generating white light, there should be no stray illumination from the second light source. If light from an LED cluster coincides with the field of view 114 of the monochrome camera, it is a relatively simple operation to include a polarising filter on the monochrome camera. Loss of intensity for the monochrome camera is less of a concern than for a colour camera. In any event, the field of view 114 will be relatively far from the most intense illumination area in the field of view of the LED light source, so the influence may be negligible.
Figure 3 illustrates an example of an apparatus for manipulating an image of a pallet load. The apparatus of Figure 3 may be utilised to manipulate an image acquired by the apparatus of Figure 1. The apparatus 300 comprises, principally, a processor 302 and a memory 304 for storing one or more routines 306 which, when executed under control of processor 302, control the apparatus 300 to map colour information from a colour image 314 of the pallet load 112 to a monochrome image 316 of the pallet load 112, the colour image 314 having a first scale 322 and the monochrome image 316 having a second scale 324, the first scale 322 being different from the second scale 324. A process for manipulating an image with the apparatus 300 will be described in greater detail with reference to Figure 4.
In the example of Figure 3, apparatus 300 also comprises a frame grabber 308, storage 310, I/O module 312 and edge detection module 313. Frame grabber 308 is
used to process (e.g. gather) image data acquired from, say, monochrome camera 102 and/or colour camera 103 of Figure 1. The image data is received via I/O 312 and the data is stored in storage 310 in a format such as a pure uncompressed raw bitmap image which can be processed using various image processing and data manipulation techniques, such as those described in commonly-owned International Patent Application No. PCT/SG2009/000472. When used to provide images for use with the image processing and data manipulation techniques disclosed in
PCT/SG2009/000472, all information - e.g. text, barcodes, logos, labels, shipping marks, etc. - in the image for the pallet load (which may comprise a partial image of the pallet load) is used, which provides a significant advantage over prior art "approach 1" and prior art "approach 2" discussed above.
It will be appreciated that Figure 3 is a schematic block diagram illustrating an architecture for an apparatus for manipulating an image of a pallet load. It is not essential that the components of the apparatus be included within a single device, as might be inferred from Figure 3; the components illustrated may be provided in one or more devices. For example, the frame grabber 308 might be installed in a separate device and configured to transmit the image data to apparatus 300 for storage in storage memory 310. Indeed, storage 310 itself could be provided in a separate device.
A process 400 for manipulating an image of a pallet load in the apparatus 300 of Figure 3 will now be described with respect to Figure 4. The process 400 starts at step 402. The colour and monochrome images 314, 316 are retrieved by apparatus 300 for processing from storage 310 at step 404. In the example of Figure 4, one (or both) of the images 314, 316 are re-sized (or re-scaled), as the two images have different scales 322, 324 from one another. For example, the colour image 314 may have been acquired from a 2000 pixel-wide array (as discussed above) in camera 103, and the monochrome image 316 may have been acquired from an 8000 pixel- wide array in camera 102, also as discussed above. Thus, the "width" of the images,
corresponding to the width of the sensor arrays of the respective cameras 102, 103, are on different scales.
To re-size one of the images, apparatus 300 first selects one or more reference points at step 406 in one or both of the images, and the respective aspect ratios of the images 314, 316 are detected for the resizing. In the example of Figure 1, the cameras are installed one below the other and have similar optical and focal distances, and the picture height is equal in both images. So, the horizontal edges of the picture serve as reference points (in the monochrome image there are 8000 horizontal pixels, and in the colour image 2000 pixels). The vertical distance is calculated and deducted during camera calibration. Apparatus 300 resizes a first one of the colour image 314 and the monochrome image 316 at step 408 towards a scale 322, 324 of a second one of the colour image 314 and the monochrome image 316, for the apparatus 300 to map the colour information from the colour image 314 to the monochrome image 316. That is, one of the images has its scale either stretched (upscaled) or compressed
(downscaled) towards the scale of the other image. Note that it is not necessary for the two images to be on precisely the same scale. However such a re-scaling is performed in one implementation in which apparatus 300 resizes a first one of the colour image and the monochrome image from its original scale to a scale of a second one of the colour image and the monochrome image. So, one of the images is re-scaled to be on the same scale as the other image. If the colour image 314 is 2000 pixels wide, apparatus 300 creates a larger colour image of more than 2000 effective pixels wide, duplicating pixel information. So for a particular colour pixel, the information is duplicated at least once thereby to have two or more effective pixels of the same colour information. If the colour image 314 is to be resized from its original scale to the scale of the monochrome image, then for a 2000 pixel-wide colour image 314 and an 8000 pixel-wide monochrome image, information for each colour image pixel, say pixel 318a, is duplicated three times, thereby meaning apparatus 300 has information for four effective colour pixels 318a. Thus, apparatus 300 operates when the first scale is smaller than the second scale to upsize the
colour image by mapping colour information from a pixel of the colour image in its original scale to multiple pixels of the colour image in its upsized scale.
There are several ways to perform the re-sizing algorithm, but the inventors have found that a bi-cubic interpolation resizing algorithm provides acceptable results.
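A compact sketch of the two resizing options just described, using OpenCV and NumPy as assumed tooling (the patent does not prescribe a particular library): nearest-neighbour duplication reproduces the "duplicate each colour pixel" approach, while bicubic interpolation is the smoother alternative mentioned above. The image sizes follow the 2000/8000-pixel example; the placeholder arrays are only there so the sketch runs.

```python
import cv2
import numpy as np

# colour: (H, 2000, 3) image from the colour camera; mono: (H, 8000) greyscale image.
colour = np.zeros((100, 2000, 3), dtype=np.uint8)   # placeholder data for the sketch
mono = np.zeros((100, 8000), dtype=np.uint8)

target_w, target_h = mono.shape[1], mono.shape[0]

# Option 1: duplicate each colour pixel (here 4x horizontally) - simple replication.
colour_dup = np.repeat(colour, target_w // colour.shape[1], axis=1)

# Option 2: bicubic interpolation, found to give acceptable results per the text.
colour_cubic = cv2.resize(colour, (target_w, target_h), interpolation=cv2.INTER_CUBIC)
```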
After re-sizing, apparatus 300 maps colour information from the colour image to the monochrome image on a pixel-by-pixel basis at step 410. That is, and referring again to Figure 3, colour information from the four effective pixels 318a is mapped to a corresponding four pixels 320 in monochrome image 316. Note that the four pixels 320 in monochrome image 316 effectively correspond with the one pixel 318a in colour image 314 in its original scale, by virtue of the monochrome image being a higher-resolution image. Note that, even with use of a 2000 pixel-wide colour image and an 8000 pixel-wide monochrome image, exact re-sizing is not necessary. For instance, duplication of the colour image pixel information might result in a resized colour image of, say, 6000 effective pixels wide, and colour information from some or all of these 6000 effective pixels is then mapped to corresponding pixels in the monochrome image 316.
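One plausible way to realise this pixel-by-pixel mapping (a sketch only; the patent does not specify the exact colour-space operations) is a pan-sharpening-style merge: take the chrominance from the upscaled colour image and the luminance from the high-resolution monochrome image. The OpenCV calls and the BGR/LAB conventions are assumptions of the sketch.

```python
import cv2
import numpy as np

def map_colour_to_mono(colour_upscaled: np.ndarray, mono: np.ndarray) -> np.ndarray:
    """Sketch: merge low-resolution chrominance with high-resolution luminance.

    colour_upscaled: (H, W, 3) uint8 BGR image already resized to the monochrome scale.
    mono:            (H, W) uint8 greyscale image from the linescan camera.
    """
    lab = cv2.cvtColor(colour_upscaled, cv2.COLOR_BGR2LAB)
    lab[:, :, 0] = mono                     # replace L (lightness) with the hi-res detail
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
```

In effect, each monochrome pixel 320 receives the colour of the single colour pixel 318a that covers it, while retaining its own high-resolution intensity.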
So, such a process allows for colour information from pixels in the colour image to be "assigned" to the monochrome image. In this way, the monochrome image 316 is manipulated to provide a cost-effective high-resolution image containing colour information for the subsequent processing of the image, without the cost associated with acquisition of high-resolution colour images. This allows for effective edge detection of goods packages 112a in the pallet load 112 of Figure 1, mitigating the effects of spurious results caused by packing tape and strip, and dark-coloured packages where it is difficult to detect edges of the packages in a monochrome image. Edge detection is performed by edge detection module 313 of apparatus 300,
as described in further detail below. Additionally, colour information of enhanced effective resolution helps to provide reliable results with the techniques disclosed in PCT/SG2009/000472. Alternatively, rather than upscaling colour image 314, monochrome image 316 may be downscaled towards the scale of the colour image 314. In that case, apparatus 300 operates to downsize the monochrome image 316 by averaging intensity values for a plurality of pixels 320 of the monochrome image 316 in its original scale 324 to one pixel for the monochrome image in its downsized scale, and to map colour information from a pixel of the colour image to the one pixel of the monochrome image in its downsized scale. This provides a resultant colour image of lower resolution, but may be preferable in an apparatus 300 with limited processing power. Although the techniques discussed above with respect to Figure 4 use re-sizing techniques, it will be appreciated that re-sizing is not absolutely necessary. For instance, and following the example where the colour image 314 is a 2000 pixel-wide image and the monochrome image 316 is an 8000 pixel-wide image, apparatus 300 can be operated to map colour information from one colour pixel 318 to multiple monochrome pixels 320.
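A brief sketch of the downscaling alternative just described: the 8000-pixel-wide monochrome image is reduced to the 2000-pixel colour scale by averaging horizontal blocks of pixels. The 4:1 factor follows the example widths above; NumPy and the placeholder data are assumptions of the sketch.

```python
import numpy as np

def downscale_mono(mono: np.ndarray, factor: int = 4) -> np.ndarray:
    """Average each horizontal block of `factor` monochrome pixels into one pixel."""
    h, w = mono.shape
    return mono.reshape(h, w // factor, factor).mean(axis=2).astype(np.uint8)

# Each pixel of the (unscaled) colour image then supplies colour for the single
# corresponding pixel of the downsized monochrome image.
mono = np.zeros((100, 8000), dtype=np.uint8)     # placeholder data for the sketch
mono_small = downscale_mono(mono)                # shape (100, 2000)
```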
Referring again to Figures 1 and 2, colour camera 103 and monochrome camera 102 are offset from one another, resulting in there being an offset between the fields of view 114, 115 of the monochrome and colour cameras 102, 103 respectively. This means the respective cameras are scanning different portions of the pallet load 112 at different times, resulting in an offset of the respective colour and monochrome images 314, 316. Apparatus 300 is configured to account for this offset by, for example, shifting one or other of the images to be in alignment with the other of the images, thereby to allow correct mapping of colour information from pixels 318 of colour image 314 to pixels 320 of monochrome image 316.
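Because both cameras scan the same columns at a fixed separation, the offset correction can be as simple as a row shift along the scan direction. The sketch below derives the shift from the ~10 cm field-of-view gap and the 6-10 lines/mm figure mentioned earlier; in practice the exact offset would come from camera calibration, and the shift function is an illustrative implementation, not the patent's specified one.

```python
import numpy as np

GAP_MM = 100           # ~10 cm separation between the two fields of view
LINES_PER_MM = 10      # scan density, per the 6-10 lines/mm figure
offset_rows = GAP_MM * LINES_PER_MM      # 1000 scan lines between the two images

def align(img: np.ndarray, offset_rows: int) -> np.ndarray:
    """Shift an image down by `offset_rows`, padding the top with zeros, so each
    row depicts the same slice of the pallet load in both images (rows shifted
    past the bottom are simply discarded)."""
    shifted = np.zeros_like(img)
    shifted[offset_rows:] = img[:img.shape[0] - offset_rows]
    return shifted
```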
Edge detection module 313 of apparatus 300 operates to provide a more sophisticated package recognition technique. Apparatus 300 performs edge detection on one (or both) of the colour image 314 and the monochrome image 316, thereby to detect an edge of a goods package within the pallet load. Apparatus 300 may also perform edge detection on the monochrome image 316 having colour information mapped thereto. In the example of Figure 3, the edge detection is performed by module 313, which applies edge detection techniques to one or both of the greyscale and colour images acquired by apparatus 100. For instance, the edges of the package(s) will typically present a lower chromatic value in the monochrome/greyscale images, and colour thresholding is applied to the colour images in order to detect the goods carton edges. A benefit of using both colour and monochrome/greyscale images is that simple edge detection with greyscale images alone might detect, for example, packing strips 532 and packing tape 534, 538 on the goods packages of Figure 5 as box edges, leading to spurious results in which apparatus 300 is fooled into thinking there are more goods packages than there actually are.
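Purely as an illustration of this idea, the sketch below combines greyscale edge detection with a simple colour-change test so that edges not supported by a change in colour (such as tape on a uniformly coloured box) can be suppressed; the chosen operators and threshold value are assumptions made for the example and are not prescribed by the described apparatus.

```python
import cv2
import numpy as np

def detect_package_edges(mono, colour, colour_threshold=30.0):
    """Greyscale edge detection combined with a simple colour-change test.

    mono:   (H, W) uint8 greyscale image
    colour: (H, W, 3) uint8 colour information mapped to the same scale
    """
    # Greyscale edges; these will also fire on packing tape and strips.
    edges = cv2.Canny(mono, 50, 150)
    # Per-pixel colour change: sum of absolute channel differences to the
    # neighbouring pixel above and to the left.
    c = colour.astype(np.float32)
    gy = np.abs(np.diff(c, axis=0, prepend=c[:1])).sum(axis=2)
    gx = np.abs(np.diff(c, axis=1, prepend=c[:, :1])).sum(axis=2)
    colour_change = gy + gx
    # Keep only greyscale edges that coincide with a sufficient colour change.
    keep = (edges > 0) & (colour_change > colour_threshold)
    return (keep * 255).astype(np.uint8)
```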
Referring back to Figure 5, a series 521 of images comprising four orthogonal views 520a, 520b, 520c, 520d of sides of a stack 520 of goods packages disposed upon a pallet 530 is illustrated. In the present example, both monochrome (e.g. greyscale) and colour images are acquired for the sides of the goods package(s). As a further option, it is possible to obtain a plan view of the goods package(s) from an overhead camera (not shown) to obtain additional information concerning the layout of goods package(s).
As shown in view 520a, a first side of the stack 520 is shown. Visible in the stack is a plurality of goods packages disposed in a non-uniform manner on the pallet 530, where the individual types of packages are denoted with a # mark - that is, #1 denotes a first type of package and so on. Visible are a package #1, two packages #2, two packages #3 and a package #4. Visible across packages #2 are packing strips 532, and across package #1 a broader line 534 which comprises packing tape.
As shown in view 520b, a second side of the stack 520 is shown. Visible in the stack in this view are three #1 packages, two #2 packages, one #3 package and two #4 packages. A broad line 538 is visible across the packages #2, which comprises packing tape.
Similar observations may be made about objects which are visible in views 520c and 520d.
Because of items such as the packing strips 532 and packing tape 534, 538, relatively straightforward edge-detection techniques are likely to be insufficient to recognise the goods packages correctly. Furthermore, if the goods packages are of a dark-coloured material, then edges may be indistinguishable in the series 521 of images.
For instance, simple black and white/greyscale edge detection techniques applied to an image having packing strips and/or tape would lead to the result shown in Figure 6a, where packages #4 and #3 are correctly detected, but packages #2 are incorrectly recognised as being made up of discrete packages #2c, #2d, #2e and #2f because of packing strips 532. Additionally, package #1 is incorrectly recognised as two separate packages #1h and #1i (622b1 and 622b2) because of the presence of packing tape 534.
After apparatus 300 applies colour thresholding, the results are as shown in Figure 6b. Some correction is made in that each package #2 is recognised as being not two packages but one package. The same can be said for package #1, now marked as a single package 636, where the earlier discrepancy caused by the presence of packing tape 534 has been removed.
When polarised light is not used with colour cameras, there may be multiple reflections from the film packing the goods packages. Therefore, the apparatus can operate to perform edge detection with the monochrome image, and assign the detected edge to a location, say an approximate location, in the colour image.
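A short sketch of this edge-assignment step follows, assuming the monochrome image is an integer factor larger than the colour image (a factor of 4 in the 2000 pixel versus 8000 pixel example); the function and its arguments are illustrative assumptions only.

```python
def assign_edge_to_colour(edge_row, edge_col, scale=4):
    """Map a detected edge pixel in the high-resolution monochrome image to an
    approximate location in the lower-resolution colour image by dividing the
    coordinates by the resolution ratio."""
    return edge_row // scale, edge_col // scale

# Example: an edge pixel at row 4000, column 6400 of the 8000 pixel-wide
# monochrome image corresponds approximately to (1000, 1600) in the
# 2000 pixel-wide colour image.
```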
However, in some cases polarising filters may still be necessary; because the colour camera is used to detect colour rather than to read text or barcodes, light from a normal LED source with a polarising filter is generally sufficient. It will be appreciated that the invention has been described by way of example only. Various modifications may be made to the techniques described herein without departing from the spirit and scope of the appended claims. The disclosed techniques may be provided in a stand-alone manner, or in combination with one another. Therefore, features described with respect to one technique may also be used and presented in combination with another technique.
Claims
1. Apparatus for acquiring images of a pallet load, the apparatus comprising: a linescan camera for acquiring a monochrome image of the pallet load; and a colour camera for acquiring a colour image of the pallet load.
2. Apparatus according to claim 1, wherein the linescan camera and the colour camera are arranged for movement in a movement plane to acquire the
monochrome image and the colour image respectively.
3. Apparatus according to claim 1 or claim 2, the apparatus comprising a first light source having a first field of illumination for illuminating a first field of view of the linescan camera and a second light source having a second field of illumination for illuminating a second field of view of the colour camera.
4. Apparatus according to claim 3, wherein the first light source comprises a linelaser, and the second light source comprises an LED.
5. Apparatus according to claim 3 or claim 4, wherein one of the first light source and the second light source is powered by a DC power source.
6. Apparatus according to any of claims 3 to 5, wherein the first field of view and the second field of view are displaced in the movement plane, and the first field of illumination is arranged for illumination outside of the second field of view.
7. Apparatus according to claim 6, wherein the second field of illumination is arranged for illumination outside of the first field of view.
8. Apparatus according to any preceding claim, wherein the linescan camera has a row of pixels having a first number of pixels and the colour camera has a row of pixels having a second number of pixels, the first number of pixels being higher than the second number of pixels.
9. Apparatus according to claim 8, wherein the first number of pixels comprises an integer multiple of the second number of pixels.
10. Apparatus for manipulating an image of a pallet load, the apparatus comprising:
a processor; and
a memory for storing one or more routines which, when executed under control of the processor, control the apparatus to map colour information from a colour image of the pallet load to a monochrome image of the pallet load, the colour image having a first scale and the monochrome image having a second scale, the first scale being different from the second scale.
11. The apparatus of claim 10 configured, under control of the processor, to resize a first one of the colour image and the monochrome image towards a scale of a second one of the colour image and the monochrome image, for the apparatus to map the colour information from the colour image to the monochrome image.
12. The apparatus of claim 10 or claim 11 configured, under control of the processor, to resize a first one of the colour image and the monochrome image from its original scale to a scale of a second one of the colour image and the monochrome image.
13. The apparatus of any of claims 10 to 12 configured, under control of the processor, to map colour information from the colour image to the monochrome image on a pixel-by-pixel basis.
14. The apparatus of any of claims 10 to 13, configured, under control of the processor, to operate when the first scale is smaller than the second scale, the apparatus being configured to upsize the colour image by mapping colour information from a pixel of the colour image in its original scale to multiple pixels of the colour image in its upsized scale.
15. The apparatus of any of claims 10 to 14, configured, under control of the processor, to operate when the first scale is larger than the second scale, the apparatus being configured to downsize the monochrome image by averaging intensity values for a plurality of pixels of the monochrome image in its original scale to one pixel for the monochrome image in its downsized scale, and to map colour information from a pixel of the colour image to the one pixel of the monochrome image in its downsized scale.
16. The apparatus of any of claims 10 to 15 configured, under control of the processor, to perform edge detection on one of the colour image and the monochrome image, thereby to detect an edge of a goods package within the pallet load.
17. The apparatus of claim 16 configured, under control of the processor, to perform edge detection on the monochrome image having colour information mapped thereto.
18. The apparatus of claim 16 configured, under control of the processor, to perform edge detection on the monochrome image, and assign a detected edge to a location in the colour image.
19. The apparatus of any of claims 10 to 18 configured, under control of the processor, to operate on the colour image from a colour camera and the
monochrome image from a monochrome camera, the colour camera and the monochrome camera being offset from one another during an image acquisition process, the apparatus being configured to perform image mapping accounting for the offset.
20. A method for acquiring images of a pallet load, the method comprising:
acquiring a monochrome image of the pallet load using a linescan camera; and
acquiring a colour image of the pallet load using a colour camera.
21. A method for manipulating an image of a pallet load, the method comprising controlling a processor of an apparatus to map colour information from a colour image of the pallet load to a monochrome image of the pallet load, the colour image having a first scale and the monochrome image having a second scale, the first scale being different from the second scale.
22. A machine-readable medium, having stored thereon machine-readable instructions for executing, in a machine, a method for manipulating an image of a pallet load, the method comprising controlling a processor of an apparatus to map colour information from a colour image of the pallet load to a monochrome image of the pallet load, the colour image having a first scale and the monochrome image having a second scale, the first scale being different from the second scale.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/SG2010/000011 WO2011087454A1 (en) | 2010-01-18 | 2010-01-18 | Apparatus and methods for manipulating and acquiring images of a pallet load |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/SG2010/000011 WO2011087454A1 (en) | 2010-01-18 | 2010-01-18 | Apparatus and methods for manipulating and acquiring images of a pallet load |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011087454A1 true WO2011087454A1 (en) | 2011-07-21 |
Family
ID=44304514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SG2010/000011 WO2011087454A1 (en) | 2010-01-18 | 2010-01-18 | Apparatus and methods for manipulating and acquiring images of a pallet load |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2011087454A1 (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5761070A (en) * | 1995-11-02 | 1998-06-02 | Virginia Tech Intellectual Properties, Inc. | Automatic color and grain sorting of materials |
US5852502A (en) * | 1996-05-31 | 1998-12-22 | American Digital Imaging, Inc. | Apparatus and method for digital camera and recorder having a high resolution color composite image output |
JP2000304704A (en) * | 1999-04-23 | 2000-11-02 | Hitachi Chem Co Ltd | Method and apparatus for inspecting wiring board |
US6477270B1 (en) * | 1999-10-21 | 2002-11-05 | Yecheng Wu | Method for converting a high resolution image to true color using a low resolution color image |
US20080187243A1 (en) * | 2007-02-02 | 2008-08-07 | Kabushiki Kaisha Toshiba | Image reading apparatus and image reading method |
BRPI0704329A2 (en) * | 2007-10-23 | 2009-06-23 | De Oliveira Luiz Eduardo Soares | computational process of wood image analysis |
Non-Patent Citations (2)
Title |
---|
ANONYMOUS: "AD-080CL - a new 2 CCD camera for multi-spectral imaging", JAI, 2007, Retrieved from the Internet <URL:http://www-jai.com/EN/NewsEvents/News/Pages/2CCD-Camera-AD080CL.aspx> [retrieved on 20100312] * |
BYLER, E. ET AL.: "Autonomous Hazardous Waste Drum Inspection Vehicle", IEEE ROBOTICS AND AUTOMATION MAGAZINE, vol. 2, no. 1, 1995, pages 6 - 17, XP011089652 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
BE1021836B1 (en) * | 2014-05-05 | 2016-01-21 | Bellivo, Société Anonyme | DEVICE FOR DETERMINING A NUMBER OF CRATES AND METHOD OF PLACING OR REMOVING CRATES IN A TRANSPORTATION VEHICLE |
US9826213B1 (en) | 2015-09-22 | 2017-11-21 | X Development Llc | Generating an image-based identifier for a stretch wrapped loaded pallet based on images captured in association with application of stretch wrap to the loaded pallet |
US10491881B1 (en) | 2015-09-22 | 2019-11-26 | X Development Llc | Generating an image-based identifier for a stretch wrapped loaded pallet based on images captured in association with application of stretch wrap to the loaded pallet |
US10616553B1 (en) | 2015-09-22 | 2020-04-07 | X Development Llc | Generating an image-based identifier for a stretch wrapped loaded pallet based on images captured in association with application of stretch wrap to the loaded pallet |
US11228751B1 (en) | 2015-09-22 | 2022-01-18 | X Development Llc | Generating an image-based identifier for a stretch wrapped loaded pallet based on images captured in association with application of stretch wrap to the loaded pallet |
EP3572996B1 (en) * | 2018-05-21 | 2022-11-30 | Automatizacion y Sistemas de Inspeccion en Linea Global, S.L. | Device and method for containers classification |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220108264A1 (en) | System and method for determining out-of-stock products | |
EP3561764B2 (en) | Systems and methods for stitching sequential images of an object | |
EP2425204A1 (en) | Apparatus and method for acquiring an image of a pallet load | |
WO2018156884A1 (en) | Optical vibrometric testing of container for items | |
US20190311486A1 (en) | Method, system and apparatus for correcting translucency artifacts in data representing a support structure | |
US11915463B2 (en) | System and method for the automatic enrollment of object images into a gallery | |
JP6203538B2 (en) | Product inspection method for packaging machines | |
JP2015114292A (en) | Workpiece position information identification apparatus and workpiece position information identification method | |
US20220051177A1 (en) | System and method for identifying misplaced products in a shelf management system | |
JP2014525042A (en) | Inspection unit | |
US20230196599A1 (en) | Systems and methods for machine vision robotic processing | |
WO2011087454A1 (en) | Apparatus and methods for manipulating and acquiring images of a pallet load | |
CN116256366A (en) | Chip defect detection method, detection system and storage medium | |
DK2297674T3 (en) | Optical registration and classification of returned packaging items in a return system | |
US20240289573A1 (en) | Reading an Optical Code | |
CN1052322C (en) | Improvements in image processing | |
US12025566B2 (en) | Method and device for inspecting containers | |
BR112018071487B1 (en) | IMAGE CAPTURE SYSTEM AND METHOD FOR DETERMINING THE POSITION OF A STRUCTURE ENGRAVED IN A LAMINATED ELEMENT | |
JP6359363B2 (en) | Container inspection device and container inspection method | |
CA2153647A1 (en) | Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section | |
CN208547582U (en) | Optical profile type Defect Detection system | |
CN208206812U (en) | Product appearance characteristic inspection automation equipment | |
WO2011123057A1 (en) | Apparatus and methods for acquiring and analysing images of a pallet load | |
CN111225141B (en) | Method for reading image sensor | |
US20240355085A1 (en) | System And Method For Matching Products And Determining Spreads And Plugs |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10843350 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 10843350 Country of ref document: EP Kind code of ref document: A1 |