WO1999049277A1 - An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object - Google Patents

An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object

Info

Publication number
WO1999049277A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical sensor
conveyor
image information
sensor units
information
Prior art date
Application number
PCT/DK1999/000180
Other languages
French (fr)
Inventor
Ole Helbo
Ralph Kofoed
Paul Erik Thidemann Farre
Leif Ostenfeld Johansen
Original Assignee
Crisplant A/S
Priority date
Filing date
Publication date
Application filed by Crisplant A/S filed Critical Crisplant A/S
Priority to AU30249/99A priority Critical patent/AU3024999A/en
Publication of WO1999049277A1 publication Critical patent/WO1999049277A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques

Definitions

  • the present invention relates to an optical sensor system for determining the geometry and/or angular position of a moving object, such as an object being conveyed in a conveyor system.
  • the present invention relates to an optical sensor system for determining a three-dimensional image or measurement of an object being conveyed in a conveyor system.
  • CCD cameras provide high-quality measurements; a CCD array, for example, comprises a large number of photo diodes, often many thousands, and provides a correspondingly high picture resolution.
  • the conveyor section comprises a belt conveyor
  • the belt is usually divided into a number of sub-belts defining slits therebetween, light for the photo detectors passing through the slits.
  • the resolution of such prior art systems is limited by the width of the sub-belts, usually limiting the accuracy to 40-50 mm.
  • French patent application FR 2 396 953 discloses a system for measuring the dimensions of a rectangular object being conveyed along a conveyor.
  • the system disclosed in FR 2 396 953 comprises three photo sensors, a first one of which is arranged at a right angle in relation to the object, thereby measuring the length of the object by measuring the travelling distance of the object while the first photo detector detects the object.
  • a second photo detector is arranged at an acute angle in the sectional plane of the object, whereby the sum of the length and the width is measured in the same way as the length.
  • FR 2 396 953 discloses a relatively simple on/off use of one-dimensional photo detectors for determination of the size of a rectangular object.
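The two-detector scheme of FR 2 396 953 can be sketched numerically. Assuming, purely for illustration, that the oblique detector line forms a 45° angle with the travel direction, the travel distance measured while that detector sees the object equals the length plus the width, so the width follows by subtraction (function name and parametrisation are hypothetical):

```python
import math

def rect_dimensions(d_perpendicular, d_oblique, angle_deg=45.0):
    """Recover length and width of a rectangular object from the travel
    distances measured by two photo detectors.

    d_perpendicular: distance travelled while the perpendicular detector
                     sees the object (this equals the length directly).
    d_oblique:       distance travelled while the oblique detector, at
                     angle_deg to the travel direction, sees the object.
    """
    length = d_perpendicular
    # For a detector line at angle_deg the oblique distance is
    # length + width / tan(angle_deg); rearrange for the width.
    width = (d_oblique - d_perpendicular) * math.tan(math.radians(angle_deg))
    return length, width
```

For a 45° detector the tangent is 1, so the width is simply the difference of the two measured travel distances.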
  • objects are often transferred from one conveyor to another, such as, e.g., from a feed conveyor to a receiving conveyor.
  • European Patent No. 0 305 755 and corresponding German patent no. 37 29 081 disclose a feed conveyor for a conveyor system of the above-mentioned type.
  • a system having a feed conveyor and a receiving conveyor is disclosed, the travelling direction of the receiving conveyor forming an acute angle, β, with the travelling direction of the feed conveyor.
  • the length of the object is measured before the object is transferred to the receiving conveyor.
  • Two measuring lines are defined, e.g., by two photo sensors, the two measuring lines being arranged at an acute angle which may be equal to β.
  • It is an object of the present invention to provide an optical sensor system, for determining both the two- and three-dimensional geometry of a moving object, which is both easy and cheap to implement. It is a further object of the invention to provide a system which may be used for a number of different purposes and which is easy and cheap to manufacture and easy to implement. A still further object of the invention is to provide a system which constitutes a cost-efficient and reliable alternative to prior art systems, such as, e.g., systems comprising one or more CCD cameras.
  • a first optical sensor system for incorporation in a conveyor system, the conveyor system comprising a conveyor control system for controlling operation of the conveyor system, the optical sensor system comprising:
  • an optical sensor unit comprising:
  • an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which a part of the conveyor system and/or an object moves,
  • first signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor to a processor unit
  • the processor unit comprising:
  • memory means for storing one or more algorithms which allow for appropriate processing of the image information by the processor unit depending on the operation of the first optical sensor system, the processor unit being adapted to process the image information which is being captured in the visual field, so as to generate an output signal representing the image of a part of the conveyor system and/or the object,
  • the optical sensor unit being comprised in an integrated unit, the first optical sensor system being adapted to perform at least two of the following operations:
  • the optical sensor system being adapted to perform at least one of said operations at a time.
  • the conveyor system may further comprise a plurality of conveyor units, the first optical sensor system further being adapted to perform the following operation:
  • the term "photo sensors" covers photo-electric sensors, diode-based sensors and CCD sensors.
  • the first optical sensor unit may be adapted to perform any number of operations, and may also be adapted to perform two or more operations simultaneously.
  • a particular possibility of the system according to the first aspect of the invention is that it may be used also as an on/off detection system for, e.g., detecting if an object is present in the visual field defined by one or more of the photo sensors.
  • the system may be applied for determining the sideways position of an object in relation to the supporting surface of the conveyor. This has the particular advantage that the sideways position of the object may be determined by means of only a single unit. Thus, two or more light scanners need not be applied.
  • a determination of the sideways position of an object being supported and/or conveyed by the conveyor system may provide valuable information e.g. when the conveyor system comprises a feed conveyor conveying objects in a first direction and a receiving conveyor conveying objects in a second direction.
  • information about the sideways position of the object is required in order to properly control the starting time and/or speed of the feed conveyor in order for the feed conveyor to transfer the object to the receiving conveyor at an appropriate position at an appropriate time, e.g., a position where the object may be accommodated at a suitable position on the receiving conveyor.
  • a detection of object identification information such as a barcode, a binary number, provided or printed on the object and/or detection of conveyor unit identification information, such as a barcode, a binary number, etc., provided or printed on at least some of the conveyor units may provide information to the processor unit or the conveyor control system to enable monitoring of the conveyor system.
  • a contactless velocity determination may be performed.
  • the two images may be any images, such as any characteristic edge point or part of a conveyor unit, such as any identification
  • the measured velocity may be transmitted to, e.g., the processor unit and/or the conveyor control system upon inquiry, or the velocity may be transmitted to the processor unit and/or the conveyor control system each time an image is captured, or the optical sensor system may provide a continuous or partly continuous train of pulses, wherein a pulse is transmitted for each given distance traversed.
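One way to realise such a contactless velocity measurement is to correlate two successive line scans of the patterned conveyor surface and convert the best-matching pixel shift into a speed. This is a minimal sketch under assumed names and scan geometry, not the patented implementation:

```python
def estimate_shift(scan_a, scan_b, max_shift):
    """Pixel displacement of scan_b relative to scan_a that maximises
    the (unnormalised) cross-correlation of the two line scans."""
    best_shift, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(scan_a[i] * scan_b[i + s]
                    for i in range(len(scan_a))
                    if 0 <= i + s < len(scan_b))
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

def estimate_velocity(scan_a, scan_b, dt_s, mm_per_pixel, max_shift=16):
    """Conveyor speed in mm/s from two scans captured dt_s seconds apart."""
    return estimate_shift(scan_a, scan_b, max_shift) * mm_per_pixel / dt_s
```

A pulse-per-distance output then follows by accumulating the estimated displacement and emitting a pulse each time it crosses the configured distance.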
  • the condition of said predetermined part of the conveyor system may be surveyed or inspected.
  • detection of broken or missing parts, such as a broken or defective tilting arm, a missing or defective wheel, or an inadequate distance from, e.g., wheels to any actuators or motors caused by wear and tear or by defects in, e.g., the wheel suspensions.
  • At least part of the surface of the conveyor system may have a predetermined pattern, the processor unit of the first optical sensor system being adapted to process the image information, so as to distinguish the surface of the conveyor system from a part of the conveyor system and/or the object.
  • the predetermined pattern may have a plurality of mutually contrasting areas, whereby an output signal representing an image of a part of the conveyor system and/or the object may be generated by superimposing the image information and previously stored data representing said contrasting areas.
  • the predetermined pattern may be any characteristic surface, such as flecked, chequered, ruled, etc.; further, the mutually contrasting areas may be any light/dark areas, shining/non-shining areas, reflecting/non-reflecting areas, etc. Still further, the optical sensor unit may be positioned above a slot wherein a discrete number of light sources is positioned, thereby defining light and dark areas of the surface.
  • an object and/or a part of the conveyor system may be detected by detection of missing contrasting areas or by detection of additional contrasting areas.
  • the contrasting areas may be detected by detection of gradients in the sensed light intensity or by detection of predetermined levels of intensity.
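Gradient-based detection of the contrasting areas can be sketched as follows: an object lying on the patterned surface suppresses the intensity gradients it covers, so comparing the detected edge positions against the stored reference edges reveals the covered pixels. Names and thresholds below are illustrative, not from the patent:

```python
def edge_positions(scan, grad_threshold):
    """Pixel indices where the sensed intensity jumps by at least
    grad_threshold, i.e. the boundaries of the contrasting areas."""
    return [i + 1 for i in range(len(scan) - 1)
            if abs(scan[i + 1] - scan[i]) >= grad_threshold]

def missing_edges(scan, reference_edges, grad_threshold):
    """Reference edges not found in the current scan; a contiguous run
    of missing edges indicates an object covering the pattern."""
    found = set(edge_positions(scan, grad_threshold))
    return sorted(set(reference_edges) - found)
```

Detection of additional contrasting areas (the second case mentioned above) is the symmetric set difference, edges found in the scan but absent from the reference.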
  • the conveyor control system may comprise the processor unit or the processor unit may be an independent unit or the processor unit may form part of the integrated unit.
  • the integrated unit may further comprise the conveyor control system.
  • the integrated unit may comprise a lens system provided in front of the optical sensor unit so as to facilitate regulation of the visual field of the sensor unit and to ensure that the objects to be detected are in the plane of focus of the array of photo sensors.
  • the integrated unit may further comprise a light source for illumination of the visual field.
  • This light source may be a combination of any number of light sources positioned at the integrated unit, such as halogen spots or light-emitting diodes (LEDs), such as laser diodes.
  • a lens system may be provided in front of the light sources.
  • the light sources may be comprised in the optical sensor unit, using the same optical lens system as the one provided in front of the optical sensor unit to illuminate the visual field.
  • the optical sensor system may control the light source so that the sensor unit sampling frequency is adjusted to the frequency of the light source.
  • the intensity of light is integrated during an integration time which may be at most 1 ms, such as at most 50 μs, such as at most 20 μs, preferably such as at most 5 μs, such as at most 2 μs, such as at most 0.5 μs.
  • the photo sensor array of the optical sensor units may, e.g., comprise 32-1024 photo sensors, such as 64-512, or 128-256. In preferred embodiments, 64, 128, 256 or 512 photo sensors are provided.
  • This optical sensor may for example comprise a number of photo sensors (pixels) which measure incident light over an integration/exposure time and each photo sensor may generate an analog output voltage.
  • the output from each photo sensor is proportional to the exposure time multiplied by the intensity of light incident at the photo sensor.
  • the analog output voltage from each pixel is provided to an analog-to-digital converter, and the converted digital signal is an 8-bit number between 0 and 255, corresponding to a 256-level grey-scale resolution.
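The exposure model and the 8-bit conversion described above can be expressed directly; `full_scale` here is a hypothetical saturation level of the integrated signal, introduced only for the sketch:

```python
def pixel_to_grey(intensity, exposure_s, full_scale):
    """Convert the integrated photo sensor signal (intensity multiplied
    by exposure time) to an 8-bit grey level between 0 and 255."""
    signal = intensity * exposure_s      # output proportional to exposure x intensity
    signal = min(signal, full_scale)     # clip at the converter's input range
    return int(round(255 * signal / full_scale))
```

Any signal at or above the full-scale level saturates at grey level 255, which is why the integration time must be matched to the illumination.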
  • an optical sensor system for determining the geometry and/or angular position of a moving object, said system preferably comprising:
  • At least two optical sensor units each comprising:
  • an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
  • signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor
  • a processor unit comprising:
  • the processor unit being adapted to process the image information and the moving means position information together with information concerning spatial position of each of the at least two optical sensor units, in a sectional plane corresponding to a moving means position or a sequence of sectional planes corresponding to a sequence of moving means positions so as to generate an output signal representing the geometry and/or the angular position of the object.
  • the second aspect of the present invention provides an optical sensor system which is both easy and cheap to implement as it requires only two optical sensor units while allowing for a relatively high resolution of both two- and three-dimensional images and/or geometry measurements of moving objects.
  • the system according to the first and second aspect of the invention has the further advantage that no sets of photo detectors/photo diodes are required, thereby increasing both the obtainable accuracy/resolution and the robustness of the system in relation to prior art systems comprising conventional photo detectors/photo diodes.
  • the system according to the second aspect of the present invention is capable of performing two- and three-dimensional measurements and determining two- and three-dimensional images of objects of substantially any shape.
  • the above-mentioned and other objects are accomplished by a method of determining the geometry and/or angular position of a moving object, the method comprising:
  • an array of photo sensors sensing intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
  • the optical sensor units may be positioned above a conveyor.
  • the heights above the conveyor and the angular positions of the optical sensor units which define the visual field of the optical sensor constitute the spatial position information.
  • the spatial position information determines the visual field of each optical sensor which is the entire expanse of space visible from the optical sensor unit at the given height.
  • the optical sensor units may be positioned so that the visual field of each optical sensor unit encompasses the entire width of the path in which the object may be moved. For example, if the object is moved along a conveyor, the visual field of each optical sensor unit should preferably encompass the width of the conveyor belt.
  • the spatial position information may be electronically transmitted from each of the optical sensor units to the processor unit, or the information may be entered directly to the processor unit, for example via a keyboard connected to the processor unit.
  • the intensity of light is integrated during an integration time which may be at most 1 ms, such as at most 50 μs, such as at most 20 μs, preferably such as at most 5 μs, such as at most 2 μs, such as at most 0.5 μs.
  • the photo sensor array of the optical sensor units may, e.g., comprise 32-1024 photo sensors, such as 64-512, or 128-256. In preferred embodiments, 64, 128, 256 or 512 photo sensors are provided.
  • the system according to the first and second aspect of the present invention may comprise a comparator means which is adapted to compare the image information with stored reference information so as to distinguish object image information from background information and to provide object image information to the processor unit.
  • the comparator means may comprise a micro controller adapted to compare the image information with stored reference information representing information as to the intensity of the signals received from the background of the object when no object is present, i.e. usually the surface of the conveyor.
  • the reference information may be stored in an electronic memory, such as an EEPROM. Different sets of reference information stored in the memory may, e.g., contain information as to light intensity variations corresponding to signals received when there is no object present in the visual field of the optical sensor units.
  • the electronic memory may be comprised in either the optical sensor unit, the comparator means or the processor unit.
  • Each optical sensor unit may comprise comparator means, so as to facilitate that only object image information is transferred to the processor unit.
  • By integrating a comparator means in each optical sensor unit, it is possible to transfer only the numbers of the specific pixels or photo sensors of the array of photo sensors between which the object is visible; thus, only information about the width and the sideways position of the object is transmitted from each of the optical sensor units to the processor unit.
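A comparator of this kind needs to report only two pixel numbers per scan. A sketch, assuming the background reference intensities are stored per pixel and a fixed deviation threshold (both assumptions for illustration):

```python
def object_extent(scan, reference, threshold):
    """Return (first_pixel, last_pixel) between which the scan deviates
    from the stored background reference, or None if no object is seen.
    Only these two numbers need to be sent to the processor unit; they
    encode both the width and the sideways position of the object."""
    hits = [i for i, (p, r) in enumerate(zip(scan, reference))
            if abs(p - r) > threshold]
    return (hits[0], hits[-1]) if hits else None
```

Reducing each scan to an index pair is what makes the per-sensor comparator attractive: the transmission bandwidth no longer scales with the number of photo sensors.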
  • the comparator means may also be comprised in the processor unit, so that substantially all image information is transferred from the optical sensor units to the comparator means of the processor unit.
  • the position of the object may be acquired from a pulse emitting sensor connected to the moving means.
  • the pulse emitting sensor may be adapted to provide a pulse whenever the moving means has moved the object a certain, predetermined distance, such as a distance between 1 and 100 mm, such as a distance between 1 and 50 mm, such as a distance between 1 and 20 mm, preferably a distance between 1 and 10 mm, and more preferably a distance between 1 and 6 mm, such as, e.g., 1, 2 or 5 mm.
  • the processor unit is preferably adapted to process the image information and the moving means position information together with the information concerning the spatial position of the optical sensor units by means of computational geometry for substantially each of the sectional planes so as to determine the geometry and/or the angular position of the object.
  • the computational geometry principle used in this connection is described further below.
  • the object moving in a system may be an object chosen from the group of: parcels, letters, luggage, parcel post packets, totes, spare parts, newspapers, magazines, pharmaceuticals, articles of food, videotapes, magnetic tape cassettes, compact disks, floppy disks, and all other conveyable items, including all items deliverable by mail.
  • the moving means may be a conveyor moving the object, such as a conveyor belt, a tray or tilt-tray conveyor, a cross- belt conveyor or any other kind of conveyor.
  • the conveyor may be a feed conveyor for a receiving conveyor, the receiving conveyor being, e.g., part of a sorting or handling system.
  • the upper surface of the conveyor may have a pattern which facilitates distinguishing of the belt from the moving object.
  • the surface may, e.g., be provided with a pattern having characteristic light/dark nuances, such as a stripe or check pattern, so that the objects are easily distinguished from the background signals of the conveyor.
  • the comparator means may effectively distinguish the object image information from the reference information, and thus the object image information may be determined relatively accurately.
  • the optical sensor system may be used with the standard light provided by the surroundings of the conveyor, e.g., daylight or artificial light.
  • an external light source, such as an electric bulb, a halogen bulb, a halide lamp, such as a metal halide lamp, e.g. an HQI lamp, a neon tube, an infrared light source and/or a fluorescent light source, may be provided for illuminating the moving object and/or the background of the object.
  • This external light source may shine on the surface of the conveyor and/or on the moving object, the image of which surface is to be focused at the optical sensor.
  • the light source may be positioned underneath the conveyor, the light source being able to illuminate the conveyor from below, thus providing a stronger contrast in the image and thereby improving the distinguishability between object image information and background information.
  • the background in the visual field of the sensor units may be a surface having light-reflecting properties which do not change over time, or the background may be a reflective surface, a fluorescent surface or a luminous surface. Furthermore, the light source may be positioned below an intersection between a feed conveyor and a receiving conveyor.
  • Two or more optical sensor units may be positioned at a distance from each other along the travelling direction.
  • the processor means may further comprise synchronization means for synchronization of the image information from each of the at least two optical sensor units and the moving means position information so as to compensate for the difference in position along the travelling direction relative to each one of the at least two or more optical sensor units.
  • the invention further relates to a method for synchronization of the image information, the method comprising the steps outlined in connection with the system as described above.
  • the width and height of an object, and thereby the geometry of a moving object, may as mentioned above be determined by means of computational geometry.
  • the width of the object in a sectional plane may be determined by the steps of:
  • the object visual fields being defined as the visual fields of the individual optical sensor units which contain object image information, to extract the common area of the object visual fields for the individual optical sensor units of the system,
  • the width of the object in each sectional plane being determined as the difference between the outermost left and the outermost right limit of the common area.
  • the at least two optical sensor units do not necessarily need to be positioned in the same plane, i.e. do not need to see the object under the same angle or in the same sectional plane.
  • the image information from each optical sensor unit may then be stored and the width of the object may be determined by cross-line calculations in the sectional plane of the cross-section of the visual fields of the at least two optical sensor units.
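The cross-line calculation amounts to intersecting boundary rays of the object visual fields from the two sensor units. A sketch with hypothetical sensor positions and ray directions, treating each boundary of an object visual field as a ray from the sensor down towards the conveyor:

```python
def intersect_rays(p1, d1, p2, d2):
    """Intersection point of the two lines p1 + t*d1 and p2 + u*d2,
    each given by a point and a direction vector in the sectional plane."""
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t = (-bx * d2[1] + d2[0] * by) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def object_width(s1, s2, left1, left2, right1, right2):
    """Width in a sectional plane from the left and right boundary ray
    directions seen by sensors positioned at s1 and s2."""
    left_pt = intersect_rays(s1, left1, s2, left2)
    right_pt = intersect_rays(s1, right1, s2, right2)
    return right_pt[0] - left_pt[0]
```

The outermost left and outermost right limits of the common area are exactly these two intersection points, matching the width definition given above.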
  • the height is determined as described below.
  • at least two optical sensor units are preferably provided in the same plane. Firstly, the highest possible uppermost point of the object and the lowest possible uppermost point of the object in a sectional plane are determined by the steps of:
  • the limits of the common area being defined by substantially straight lines connecting points of intersection of the visual fields of the individual optical sensor units within which all of the at least two optical sensor units sense the object,
  • the highest possible uppermost point of the object is the highest possible height of the object reckoned from the upper surface of the conveyor and the lowest possible uppermost point is the lowest possible height of the object reckoned from the upper surface of the conveyor.
  • a second approximated lowest uppermost point of the object in a sectional plane may be determined by:
  • Third and fourth approximated lowest uppermost points in a sectional plane may be determined by the steps of:
  • the correlating process may be an iterative process being repeated until
  • a predetermined difference between the respective resulting third and fourth approximated lowest uppermost points and the respective resulting second and third approximated highest uppermost points is reached, the predetermined difference being less than a predetermined value of 5-50 mm, such as 5-30 mm, such as 10-20 mm, such as 5-20 mm, preferably such as 5-10 mm,
  • a predetermined number of iterations such as a number of iterations between 1-1000, such as between 10-100, preferably 40-60,
  • the second predetermined difference being 0.5-5 mm, such as 0.5-3 mm, preferably 2-3 mm.
  • the predetermined difference between the respective resulting approximated lowest uppermost points and the respective resulting approximated highest uppermost points may alternatively be expressed as a percentage of an expected resulting height of the moving object, such as, e.g., 0.1-10%, such as 0.5-5%.
  • the volume of the object may be determined by a rough height measurement provided by a row of photo sensors positioned along the height direction of the object. By combination with the above-mentioned length and width determination, a rough measurement of the volume of the object is obtained. This rough measurement of volume may be sufficient for many purposes, such as for determination of the degree to which the containers wherein the objects are collected are filled.
  • the volume of the object may be determined by the steps of:
  • the length of the object may be obtained by methods known per se.
  • the moving means position information may be recorded the first time the image information contains object image information and a counter starts counting until the image information does not contain object image information. The value of the counter thereby determines the length of the object.
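The counter-based length measurement can be sketched directly; `mm_per_scan` is the conveyor distance between successive scans, i.e. the pulse distance of the position sensor (name assumed for illustration):

```python
def object_length(object_present, mm_per_scan):
    """Length of the object along the travelling direction: count the
    scans that contain object image information, from the first such
    scan until the object disappears from the image again."""
    count, seen = 0, False
    for present in object_present:
        if present:
            seen = True
            count += 1
        elif seen:
            break  # trailing edge of the object has passed the sensor
    return count * mm_per_scan
```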
  • the integration is preferably substantially a summation of the circumference of the areas of each sectional plane of the object over which the volume is determined, the area of each sectional plane being defined as the circumference of a small volume part comprising the sectional plane, the size of the volume part being determined by the length between two adjacent sectional planes.
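Numerically this integration reduces to summing the sectional areas multiplied by the spacing of the sectional planes. Approximating each cross-section as a rectangle of the measured width and height (an assumption made only for this sketch):

```python
def approximate_volume(widths, heights, plane_spacing):
    """Volume estimate: sum of per-plane cross-sectional areas
    (width x height) multiplied by the distance between adjacent
    sectional planes."""
    return sum(w * h for w, h in zip(widths, heights)) * plane_spacing
```

Finer plane spacing, i.e. a shorter pulse distance of the position sensor, directly improves the volume resolution.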
  • the volume of an object having a predetermined height may be determined with only a width and a length determination according to the above-mentioned methods.
  • the geometry and/or angular position of an object to be transferred from a feed conveyor to a receiving conveyor may be determined;
  • the conveyor system for transporting objects such as goods or articles comprising:
  • a feed conveyor for conveying objects in a first direction
  • control system being connected to an optical sensor system as described above said control system being adapted to control the speed of the feed conveyor in response to at least the output signal from the processor unit comprised in the optical sensor system.
  • the sideways and angular positions of an object conveyed along the feed conveyor are usually unknown and in many cases unpredictable. However, at least the sideways position, and often also the angular position, is required in order to properly and effectively transfer the object from the feed conveyor to the receiving conveyor. In particular when the conveying direction of the feed conveyor forms an acute angle in relation to the conveying direction of the receiving conveyor, the sideways position of the object is required in order for the control system to properly control the speed of the feed conveyor in order for the feed conveyor to transfer the object to the receiving conveyor at an appropriate position at an appropriate time, i.e. at a time and a position where the object may be accommodated at a suitable position on the receiving conveyor.
  • the sideways position of the object may be extracted from the object image information.
  • the object image information of the at least two optical sensor units is substantially described by intensity information and photo sensor position information.
  • the sideways position of the object is then determined, by the processor unit, from the photo sensor position information of the photo sensors sensing the object image information.
  • the position of each of the photo sensors of an array of photo sensors comprising 128 photo sensors may be expressed as a number between 0 and 127, each number characterising an angle of visual field for the corresponding photo sensor.
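Under the assumption of a sensor pointing straight down from a known height with a known total field of view, the photo sensor number maps to a viewing angle and hence to a sideways position on the conveyor. All parameters in this sketch are illustrative:

```python
import math

def pixel_to_sideways_mm(pixel, n_pixels, fov_deg, height_mm):
    """Sideways position (mm, relative to the optical axis) seen by a
    given photo sensor of a downward-looking linear array of n_pixels
    sensors spanning a total field of view of fov_deg degrees."""
    # angle of this pixel's line of sight relative to the optical axis
    angle_deg = (pixel + 0.5) / n_pixels * fov_deg - fov_deg / 2
    return height_mm * math.tan(math.radians(angle_deg))
```

The sideways position of a detected object then follows from the pixel numbers reported by the comparator, e.g. the midpoint of the first and last object pixel.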
  • the present invention provides a conveyor system as described above wherein only one optical sensor unit is comprised in the optical sensor system.
  • a single optical sensor unit is sufficient to determine a two-dimensional image of a moving object though the accuracy of the image is considerably reduced in relation to the accuracy of a two-dimensional image determined by two or more optical sensor units. However, for some applications this reduced accuracy is sufficient.
  • a further aspect of the invention relates to an optical sensor unit for determining the geometry and/or angular position of an object to be transferred from a feed conveyor to a receiving conveyor, the conveyor system for transporting objects comprising:
  • a feed conveyor for conveying objects in a first direction
  • a receiving conveyor for conveying objects in a second direction, this second direction making an angle β with the first direction
  • control system being connected to an optical sensor system comprising an optical sensor unit, said control system being adapted to control the speed of the feed conveyor in response to at least an output signal from the processor unit comprised in the optical sensor system,
  • the optical sensor unit comprising:
  • an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
  • signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor
  • the optical sensor unit being positioned at one side of the feed conveyor, the visual field of the optical sensor unit thereby defining a sectional plane which may be substantially parallel to the feed conveyor, and a light source being positioned at the other side of the feed conveyor, so as to form a grid of light,
  • a measuring zone being defined by the visual field of two or more specific photo sensors each having a visual field, a processor unit connected to the optical sensor unit determining whether there is an object present in the measuring zone,
  • each photo sensor defines a measuring line
  • the present invention provides an improvement for prior art feed conveyor systems, such as, e.g., the feed conveyor disclosed in EP 0 305 755 and DE 37 29 081.
  • Fig. 1 shows schematically a measuring principle using linear array scanning for determining the width and the position of a moving object
  • Fig. 2 shows an optical sensor system for determining the width and sideways position of a moving object,
  • Fig. 3 shows schematically a measuring principle according to an embodiment of the invention where computational geometry, such as cross-line calculation, is used for determining geometry and/or angular position of a moving object
  • Fig. 4 shows a functional diagram of an optical sensor unit according to the present invention
  • Fig. 5 shows a set-up of an interface module connecting a number of optical sensor units to the processor unit
  • Figs. 6-9 show diagrammatically a method of determining the width of a moving object
  • Figs. 10-19 show diagrammatically a method of determining the height of a moving object
  • Figs. 20 and 21 show a conveyor or sorter system, wherein an optical sensor system for determining the geometry and/or angular position of a moving object is provided, having one and two optical sensor units, respectively,
  • Fig. 22 shows a conveyor or sorter system wherein measuring lines are used for determining the position of an object.
  • Fig. 23 shows a conveyor or sorter system, wherein an optical sensor system for detecting defects beneath the conveyor unit or for monitoring sorting means or operations on the conveyor unit is provided.
  • Fig. 24 shows a conveyor or sorter system, wherein a single optical sensor for determining the sideways position of an object on a feed conveyor is provided.
  • Fig. 25 shows a conveyor or sorter system, wherein an optical sensor system for identifying the conveyor units by reading a barcode on the unit is provided.
  • Fig. 26 shows a conveyor or sorter system, wherein an optical sensor system for determining the speed of the conveyor track by reading barcodes is provided.
  • Fig. 27 shows an optical sensor unit, wherein a light source is integrated, such that both the light source and the optical sensor are provided in the same unit.
  • Fig. 1 shows schematically a prior art system for linear array scanning of a moving object 1 on e.g. a conveyor belt.
  • the system shown comprises eight photo diodes 2 placed above the conveyor belt and eight corresponding photo detectors 3 placed in slits (not shown) in the belt 4, a photo diode and a corresponding photo detector forming a photo sensor set, and the eight photo sensor sets form a grid of light 5.
  • the grid 5 is broken and, from the detection of which photo sensor sets have been obscured and when they have been obscured, it is possible to determine a width of the object 1, and also a length of the object 1 if the speed of the moving object is provided. It is also possible to determine information as to the lateral placement of the object 1 on the conveyor belt 4.
  • the accuracy of the measurement is limited by the distance between two adjacent photo sensor sets. This distance is limited by the fact that a slit for each photo detector must be provided in the conveyor belt thus normally limiting the distance between two adjacent photo sensor sets to 40-50 mm and thus limiting the overall accuracy of the measurement to 40-50 mm.
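The grid-of-light measurement described above can be sketched as follows; the 45 mm sensor spacing, belt speed and timings are illustrative assumptions, not values from the disclosure:

```python
# Width/length estimate from a grid of light (hypothetical spacing and timings).
SENSOR_SPACING_MM = 45.0  # distance between adjacent photo sensor sets

def width_from_grid(obscured):
    """obscured: one boolean per photo sensor set (True = beam broken)."""
    hit = [i for i, broken in enumerate(obscured) if broken]
    if not hit:
        return 0.0
    # Resolution is limited to one sensor spacing, as noted in the text.
    return (hit[-1] - hit[0] + 1) * SENSOR_SPACING_MM

def length_from_timing(belt_speed_mm_s, t_first_obscured_s, t_last_obscured_s):
    """Length follows from belt speed and how long the grid stayed broken."""
    return belt_speed_mm_s * (t_last_obscured_s - t_first_obscured_s)

print(width_from_grid([False, False, True, True, True, False, False, False]))  # 135.0
print(length_from_timing(500.0, 0.0, 0.5))  # 250.0
```

Note that the lateral placement of the object follows directly from the indices of the obscured sensor sets.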
  • the width and sideways position of the object 1 can be determined by use of a single optical sensor unit 20.
  • the optical sensor unit 20 is positioned in such a way that the visual field of the optical sensor unit 20 encompasses at least part of the width of the path in which the object 1 is moved.
  • the visual field of the optical sensor unit 20 is illustrated by the solid lines 22, 23. It is preferred that the visual field of the optical sensor unit 20 encompasses the full width of the path.
  • the visual field of the optical sensor unit, in which the object is visible, is illustrated by the dashed lines 26, 27.
  • the cross-sections 30 and 31 of the dashed lines and the conveyor belt 4 provide information about the maximum width of the object in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor unit at the time of sensing.
  • the width determination of the object is not very accurate and the higher the object, the more inaccurate is the width determined. Furthermore, the system is not capable of providing any height determination of the object.
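The cross-sections 30 and 31 can be computed by projecting the boundary rays of the object's silhouette from the sensor down to the belt plane. The sketch below assumes a pinhole-style geometry; the sensor height and ray angles are hypothetical:

```python
import math

# Single-sensor width bound: project the silhouette's boundary rays from the
# sensor onto the belt plane (y = 0).  The result overestimates the true width
# for tall objects, matching the accuracy limitation noted in the text.
def belt_intersections(sensor_x, sensor_h, theta_left, theta_right):
    """theta_* are angles (radians) of the boundary rays measured from the
    vertical; returns the x coordinates of points 30 and 31 on the belt."""
    x_left = sensor_x + sensor_h * math.tan(theta_left)
    x_right = sensor_x + sensor_h * math.tan(theta_right)
    return x_left, x_right

xl, xr = belt_intersections(sensor_x=0.0, sensor_h=2000.0,
                            theta_left=math.radians(-10),
                            theta_right=math.radians(15))
width_bound = xr - xl  # upper bound on the object width in this sectional plane
```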
  • In Fig. 2b, the position of the optical sensor unit 20 mounted at a support 240 is shown.
  • In Fig. 3a, an optical sensor system according to an embodiment of the present invention is shown schematically.
  • the width and sideways position of the object 1 can be determined by means of computational geometry, such as cross-line calculation.
  • two optical sensor units 20, 21 are positioned in such a way that the visual field of each optical sensor unit encompasses at least part of the width of the path in which the object 1 is moved.
  • the visual field of the optical sensor unit 20 is illustrated by the solid lines 22, 23 and the visual field of the optical sensor unit 21 is illustrated by the solid lines 24, 25. It is preferred that the visual field of each optical sensor unit encompasses the full width of the path.
  • the visual field of each of the two optical sensor units, in which the object is visible, is illustrated by the dashed lines 26, 27, 28 and 29.
  • the cross-sections 30, 31, and 32 of the dashed lines provide, in the embodiment shown, information about the maximum height and the maximum width of the object in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor units at the time of sensing.
  • In Fig. 3b, the positions of the optical sensor units 20 and 21 mounted at a support 240 are shown.
  • more optical sensor units such as at least 3 optical sensor units, such as at least 4, 5, 6 or 7 optical sensor units may be provided.
  • the method of the width and the height determination is described further below in a system where three optical sensor units are provided.
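The cross-line calculation can be sketched as the intersection of boundary rays from two sensor units; the sensor positions and ray angles below are illustrative assumptions, not values from the disclosure:

```python
import math

# Cross-line calculation: boundary rays of the sensors' object visual fields
# intersect at points such as 30, 31, and 32 in Fig. 3a.
def intersect(p0, d0, p1, d1):
    """Intersection of lines p0 + t*d0 and p1 + s*d1 in 2D, or None if parallel."""
    det = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(det) < 1e-12:
        return None  # parallel rays never intersect
    t = ((p1[0] - p0[0]) * d1[1] - (p1[1] - p0[1]) * d1[0]) / det
    return (p0[0] + t * d0[0], p0[1] + t * d0[1])

# Two sensors 2 m above the belt, 1 m apart; rays point down and inward at 20
# degrees from the vertical (symmetric, so the crossing lies midway between them).
a = math.radians(20)
pt = intersect((0.0, 2000.0), (math.sin(a), -math.cos(a)),
               (1000.0, 2000.0), (-math.sin(a), -math.cos(a)))
```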
  • Fig. 4 shows a functional diagram of an exemplification of an optical sensor unit comprising an optical sensor and a micro controller.
  • the optical sensor 33 is a CMOS-based opto sensor, such as a Texas Instruments line imager, such as a TSL1401. This optical sensor comprises 128 pixels, each of which measures incident light over an integration/exposure time and generates an analog output voltage of 0-2 V.
  • the output from each pixel is proportional to the exposure time multiplied by the intensity of light incident at the pixel.
  • the analog output voltage from each pixel is provided to an analog-to-digital converter 34, such as a TLC5510.
  • the digital signal is an 8-bit number between 0 and 255, corresponding to a 256 grey-scale resolution.
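The digitisation step can be sketched as a simple mapping of the 0-2 V pixel voltage to an 8-bit grey value; the clamping behaviour is an assumption about the converter:

```python
# Digitisation of the pixel output: a 0-2 V analog voltage is mapped to an
# 8-bit value between 0 and 255, giving a 256 grey-scale resolution.
def to_grey(voltage, v_ref=2.0):
    code = int(voltage / v_ref * 255)
    return max(0, min(255, code))  # clamp to the 8-bit range

print(to_grey(1.0))  # 127
print(to_grey(2.0))  # 255
```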
  • This digital signal is submitted to comparator means comprising a micro controller unit 35, such as a Microchip micro controller, such as a PIC16C77.
  • a standard lens system is provided at the optical sensor 33 for focusing an image of the conveyor and/or moving object at the optical sensor.
  • An EEPROM 36 is connected to the micro controller 35 for storing reference image information.
  • the reference image information, which is also 8-bit signals for each of the 128 pixels, can be stored at a time prior to the actual measurements and read into the micro controller unit at the time of measuring.
  • the reference image information contains information of the intensity of the signals normally received from the conveyor belt when no object is present. Different sets of reference image information stored in the EEPROM may, e.g., contain information of the largest and smallest intensity, respectively, of signals received when there is no object on the conveyor.
  • the signals obtained from the optical sensor 33 are compared to the reference images, pixel by pixel.
  • a pixel of the image information having an intensity which differs from the intensity of the corresponding reference image pixel is a pixel sensing part of the object.
  • the output of the micro controller unit 35 is hereby limited to the information of the intensity sensed at each pixel sensing part of the object, the "object image information". Of course, the output may further be limited to information of which pixels of the optical sensor unit sense part of the object.
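The pixel-by-pixel comparison against the stored reference band can be sketched as follows; the band limits and frame values are illustrative:

```python
# Pixel-by-pixel comparison against stored reference image information, as done
# by the micro controller: pixels whose intensity falls outside the reference
# band are taken to be sensing part of the object.
def object_pixels(image, ref_min, ref_max):
    """image, ref_min, ref_max: sequences of 128 values in 0..255.
    Returns (index, intensity) pairs -- the 'object image information'."""
    return [(i, v) for i, (v, lo, hi) in enumerate(zip(image, ref_min, ref_max))
            if v < lo or v > hi]

ref_lo = [90] * 128   # smallest intensity with no object present
ref_hi = [110] * 128  # largest intensity with no object present
frame = [100] * 128
frame[40:44] = [30, 25, 28, 31]          # a dark object obscures four pixels
print(object_pixels(frame, ref_lo, ref_hi))  # [(40, 30), (41, 25), (42, 28), (43, 31)]
```

Transmitting only these pairs, rather than all 128 intensities, is what limits the traffic on the serial link to the processor unit.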
  • the output port of the micro controller is a standard RS485 serial port, so that information from the optical sensor unit can be provided to a processor unit.
  • the micro controller is comprised in the optical sensor unit. This limits the information to be transferred from the optical sensor unit to the processor unit, which may be positioned at a distance from the optical sensor. Typically, the distance is 1-10 meters, such as 5-10 meters. It is therefore advantageous to transfer only object-relevant information.
  • alternatively, the micro controller may be an independent unit transferring information to the processor unit, or the micro controller may form part of the processor unit.
  • each optical sensor unit 151, 152, 153, 154 provides object image information to the processor unit.
  • the object image information from each of the optical sensor units is processed together with information about the position of the moving means at the time of sensing and together with information concerning the spatial position of the optical sensor units, i.e. the angle in which the visual field of the optical sensor unit is encompassing the path of the object and the distance from the optical sensor unit to the conveyor belt. This information is processed in the processor unit according to the procedures which are shown in the following figures.
  • a conveyor belt with a travelling direction 41 is shown.
  • the moving means of the conveyor belt is defined as the moving part of the conveyor.
  • Three optical sensor units 42, 43, and 44 are positioned above the conveyor on a support which is positioned parallel to the surface of the conveyor belt and at a right angle to the travelling direction of the conveyor belt.
  • the visual fields of all three optical sensor units are identical at the level of the conveyor belt 45 and extend over the entire width of the conveyor belt 46.
  • the optical sensor units 42, 43, and 44 sense the object in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor units and the sensed signals are recorded.
  • the position of the moving means is also recorded. All further processing of the image information and the position information in the specific sectional plane is made on the basis of these recordings.
  • the optical sensor system of Fig. 6 is the system which will be considered in more detail in the following figures.
  • Fig. 7 shows the visual fields of the optical sensor units 42, 43, and 44 of Fig. 6 at a right angle to the plane defined by the visual fields of the optical sensor units 42, 43, and 44.
  • the height 51 of each of the optical sensor units 42, 43, and 44 over the conveyor belt is the same for all three optical sensor units 42, 43, and 44.
  • a maximum height of the objects to be measured is defined, as indicated by the line 52 and the line 53 is the width of the conveyor belt.
  • Fig. 8 shows the visual field for each of the optical sensor units in which the object is visible from the optical sensor unit. These visual fields have been found by comparing the image information to reference information so as to distinguish the visual fields in which a sensing different from the background is recorded.
  • this "object visual field" from each optical sensor unit is illustrated by solid lines 61.
  • the object visual field of each of the optical sensor units 42, 43, and 44 is defined by the angles α, β, and γ, respectively.
  • the intersections 62-68 of the lines 61 define the area in which the object is visible from all the optical sensor units 42, 43, and 44.
  • the width of the object in the particular sectional plane is determined by the largest width of the common area of the object visual fields of the optical sensor units 42, 43, and 44, and the outer limits of the object in the specific sectional plane are found by drawing straight lines connecting the points 62-68. As mentioned above, the distance between the outermost right and the outermost left point defines the width 69 of the object in the specific sectional plane.
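The width determination from the intersection points can be sketched as follows; the point coordinates are illustrative:

```python
# Width of the object in the sectional plane: the distance between the
# outermost left and outermost right intersection points (62-68).
def sectional_width(points):
    """points: (x, y) intersection points of the object visual field lines."""
    xs = [x for x, _ in points]
    return max(xs) - min(xs)

# Illustrative intersection points (mm) in one sectional plane.
pts = [(310, 480), (250, 300), (430, 310), (260, 120), (420, 130), (300, 60), (380, 70)]
print(sectional_width(pts))  # 180
```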
  • an area limited by the highest possible uppermost point 111 of the object and the lowest possible uppermost point 112 of the object is defined.
  • the maximum possible height of the object in the specific sectional plane is the uppermost point, 62, of the intersection of the lines 61
  • the minimum possible height of the object is the second uppermost point, 66, of the intersection of the lines 61.
  • the maximum possible height is defined by the point 62
  • the minimum possible height is defined by a line parallel to the conveyor belt and traversing the point 66.
  • this area is a triangle and a second approximated lowest uppermost point of the object in the specific sectional plane, in the following called the "second minimum possible height", will have to be found in the hatched area 90.
  • In Fig. 12, the altitude line 101 of the triangle is shown, and the second minimum possible height of the object is found at a point along the line 101, starting from the intersection 103 of the altitude line 101 and the baseline 102 of the triangle 90.
  • the correlation between the image information from each of the optical sensor units 42, 43, and 44 is calculated, starting from the point of intersection 103, as shown in Fig. 13.
  • the image information in this point is defined by an angle in the visual field for each optical sensor unit.
  • the correlation of the image information of each of the optical sensor units 42, 43, and 44 is calculated in the point of intersection 103 and in a sequence of points along the altitude line 101, the number of calculations determining the accuracy of the second minimum possible height determination of the object.
  • Fig. 15 shows a second hatched area 130, the upper limit of which is the maximum possible height 62, and the lower limit of which is a line 131 traversing the point where the second minimum possible height 120 is found.
  • the angle 140 is assumed to be 45°.
  • the intersection 141 of the line 143 forming an angle of 45° with the baseline 131 and the line connecting the points 62 and 63 is a second maximum possible height of the object, and the intersection 142 of the line 144 forming an angle of 45° with the baseline 131 and the line connecting the point 62 and the point 66 is a third maximum possible height of the object.
  • the second and third approximated highest uppermost points, in this embodiment corresponding to the second 153 and third 154 maximum possible heights of the object, are the vertices of two new triangles 151 and 152, respectively, each with the baseline formed by the line 131 of the second minimum possible height of the object 155.
  • In Fig. 18, the two new correlation lines 161 and 162 corresponding to the altitudes of the triangles 151 and 152 are shown.
  • the heights of the object along the two lines 161 and 162 are then found by correlation of the image information of each of the optical sensor units 42, 43, and 44, and two new minimum possible heights of the object are found as illustrated in Figs. 12-15.
  • the process is an iterative process which can be terminated when the difference between the minimum possible height and the maximum possible height is less than a predetermined value of 5-50 mm, such as 5-30 mm, such as 10-20 mm, such as 5-20 mm, preferably such as 5-10 mm.
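The iterative refinement can be sketched as a loop that terminates once the height bounds agree to within the predetermined value; the refine() callback stands in for the correlation-based search along the altitude lines and is a hypothetical placeholder:

```python
# Skeleton of the iterative height refinement: each pass tightens the bounds
# until they agree to within a preset tolerance (here 10 mm, within the 5-50 mm
# range mentioned in the text).
def iterate_height(h_min, h_max, refine, tol_mm=10.0):
    """refine(h_min, h_max) -> (new_h_min, new_h_max); stands in for the
    correlation-based search and must shrink the interval each pass."""
    while h_max - h_min > tol_mm:
        h_min, h_max = refine(h_min, h_max)
    return 0.5 * (h_min + h_max)

# A toy refine() that trims a quarter of the interval from each side:
est = iterate_height(100.0, 500.0,
                     lambda lo, hi: (lo + (hi - lo) / 4, hi - (hi - lo) / 4))
print(est)  # 300.0
```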
  • In Fig. 19, the area 171 of the object in the specific sectional plane is shown, and the maximum width 172 and the maximum height 173 are shown.
  • the volume of an object can be calculated by combining the calculated heights and widths of the object and the information of the position of the moving means at the time of intensity measurement.
  • the grid formed by these calculated heights and widths and the object position information encloses the object.
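Combining the sectional areas with the position of the moving means gives a volume estimate, e.g. as a simple slice sum; the areas and slice spacing below are illustrative:

```python
# Volume estimate: sum the sectional areas recorded at each position of the
# moving means.  The slice spacing follows from belt speed and sensing rate.
def volume_mm3(section_areas_mm2, slice_spacing_mm):
    return sum(section_areas_mm2) * slice_spacing_mm

areas = [0.0, 1200.0, 2500.0, 2600.0, 1100.0, 0.0]  # mm^2 per sectional plane
print(volume_mm3(areas, 25.0))  # 185000.0
```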
  • the system described above may, according to the fourth aspect of the present invention, be used in conveyor systems as shown in Figs. 20 and 21.
  • the conveyor systems of Figs. 20 and 21 comprise two conveyors: a receiving conveyor 180 conveying objects in a first direction 182, and a feed conveyor 184 conveying objects in a second direction 186 making an angle 188 with the first conveyor direction 182.
  • Objects 190, 192 being moved at the feed conveyor 184 are transferred to the receiving conveyor 180.
  • An object 190 is transferred from the feed conveyor 184 to the receiving conveyor 180 whenever the object 190 has a geometry so that it can be accommodated in the free space between two adjacent objects (not shown) at the receiving conveyor 180.
  • the geometry of the object is in Fig. 20 determined by an optical sensor system as described in Fig. 2, using one optical sensor unit 191, and in Fig. 21 determined by an optical sensor system as described in Fig. 3, using two optical sensor units 194, 196.
  • the choice of geometry determination depends on the accuracy required in the specific embodiment. Of course, also at least three, four, five, six or seven optical sensor units may be positioned above the conveyor so as to obtain a better accuracy of the geometry determination.
  • the geometry of the object 190 to be transferred is determined and the dimensions of the object in the first direction are determined and so is the sideways position of the object on the feed conveyor. It is hereby possible to transfer the object 190 at the time and position where the free space between two adjacent objects at the first conveyor belt can accommodate the object 190.
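The transfer decision can be sketched as a check of whether the free space on the receiving conveyor can accommodate the object's extent in the first direction; the projection formula and safety margin are assumptions, not part of the disclosure:

```python
import math

# Hedged sketch of the transfer decision: release the object only when the free
# space between two adjacent objects on the receiving conveyor can accommodate
# the object's extent in the first direction.
def extent_in_first_direction(length_mm, width_mm, angle_deg):
    """Footprint of a rectangular object along the receiving conveyor, given
    the transfer angle between the two conveyors (bounding-box projection)."""
    a = math.radians(angle_deg)
    return length_mm * abs(math.cos(a)) + width_mm * abs(math.sin(a))

def can_transfer(gap_mm, length_mm, width_mm, angle_deg, margin_mm=50.0):
    return gap_mm >= extent_in_first_direction(length_mm, width_mm, angle_deg) + margin_mm

print(can_transfer(800.0, 400.0, 300.0, 30.0))  # True
```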
  • Fig. 22 shows a top view of a system according to another aspect of the invention.
  • the conveyor or sorter system in Fig. 22 is similar to the conveyor systems of Figs. 20 and 21.
  • the geometry of the object is determined by means of an optical sensor unit, a functional diagram of which is shown in Fig. 4.
  • the optical sensor unit 220 is positioned at one side of the feed conveyor, the visual field of the optical sensor unit 222 thereby defining a sectional plane which may be substantially parallel to the feed conveyor, and a light source 224 being positioned at the other side of the feed conveyor, so as to form a grid of light.
  • a measuring zone 226 is defined by the visual field of two or more specific photo sensors each having a visual field illustrated by the lines 228 and 230.
  • the angle between the lines 228 and 230 is substantially identical to the angle 188 between the travelling direction of the feed conveyor and the travelling direction of the receiving conveyor.
  • the optical sensor senses when the object passes the visual field lines 228 and 230.
  • a processor unit connected to the optical sensor unit determines whether there is an object present in the measuring zone or not, and provides two output signals, one for each photo sensor of the visual field lines 228 and 230, in response hereto.
  • Fig. 23 shows an embodiment of an optical sensor system according to the first aspect of the present invention.
  • the embodiment shown comprises two optical sensors 231, 232 positioned at each side of the conveyor unit 233, the visual field of the optical sensors thereby defining a sectional plane substantially perpendicular to the conveyor unit.
  • the visual fields of the two sensors on each side of the unit encompass, e.g., the tilting arms 235 and/or the rollers 236 at certain intervals.
  • a processor unit connected to the sensors may provide a signal in case a defect is detected, such a defect being, e.g., a broken or defective tilting arm.
  • the processor unit may further be adapted to, e.g., provide a signal when a tilting arm 235 is in a tilting position, and/or when a wheel or roller 236 is missing or defective.
  • the processor unit may further be adapted to provide a signal in case irregularities in, e.g., driving parts are detected.
  • the processor unit may provide a signal if a measured distance is not within a predetermined interval.
  • the sorting system may be monitored for, e.g., breakdowns of the conveyor units, wear, erroneous positions of the tilting arm and thereby erroneous sorting, etc.
  • the number of objects on the tray may be determined by the sensors, when the objects pass through the visual field, so as to ensure that the trays contain the desired number of objects.
  • One or more light sources on each side of the conveyor unit may be provided for ensuring sufficient illumination of the parts passing through the visual field of the sensors.
  • Means for preventing interference between the two sensors may be provided.
  • Such means may, e.g., comprise one or more plates or other light barriers.
  • the light barriers may have a pattern which facilitates distinguishing, by means of the comparator, the tilting arm and the wheel, respectively, from the barriers.
  • the optical sensors may be positioned at any positions whereby the most appropriate visual field for monitoring certain parts of the conveyor system may be provided.
  • the conveyor system shown in Fig. 24 comprises a feed conveyor 243 conveying objects in a first direction 244, a receiving conveyor 245 conveying objects in a second direction 240, and an optical sensor 241.
  • Information representing the sideways position of the object 242 may be used for controlling the starting time and/or speed of the feed conveyor in order to transfer the object 242 to the receiving conveyor 245 at an appropriate position and at an appropriate time.
  • the object 242 may be fed onto the receiving conveyor 245 with a certain delay dependent on the sideways position of the object in relation to the surface of the feed conveyor 243, cf. also EP 0 355 705.
  • the optical sensor unit 241 is positioned above the feed conveyor belt and is displaced in a sideways position in relation to the centre of the conveyor belt, the visual field of the optical sensor defining a sectional plane.
  • the optical sensor may, within its visual field, sense the bottom edge of the object 246, whereby the processor unit, connected to the sensor unit, may determine the sideways position of the object.
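The mapping from the sensed bottom-edge pixel to a sideways position can be sketched with a simple pinhole-style model; the field of view, sensor height and sideways displacement below are hypothetical:

```python
import math

# Sideways position from the sensed bottom edge: the pixel index at which the
# bottom edge of the object appears maps to a lateral position on the belt
# through the sensor geometry.
def sideways_position_mm(edge_pixel, n_pixels=128, fov_deg=60.0,
                         sensor_h_mm=1500.0, sensor_x_mm=200.0):
    """Sensor displaced sideways by sensor_x_mm from the belt centre, looking
    straight down; returns the bottom-edge position relative to the centre."""
    # Angle of the edge ray from the optical axis (pointing straight down).
    angle = math.radians((edge_pixel / (n_pixels - 1) - 0.5) * fov_deg)
    return sensor_x_mm + sensor_h_mm * math.tan(angle)

pos = sideways_position_mm(96)  # edge seen towards one end of the 128-pixel array
```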
  • the conveyor unit 250 shown in Fig. 25 comprises a plate 251 positioned beneath the unit and having a barcode which can be detected by the sensor 252.
  • the sensor 252 detects the barcode.
  • a processor unit connected to the sensor may thus identify the unit.
  • the bar code may be printed on the plate 251 or elsewhere on the conveyor unit 250, or it may be provided as grooves in the plate 251.
  • the barcode may be positioned on top of the conveyor unit, whereby the barcode may be read from above by a sensor mounted on the upper side of the conveyor system.
  • Fig. 26 shows an embodiment comprising two optical sensors 260 and 261.
  • Two conveyor units 262 and 263, and two barcodes 264 and 265 are provided on each of the conveyor units.
  • the connection of the conveyor units in the conveyor track defines a fixed distance between the two barcodes.
  • the sensor 261 sends a second signal to the processor unit.
  • the processor unit then calculates the speed or the velocity of the conveyor track from the distance between the two barcodes and the time difference between the first and second time instant.
  • the optical sensor system in Fig. 26 may comprise only one single optical sensor unit, reading both barcodes. Further, the speed or the velocity of the conveyor may be determined by sensing a predetermined edge portion at each of the conveyor units 262, 263, and calculating the speed or the velocity as mentioned above.
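The speed calculation from the two barcode detections reduces to distance over time difference; the distance and timestamps below are illustrative:

```python
# Speed of the conveyor track from two barcode detections a known distance
# apart (the distance is fixed by the coupling of the conveyor units).
def track_speed_mm_s(barcode_distance_mm, t_first_s, t_second_s):
    dt = t_second_s - t_first_s
    if dt <= 0:
        raise ValueError("second detection must follow the first")
    return barcode_distance_mm / dt

print(track_speed_mm_s(1200.0, 0.0, 0.5))  # 2400.0
```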
  • the operations of Figs. 23, 24, 25 and 26 may be performed by one single application such as the one shown in Fig. 26.
  • the optical sensor system with two sensor units is capable of monitoring the conveyor system, identifying the conveyor unit, and calculating the speed or the velocity of the chain of conveyor units. Alternatively, these functions may be performed by only one single optical sensor unit.
  • the unit 270 shown in Fig. 27 comprises the optical means 272 and a light source 271 in the form of two halogen spots.
  • the light source 271 may comprise an LED-type power source or laser-diodes with a lens in front if necessary.
  • the optical sensors may comprise adjustment means, so as to adjust the visual field of the sensors and thereby provide an adjustable visual field of the sensors.
  • the adjustable area provides for the possibility that the sensor may sense/capture objects being supported by a conveyor unit as well as means mounted to the conveyor unit below its object supporting surface within a single passage of the conveyor unit through the visual field.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An optical sensor system for determining the two- and three-dimensional geometry and/or position of moving objects, in particular objects being conveyed in a sorter conveyor system such as tilt-tray or cross-belt conveyors. The system comprises an optical sensor unit which comprises an array of photo sensors that sense the intensity of light emitted or reflected in the visual field in which the object moves, means for transmission of image information from the sensor unit to a processor unit, and a control system for controlling, e.g., the conveyor system. A multipurpose optical sensor system is further disclosed which is adapted to perform at least two of the following operations: determination of the velocity of an object, detection of object identification information provided on the object, and detection of defects on, e.g., the conveyor system.

Description

AN OPTICAL SENSOR SYSTEM FOR INCORPORATION IN A CONVEYOR SYSTEM AND A METHOD FOR DETERMINING THE GEOMETRY AND/OR ANGULAR POSITION OF A MOVING OBJECT
FIELD OF THE INVENTION
The present invention relates to an optical sensor system for determining the geometry and/or angular position of a moving object, such as an object being conveyed in a conveyor system. In particular, the present invention relates to an optical sensor system for determining a three-dimensional image or measurement of an object being conveyed in a conveyor system.
BACKGROUND OF THE INVENTION
It is often desired to measure the dimensions and/or weight of moving objects, in particular in conveyor systems for conveying, e.g., letters, parcels, fruits or other matter. In such conveyor systems there often exists a need for controlling the feeding of objects from a feed conveyor onto a receiving conveyor and thus a need for surveying moving objects on the feed conveyor.
It is well known in the art to use CCD cameras or CCD arrays for surveying moving objects. CCD cameras provide a high quality of the measurements; thus for example, a CCD array comprises a large number of photo diodes, often many thousands of photo diodes, and provides a correspondingly high picture resolution.
In applications where such a high picture resolution is not required, an alternative has been to provide a number of sets of photo diodes positioned above a conveyor section and corresponding photo detectors positioned underneath the conveyor section. Thus, in case the conveyor section comprises a belt conveyor, the belt is usually divided into a number of sub-belts defining slits therebetween, light for the photo detectors passing through the slits. The resolution of such prior art systems is limited to the width of the sub-belts, usually limiting the accuracy to 40-50 mm.
From the prior art, systems for three-dimensional measurements of objects in conveyor systems are known. French patent application FR 2 396 953 discloses a system for measuring the dimensions of a rectangular object being conveyed along a conveyor. The system disclosed in FR 2 396 953 comprises three photo detectors, a first one of which is arranged at a right angle in relation to the object, thereby measuring the length of the object by measuring the travelling distance of the object while the first photo detector detects the object. A second photo detector is arranged at an acute angle in the sectional plane of the object, whereby the sum of the length and the width is measured in the same way as the length. Finally, a third photo detector is arranged at an acute angle in relation to the transport direction of the object, whereby the height of the object is measured in a way analogous to the width. Thus, FR 2 396 953 discloses a relatively simple on/off use of one-dimensional photo detectors for determination of the size of a rectangular object.
In conveyor or sorter systems, objects are often transferred from one conveyor to another, such as, e.g., from a feed conveyor to a receiving conveyor. To optimize the capacity of such systems, it is important to transfer an object from the feed conveyor at the first possible time, in which a free space at the receiving conveyor can accommodate the object. Most often there is an acute angle between the feed conveyor and the receiving conveyor.
European Patent No. 0 305 755 and corresponding German patent no. 37 29 081 disclose a feed conveyor for a conveyor system of the above-mentioned type. In these patents, a system having a feed conveyor and a receiving conveyor is disclosed, the travelling direction of the receiving conveyor forming an acute angle, α, with the travelling direction of the feed conveyor. The length of the object is measured before the object is transferred to the receiving conveyor. Two measuring lines are defined, e.g., by two photo sensors, the two measuring lines being arranged at an acute angle which may be equal to α.
An alternative feed system encompassing a plurality of light scanners is disclosed in European Patent No. EP 0 366 857.
It has been found that the only prior art systems capable of both surveying moving objects and performing three-dimensional measurements on such objects, or even generating full three-dimensional images of objects, are systems comprising CCD cameras. However, such systems, and in particular the hardware and software required to control and/or survey moving objects, are expensive.
SUMMARY OF THE INVENTION
Thus, it is an object of the present invention to provide an optical sensor system for determining both the two- and three-dimensional geometry of a moving object which is both easy and cheap to implement. It is a further object of the invention to provide a system which may be used for a number of different purposes and which is easy and cheap to manufacture and easy to implement. A still further object of the invention is to provide a system which constitutes a cost efficient and reliable alternative to prior art systems, such as, e.g., systems comprising one or more CCD cameras.
According to a first aspect of the invention, at least some of the above objects are accomplished by a first optical sensor system for incorporation in a conveyor system, the conveyor system comprising a conveyor control system for controlling operation of the conveyor system, the optical sensor system comprising:
an optical sensor unit comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which a part of the conveyor system and/or an object moves,
first signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor to a processor unit,
the processor unit comprising:
signal receiving means for receiving the image information from the optical sensor unit,
memory means for storing one or more algorithms which allow for appropriate processing of the image information by the processor unit depending on the operation of the first optical sensor system, the processor unit being adapted to process the image information which is being captured in the visual field, so as to generate an output signal representing the image of a part of the conveyor system and/or the object,
second signal transmission means for passing said output signal to the control system of the conveyor system,
the optical sensor unit being comprised in an integrated unit, the first optical sensor system being adapted to perform at least two of the following operations:
- determining the sideways position of an object being supported and/or conveyed by the conveyor system,
- detecting object identification information provided or printed on the object,
- capturing a sequence of at least two images, so as to determine therefrom a signal representing the velocity of a part of the conveyor system or the object,
- capturing an image of a predetermined part of the conveyor system, so as to survey the condition of said predetermined part of the conveyor system,
- detecting whether an object is present within the visual field defined by one or more of the photo detectors,
the optical sensor system being adapted to perform at least one of said operations at a time.
The conveyor system may further comprise a plurality of conveyor units, the first optical sensor system further being adapted to perform the following operation:
detecting conveyor unit identification information provided or printed on at least some of the conveyor units.
In the present application, the term "photo sensors" covers photo-electric sensors, diode-based sensors and CCD sensors.
The first optical sensor unit may be adapted to perform any number of operations, and may also be adapted to perform two or more operations simultaneously.
A particular possibility of the system according to the first aspect of the invention is that it may be used also as an on/off detection system for, e.g., detecting if an object is present in the visual field defined by one or more of the photo sensors.
The system may be applied for determining the sideways position of an object in relation to the supporting surface of the conveyor. This has the particular advantage that the sideways position of the object may be determined by means of only a single unit. Thus, two or more light scanners need not be applied.
A determination of the sideways position of an object being supported and/or conveyed by the conveyor system may provide valuable information, e.g., when the conveyor system comprises a feed conveyor conveying objects in a first direction and a receiving conveyor conveying objects in a second direction. When the object is to be transferred from the feed conveyor to the receiving conveyor, information about the sideways position of the object is required in order to properly control the starting time and/or speed of the feed conveyor, so that the object is transferred to the receiving conveyor at an appropriate position at an appropriate time, e.g., a position where the object may be accommodated at a suitable position on the receiving conveyor.
A detection of object identification information, such as a barcode, a binary number, etc., provided or printed on the object, and/or detection of conveyor unit identification information, such as a barcode, a binary number, etc., provided or printed on at least some of the conveyor units, may provide information to the processor unit or the conveyor control system to enable monitoring of the conveyor system.
By capturing a sequence of at least two images, so as to determine therefrom a signal representing the velocity of a part of the conveyor system or the object, a contactless velocity determination may be performed. The two images may show any feature, such as a characteristic edge point or part of a conveyor unit, any identification information provided or printed on the object and/or on at least some of the conveyor units, or any contrast detected at a surface of the conveyor system.
The measured velocity may be transmitted to, e.g., the processor unit and/or the conveyor control system upon inquiry, or the velocity may be transmitted to the processor unit and/or the conveyor control system each time an image is captured, or the optical sensor system may provide a continuous or partly continuous set of pulses, wherein a pulse is transmitted for each given distance traversed.
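The contactless velocity determination described above can be sketched as follows. This is a minimal illustration rather than the patented implementation; the pixel pitch, time step and search range are assumed values, and the correlation is a simple sum-of-absolute-differences search for the pixel shift between two captures.

```python
def estimate_shift(line_a, line_b, max_shift):
    """Return the pixel shift of line_b relative to line_a that minimises
    the mean absolute difference over the overlapping region."""
    best_shift, best_cost = 0, float("inf")
    n = len(line_a)
    for s in range(-max_shift, max_shift + 1):
        overlap = [(line_a[i], line_b[i + s])
                   for i in range(n) if 0 <= i + s < n]
        if not overlap:
            continue
        cost = sum(abs(a - b) for a, b in overlap) / len(overlap)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def velocity_from_images(line_a, line_b, pixel_pitch_mm, dt_s, max_shift=16):
    """Velocity (mm/s) of the surface between two captures taken dt_s apart."""
    shift = estimate_shift(line_a, line_b, max_shift)
    return shift * pixel_pitch_mm / dt_s
```

In practice the shift would be tracked over many capture pairs; any contrast on the conveyor surface or object suffices as the correlated feature.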
By capturing an image of a predetermined part of the conveyor system, the condition of said predetermined part of the conveyor system may be surveyed or inspected. Hereby, broken or missing parts may be detected, such as a broken or defective tilting arm, a missing or defective wheel, or an inadequate distance from, e.g., wheels to any actuators or motors caused by wear and tear or by defects in, e.g., the wheel suspensions.
Furthermore, it is possible to detect whether a tray of the conveyor system used for supporting objects, is tilted or not tilted, whether a tray supports an object or not, how many objects a tray is supporting, etc.
At least part of the surface of the conveyor system may have a predetermined pattern, the processor unit of the first optical sensor system being adapted to process the image information, so as to distinguish the surface of the conveyor system from a part of the conveyor system and/or the object.
The predetermined pattern may have a plurality of mutually contrasting areas, whereby an output signal representing an image of a part of the conveyor system and/or the object may be generated by superimposing the image information and previously stored data representing said contrasting areas.
The predetermined pattern may be any characteristic surface, such as flecked, chequered, ruled, etc. Further, the mutually contrasting areas may be any light/dark areas, shining/non-shining areas, reflecting/non-reflecting areas, etc. Still further, the optical sensor unit may be positioned above a slot wherein a discrete number of light sources is positioned, thereby defining light and dark areas of the surface.
By providing a surface with a predetermined pattern, an object and/or a part of the conveyor system may be detected by detection of missing contrasting areas or by detection of additional contrasting areas. The contrasting areas may be detected by detection of gradients in the sensed light intensity or by detection of predetermined levels of intensity.
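The gradient-based detection of contrasting areas can be sketched as below. This is an illustration only; the intensity threshold is an assumed value, and "missing" expected pattern edges are taken to indicate that an object covers part of the patterned surface.

```python
def edge_positions(scan, threshold=50):
    """Indices where the sensed intensity changes sharply (a gradient edge)."""
    return [i for i in range(len(scan) - 1)
            if abs(scan[i + 1] - scan[i]) > threshold]

def occluded(scan, expected_edges, threshold=50):
    """True if any expected pattern edge is missing from the scan line,
    i.e. an object covers part of the predetermined pattern."""
    found = set(edge_positions(scan, threshold))
    return any(e not in found for e in expected_edges)
```

Detection of additional contrasting areas (e.g. a dark object on a light stripe) could be handled symmetrically, by checking for edges not present in the reference set.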
The conveyor control system may comprise the processor unit or the processor unit may be an independent unit or the processor unit may form part of the integrated unit. Alternatively, the integrated unit may further comprise the conveyor control system.
The integrated unit may comprise a lens system provided in front of the optical sensor unit so as to facilitate regulation of the visual field of the sensor unit and to ensure that the objects to be detected are in the plane of focus of the array of photo sensors.
The integrated unit may further comprise a light source for illumination of the visual field. This light source may be a combination of any number of light sources positioned at the integrated unit, such as halogen spots or light emitting diodes (LEDs), such as laser diodes. A lens system may be provided in front of the light sources. Furthermore, the light sources may be comprised in the optical sensor unit, using the same optical lens system as the one provided in front of the optical sensor unit to illuminate the visual field.
Further, the optical sensor system may control the light source so that the sensor unit sampling frequency is adjusted to the frequency of the light source. The intensity of light is integrated during an integration time which may be at most 1 ms, such as at most 50 μs, such as at most 20 μs, preferably at most 5 μs, such as at most 2 μs, such as at most 0.5 μs.
As representative examples, the photo sensor array of the optical sensor units may, e.g., comprise 32-1024 photo sensors, such as 64-512 or 128-256. In preferred embodiments, 64, 128, 256 or 512 photo sensors are provided.
This optical sensor may for example comprise a number of photo sensors (pixels) which measure incident light over an integration/exposure time and each photo sensor may generate an analog output voltage. Thus, the output from each photo sensor is proportional to the exposure time multiplied by the intensity of light incident at the photo sensor. The
analog output voltage from each pixel is provided to an analog-to-digital converter and the converted digital signal is an 8-bit number between 0 and 255, corresponding to a 256 grey-scale resolution.
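The analog-to-digital conversion described above can be modelled as follows. The full-scale value and rounding behaviour are assumptions for illustration; the essential point is that the pixel voltage is proportional to intensity times exposure time and is quantised to an 8-bit grey level.

```python
def pixel_to_grey(intensity, exposure_s, full_scale, bits=8):
    """Quantise the analog pixel voltage (proportional to intensity times
    exposure time) to an n-bit grey level, clamped to the converter range."""
    levels = (1 << bits) - 1          # 255 for an 8-bit converter
    analog = intensity * exposure_s
    code = int(round(analog / full_scale * levels))
    return max(0, min(levels, code))
```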
According to a second aspect of the invention, at least some of the above objects are accomplished by an optical sensor system for determining the geometry and/or angular position of a moving object, said system preferably comprising:
at least two optical sensor units each comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
means for moving the object in a travelling direction relative to the optical sensor units,
a processor unit comprising:
signal receiving means for receiving the image information from each of the at least two optical sensor units,
means for obtaining information as to the position of the moving means relative to the sensor units,
the processor unit being adapted to process the image information and the moving means position information together with information concerning spatial position of each of the at least two optical sensor units, in a sectional plane corresponding to a moving means position or a sequence of sectional planes corresponding to a sequence of moving means positions so as to generate an output signal representing the geometry and/or the angular position of the object.
Thus, the second aspect of the present invention provides an optical sensor system which is both easy and cheap to implement as it requires only two optical sensor units while allowing for a relatively high resolution of both two- and three-dimensional images and/or geometry measurements of moving objects. The system according to the first and second aspect of the invention has the further advantage that no sets of photo detectors/photo diodes are required, thereby increasing both the obtainable accuracy/resolution and the robustness of the system in relation to prior art systems comprising conventional photo detectors/photo diodes. Moreover, the system according to the second aspect of the present invention is capable of performing two- and three-dimensional measurements and determining two- and three-dimensional images of objects of substantially any shape.
According to a third aspect of the invention, the above-mentioned and other objects are accomplished by a method of determining the geometry and/or angular position of a moving object, the method comprising:
providing at least two optical sensor units, each comprising
an array of photo sensors sensing intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
and signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
moving the object in a travelling direction,
obtaining information as to the position of the moving means relative to the sensor units,
transmitting image information representing the intensity of light at pixels of the photo sensor array, to a processor unit,
receiving the image information at the processor unit,
and processing the image information and the moving means position information together with information concerning the spatial position information of each of the at least two optical sensor units in a sectional plane or a sequence of sectional planes of the object so as to generate an output signal representing the geometry and/or the angular position of the object.
By providing at least two optical sensor units in the system, or by using at least two optical sensor units in the method, a determination of the area occupied by the object is obtained which is much more accurate than the area determination obtainable with a single optical sensor unit.
The optical sensor units may be positioned above a conveyor. In such case, the heights above the conveyor and the angular positions of the optical sensor units which define the visual field of the optical sensor constitute the spatial position information. The spatial position information determines the visual field of each optical sensor which is the entire expanse of space visible from the optical sensor unit at the given height. To obtain the best results for a moving object, the optical sensor units may be positioned so that the visual field of each optical sensor unit encompasses the entire width of the path in which the object may be moved. For example, if the object is moved along a conveyor, the visual field of each optical sensor unit should preferably encompass the width of the conveyor belt.
The spatial position information may be electronically transmitted from each of the optical sensor units to the processor unit, or the information may be entered directly to the processor unit, for example via a keyboard connected to the processor unit.
The intensity of light is integrated during an integration time which may be at most 1 ms, such as at most 50 μs, such as at most 20 μs, preferably at most 5 μs, such as at most 2 μs, such as at most 0.5 μs.
As representative examples, the photo sensor array of the optical sensor units may, e.g., comprise 32-1024 photo sensors, such as 64-512 or 128-256. In preferred embodiments, 64, 128, 256 or 512 photo sensors are provided.
The system according to the first and second aspect of the present invention may comprise a comparator means which is adapted to compare the image information with stored
reference information so as to distinguish object image information from background information and to provide object image information to the processor unit.
The comparator means may comprise a microcontroller adapted to compare the image information with stored reference information representing information as to the intensity of the signals received from the background of the object when no object is present, i.e. usually the surface of the conveyor.
The reference information may be stored in an electronic memory, such as an EEPROM. Different sets of reference information stored in the memory may, e.g., contain information as to light intensity variations corresponding to signals received when there is no object present in the visual field of the optical sensor units. The electronic memory may be comprised in either the optical sensor unit, the comparator means or the processor unit.
Each optical sensor unit may comprise comparator means, so as to facilitate that only object image information is transferred to the processor unit. By integrating a comparator means in each optical sensor unit, it is possible to transfer only the number of the specific pixel or photo sensor of the array of photo sensors between which the object is visible; thus, only information about the width and the sideways position of the object is transmitted from each of the optical sensor units to the processor unit.
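A comparator of this kind can be sketched as below, reducing a full scan line to just the two edge pixel numbers between which the object is visible. The 8-bit grey levels and the tolerance value are assumptions for illustration.

```python
def object_extent(scan, reference, tolerance=10):
    """Compare a scan line against the stored background reference and
    return (first, last) pixel indices where an object is visible,
    or None when the line matches the background."""
    hits = [i for i, (s, r) in enumerate(zip(scan, reference))
            if abs(s - r) > tolerance]
    if not hits:
        return None
    return hits[0], hits[-1]
```

Transmitting only this pair of indices, rather than the full line, gives the reduction in transferred data described above: the width and sideways position of the object follow directly from the two pixel numbers.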
The comparator means may also be comprised in the processor unit, so that substantially all image information is transferred from the optical sensor units to the comparator means of the processor unit.
According to the second aspect of the invention, the position of the object may be acquired from a pulse emitting sensor connected to the moving means. The pulse emitting sensor may be adapted to provide a pulse whenever the moving means has moved the object a certain, predetermined distance, such as a distance between 1 and 100 mm, such as a distance between 1 and 50 mm, such as a distance between 1 and 20 mm, preferably a distance between 1 and 10 mm, and more preferably a distance between 1 and 6 mm, such as, e.g., 1, 2 or 5 mm.
The processor unit is preferably adapted to process the image information and the moving means position information together with the information concerning the spatial position of
the optical sensor units by means of computational geometry for substantially each of the sectional planes so as to determine the geometry and/or the angular position of the object. The computational geometry principle used in this connection is described further below.
The object moving in a system according to any aspects of the present invention may be an object chosen from the group of: parcels, letters, luggage, parcel post packets, totes, spare parts, newspapers, magazines, pharmaceuticals, articles of food, videotapes, magnetic tape cassettes, compact disks, floppy disks, and all other conveyable items, including all items deliverable by mail.
In the system according to any aspects of the invention the moving means may be a conveyor moving the object, such as a conveyor belt, a tray or tilt-tray conveyor, a cross- belt conveyor or any other kind of conveyor. In particular, the conveyor may be a feed conveyor for a receiving conveyor, the receiving conveyor being, e.g., part of a sorting or handling system.
In a conveyor system, at least part of the upper surface of the conveyor may have a pattern which facilitates distinguishing of the belt from the moving object. The surface may, e.g., be provided with a pattern having characteristic light/dark nuances, such as a stripe or check pattern, so that the objects are easily distinguished from the background signals of the conveyor. Thus, if the belt surface characteristic is known and well-defined, the comparator means may effectively distinguish the object image information from the reference information, and thus the object image information may be determined relatively accurately.
The optical sensor system according to any aspects of the present invention may be used with the standard light provided by the surroundings of the conveyor, e.g., daylight or artificial light. Preferably, there is provided an external light source, such as an electric bulb light source, a halogen bulb, a halide, such as a metal halide, such as an HQI, a neon tube, an infrared light source and/or a fluorescent light source for providing illumination of the moving object and/or the background of the object. This external light source may shine on the surface of the conveyor and/or on the moving object, the image of which surface is to be focused at the optical sensor. Alternatively, the light source may be positioned underneath the conveyor, the light source being able to illuminate the conveyor from below, thus providing a stronger contrast in the image and thereby improving the distinguishability between object image information and background information. Alternatively, the
background in the visual field of the sensor units may be a surface having light-reflecting properties which do not change over time, or the background may be a reflective surface, a fluorescent surface or a luminous surface. Furthermore, the light source may be positioned below an intersection between a feed conveyor and a receiving conveyor.
Two or more optical sensor units may be positioned at a distance from each other along the travelling direction. In such case, the processor means may further comprise synchronization means for synchronization of the image information from each of the at least two optical sensor units and the moving means position information, so as to compensate for the difference in position along the travelling direction relative to each of the two or more optical sensor units. The invention further relates to a method for synchronization of the image information, the method comprising the steps outlined in connection with the system as described above.
The width and height of an object, and thereby the geometry of a moving object, may as mentioned above be determined by means of computational geometry. Firstly, the width of the object in a sectional plane may be determined by the steps of:
comparing the object visual fields, the object visual fields being defined as the visual fields of the individual optical sensor units which contain object image information, to extract the common area of the object visual fields for the individual optical sensor units of the system,
the width of the object in each sectional plane being determined as the difference between the outermost left and the outermost right limit of the common area.
As may be seen from above, the at least two optical sensor units do not necessarily need to be positioned in the same plane, i.e. do not need to see the object under the same angle or in the same sectional plane. The image information from each optical sensor unit may then be stored and the width of the object may be determined by cross-line calculations in the sectional plane of the cross-section of the visual fields of the at least two optical sensor units.
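Under the simplifying assumption that the width is evaluated in the plane of the conveyor surface, the cross-line calculation can be sketched as the intersection of the angular wedges in which each sensor sees the object. The sensor positions and edge angles below are hypothetical values chosen for illustration.

```python
import math

def ray_x_at(y, sensor_x, sensor_h, angle):
    """x-coordinate where a ray from a sensor at (sensor_x, sensor_h),
    tilted by `angle` from vertical, crosses the horizontal plane at height y."""
    return sensor_x + (sensor_h - y) * math.tan(angle)

def width_bound(sensors, y=0.0):
    """Upper bound on object width in a sectional plane, obtained by
    intersecting, at height y, the angular wedges in which each sensor
    sees the object. `sensors` is a list of tuples
    (sensor_x, sensor_h, left_edge_angle, right_edge_angle)."""
    left = max(ray_x_at(y, sx, sh, a_l) for sx, sh, a_l, _ in sensors)
    right = min(ray_x_at(y, sx, sh, _a_r := a_r) for sx, sh, _, a_r in sensors)
    return max(0.0, right - left)
```

The outermost left limit of the common area is the rightmost of the individual left edges, and the outermost right limit is the leftmost of the individual right edges, matching the width definition in the steps above.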
Secondly, the height is determined as described below. In order to determine the height of a moving object, at least two optical sensor units are preferably provided in the same plane. Firstly, the highest possible uppermost point of the object and the lowest possible uppermost point of the object in a sectional plane are determined by the steps of:
comparing the object visual field of the individual optical sensor units to extract the common area of the object visual fields, the limits of the common area being defined by substantially straight lines connecting points of intersection of the visual fields of the individual optical sensor units within which all of the at least two optical sensor units sense the object,
determining the highest possible uppermost point of the object as the uppermost point of intersection defining the common area, and determining the lowest possible uppermost point of the object as the second uppermost point of intersection defining the common area.
For example, if the object is moved along a conveyor, the highest possible uppermost point of the object is the highest possible height of the object reckoned from the upper surface of the conveyor and the lowest possible uppermost point is the lowest possible height of the object reckoned from the upper surface of the conveyor.
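The highest possible uppermost point can be sketched as the intersection of the inner edge rays of the two sensors' object visual fields. The sensor and ray coordinates below are hypothetical; each ray is given by the sensor position and a point on the conveyor surface.

```python
def line_intersection(p1, p2, q1, q2):
    """Intersection of the line through p1 and p2 with the line through
    q1 and q2 (2-D points, lines assumed non-parallel)."""
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    px = ((x1*y2 - y1*x2)*(x3 - x4) - (x1 - x2)*(x3*y4 - y3*x4)) / den
    py = ((x1*y2 - y1*x2)*(y3 - y4) - (y1 - y2)*(x3*y4 - y3*x4)) / den
    return px, py

def highest_possible_top(sensor_a, edge_a, sensor_b, edge_b):
    """Highest possible uppermost point of the object: the intersection
    of one edge ray from each sensor's object visual field."""
    return line_intersection(sensor_a, edge_a, sensor_b, edge_b)
```

The second uppermost point of intersection of the common area, computed the same way from the remaining pair of edge rays, gives the lowest possible uppermost point.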
To obtain a better approximation to the height of the object, a second approximated lowest uppermost point of the object in a sectional plane may be determined by:
defining a triangle, the baseline of which is a substantially horizontal line traversing the second uppermost point of intersection, and the other sides of which are the substantially straight lines of the circumference of the common area connecting the uppermost point of intersection with the baseline,
correlating the object image information of each of the at least two optical sensor units substantially at the intersection of the altitude of the triangle and the baseline of the triangle,
correlating the object image information of each of the at least two optical sensor units through a sequence of points substantially along the altitude of the triangle until the uppermost point of intersection is reached,
determining the point having the substantially best correlation among the group of points constituted by the intersection point and the points of the sequence, this point being the second approximated lowest uppermost point.
It may be assumed that the object has a shape so that from the baseline comprising the second approximated lowest uppermost point there is a maximum angle to each side of the point within which at least part of the object will occupy space. Third and fourth approximated lowest uppermost points in a sectional plane may be determined by the steps of:
defining a second approximated highest uppermost point at one side of the second approximated lowest uppermost point and a third approximated highest uppermost point at the other side of the second approximated lowest uppermost point,
defining lines between the second approximated lowest uppermost point and each of the second and third approximated highest uppermost points,
correlating the object image information of each of the at least two optical sensor units substantially at the intersection of the altitude of each of the two triangles formed by respective parts of the baseline comprising the second approximated lowest uppermost point, the respective lines between the second approximated lowest uppermost point and each of the second and third approximated highest uppermost points, and respective parts of the straight lines of the circumference of the common area connecting the uppermost point of intersection with the baseline,
correlating the object image information of each of the at least two optical sensor units at a sequence of points substantially along each altitude of each of the triangles until the uppermost limit for each of the new triangles is reached,
determining the respective points having the best correlation among the respective group of points constituted by each intersection point and the points of each sequence, these points being the third and fourth approximated lowest uppermost points.
The correlating process may be an iterative process being repeated until
a predetermined difference between the respective resulting third and fourth approximated lowest uppermost points and the respective resulting second and third approximated highest uppermost points is reached, the predetermined difference being less than a predetermined value of 5-50 mm, such as 5-30 mm, such as 10-20 mm, such as 5-20 mm, preferably such as 5-10 mm,
or until a predetermined number of iterations has been performed, such as a number of iterations between 1-1000, such as between 10-100, preferably 40-60,
or until a second predetermined difference between the difference between the respective resulting second approximated highest uppermost point and the third and fourth approximated lowest uppermost points of two adjacent iterations is reached, the second predetermined difference being 0.5-5 mm, such as 0.5-3 mm, preferably 2-3 mm.
The predetermined difference between the respective resulting approximated lowest uppermost points and the respective resulting approximated highest uppermost points may alternatively be expressed as a percentage of an expected resulting height of the moving object, such as, e.g., 0.1-10%, such as 0.5-5%.
The volume of the object may be determined by a rough height measurement provided by a row of photo sensors positioned along the height direction of the object. In combination with the above-mentioned length and width determination, a rough measurement of the volume of the object is obtained. This rough measurement of volume may be sufficient for many purposes, such as for determining the degree to which the containers wherein the objects are collected are filled.
Alternatively, the volume of the object may be determined by the steps of:
determining the circumference of the area in which the object is determined to be present for substantially each sectional plane of the sequence of sectional planes,
integrating the circumference of the area in which the object is determined to be present over the length of the object, whereby the volume is determined.
The length of the object may be obtained by methods known per se. Thus, for example, the moving means position information may be recorded the first time the image information contains object image information, and a counter starts counting until the image information no longer contains object image information. The value of the counter thereby determines the length of the object.
The integration is preferably substantially a summation of the circumference of the areas of each sectional plane of the object over which the volume is determined, the area of each sectional plane being defined as the circumference of a small volume part comprising the sectional plane, the size of the volume part being determined by the length between two adjacent sectional planes.
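The length count and volume summation can be sketched as follows, assuming a fixed pulse distance between adjacent sectional planes and per-plane cross-sectional areas already estimated from the width and height measurements. Both the flag list and the area values are illustrative inputs.

```python
def object_length(object_seen_flags, pulse_distance_mm):
    """Length from the pulse counter: the number of sectional planes in
    which object image information was present, times the plane spacing."""
    return sum(1 for seen in object_seen_flags if seen) * pulse_distance_mm

def volume_from_sections(section_areas_mm2, plane_spacing_mm):
    """Volume as the summation of per-plane cross-sectional areas, each
    extruded over the spacing between adjacent sectional planes."""
    return sum(section_areas_mm2) * plane_spacing_mm
```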
The volume of an object having a predetermined height may be determined with only a width and a length determination according to the above-mentioned methods.
According to a fourth aspect of the invention the geometry and/or angular position of an object to be transferred from a feed conveyor to a receiving conveyor may be determined; the conveyor system for transporting objects such as goods or articles comprising:
a feed conveyor for conveying objects in a first direction,
a receiving conveyor for conveying objects in a second direction,
a control system being connected to an optical sensor system as described above said control system being adapted to control the speed of the feed conveyor in response to at least the output signal from the processor unit comprised in the optical sensor system.
The sideways and angular positions of an object conveyed along the feed conveyor are usually unknown and in many cases unpredictable. However, at least the sideways position, and often also the angular position, is required in order to properly and effectively transfer the object from the feed conveyor to the receiving conveyor. In particular when the conveying direction of the feed conveyor forms an acute angle in relation to the conveying direction of the receiving conveyor, the sideways position of the object is required in order for the control system to properly control the speed of the feed conveyor in order for the
feed conveyor to transfer the object to the receiving conveyor at an appropriate position at an appropriate time, i.e. at a time and a position where the object may be accommodated at a suitable position on the receiving conveyor.
Thus, in a conveyor system according to the present invention, the sideways position of the object may be extracted from the object image information. The object image information of the at least two optical sensor units is substantially described by intensity information and photo sensor position information. The sideways position of the object is then determined, by the processor unit, from the photo sensor position information of the photo sensors sensing the object image information. For example, the position of each of the photo sensors of an array of photo sensors comprising 128 photo sensors may be expressed as a number between 0 and 127, each number characterising an angle of visual field for the corresponding photo sensor.
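The mapping from photo sensor number to a sideways position on the conveyor can be sketched as below, assuming a downward-looking sensor, a symmetric field of view and a linear angle spacing across the array; the field-of-view and mounting-height values are hypothetical.

```python
import math

def pixel_angle(pixel, n_pixels, field_of_view_rad):
    """Visual-field angle (from the sensor axis) of one photo sensor;
    pixel 0 and pixel n-1 sit at the edges of the field of view."""
    return (pixel / (n_pixels - 1) - 0.5) * field_of_view_rad

def sideways_position(pixel, n_pixels, field_of_view_rad, height_mm):
    """Sideways offset on the conveyor surface seen by a given pixel of a
    sensor mounted at height_mm, looking straight down."""
    return height_mm * math.tan(pixel_angle(pixel, n_pixels, field_of_view_rad))
```

For a 128-pixel array, each pixel number between 0 and 127 thus characterises one angle of visual field, as described above.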
According to a further aspect the present invention provides a conveyor system as described above wherein only one optical sensor unit is comprised in the optical sensor system. As it appears from the above description, a single optical sensor unit is sufficient to determine a two-dimensional image of a moving object though the accuracy of the image is considerably reduced in relation to the accuracy of a two-dimensional image determined by two or more optical sensor units. However, for some applications this reduced accuracy is sufficient.
A further aspect of the invention relates to an optical sensor unit for determining the geometry and/or angular position of an object to be transferred from a feed conveyor to a receiving conveyor in a conveyor system for transporting objects, the conveyor system comprising:
a feed conveyor for conveying objects in a first direction,
a receiving conveyor for conveying objects in a second direction, this second direction making an angle α with the first direction,
a control system being connected to an optical sensor system comprising an optical sensor unit, said control system being adapted to control the speed of the feed conveyor in response to at least an output signal from the processor unit comprised in the optical sensor system,
the optical sensor unit comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
the optical sensor unit being positioned at one side of the feed conveyor, the visual field of the optical sensor unit thereby defining a sectional plane which may be substantially parallel to the feed conveyor, and a light source being positioned at the other side of the feed conveyor, so as to form a grid of light,
a measuring zone being defined by the visual field of two or more specific photo sensors each having a visual field, a processor unit connected to the optical sensor unit determining whether there is an object present in the measuring zone,
whereby the visual field of each photo sensor defines a measuring line.
Thus, the present invention provides an improvement for prior art feed conveyor systems, such as, e.g., the feed conveyor disclosed in EP 0 305 755 and DE 37 29 081.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows schematically a measuring principle using linear array scanning for determining the width and the position of a moving object,
Fig. 2 shows an optical sensor system for determining the width and sideways position of a moving object,
Fig. 3 shows schematically a measuring principle according to an embodiment of the invention where computational geometry, such as cross-line calculation, is used for determining geometry and/or angular position of a moving object,
Fig. 4 shows a functional diagram of an optical sensor unit according to the present invention,
Fig. 5 shows a set-up of an interface module connecting a number of optical sensor units to the processor unit,
Figs. 6-9 show diagrammatically a method of determining the width of a moving object,
Figs. 10-19 show diagrammatically a method of determining the height of a moving object,
Figs. 20 and 21 show a conveyor or sorter system, wherein an optical sensor system for determining the geometry and/or angular position of a moving object, having one and two optical sensor units, respectively, is provided,
Fig. 22 shows a conveyor or sorter system wherein measuring lines are used for determining the position of an object.
Fig. 23 shows a conveyor or sorter system, wherein an optical sensor system for detecting defects beneath the conveyor unit or for monitoring sorting means or operations on the conveyor unit is provided.
Fig. 24 shows a conveyor or sorter system, wherein a single optical sensor for determining the sideways position of an object on a feed conveyor is provided.
Fig. 25 shows a conveyor or sorter system, wherein an optical sensor system for identifying the conveyor units by reading a barcode on the unit is provided.
Fig. 26 shows a conveyor or sorter system, wherein an optical sensor system for determining the speed of the conveyor track by reading barcodes is provided.
Fig. 27 shows an optical sensor unit, wherein a light source is integrated, such that both the light source and the optical sensor are provided in the same unit.
DETAILED DESCRIPTION OF THE DRAWINGS
Fig. 1 shows schematically a prior art system for linear array scanning of a moving object 1 on e.g. a conveyor belt. The system shown comprises eight photo diodes 2 placed above the conveyor belt and eight corresponding photo detectors 3 placed in slits (not shown) in the belt 4, a photo diode and a corresponding photo detector forming a photo sensor set, and the eight photo sensor sets form a grid of light 5. When the object 1 is moved into the area covered by the grid 5, the grid 5 is broken, and from the detection of which photo sensor sets have been obscured and when they have been obscured, it is possible to determine a width of the object 1, and also a length of the object 1 if the speed of the moving object is provided. It is also possible to determine information as to the lateral placement of the object 1 on the conveyor belt 4.
As mentioned in the section "Background of the invention" the accuracy of the measurement is limited by the distance between two adjacent photo sensor sets. This distance is limited by the fact that a slit for each photo detector must be provided in the conveyor belt thus normally limiting the distance between two adjacent photo sensor sets to 40-50 mm and thus limiting the overall accuracy of the measurement to 40-50 mm.
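The prior-art grid measurement described above can be sketched as follows; the function name and the boolean representation of the obscured sensor sets are illustrative, and the 40-50 mm spacing bounds the achievable accuracy:

```python
def grid_width_estimate(obscured, spacing_mm=45.0):
    """Estimate object width from which photo sensor sets are obscured.

    `obscured` holds one boolean per photo sensor set across the belt;
    `spacing_mm` is the distance between adjacent sets (40-50 mm in the
    prior-art system). The result is only accurate to within one spacing
    on each side of the object.
    """
    hit = [i for i, o in enumerate(obscured) if o]
    if not hit:
        return 0.0
    return (hit[-1] - hit[0] + 1) * spacing_mm
```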
Fig. 2a shows a system in which the width and sideways position of the object 1 can be determined by use of a single optical sensor unit 20. Above e.g. a conveyor belt 4 the optical sensor unit 20 is positioned in such a way that the visual field of the optical sensor unit 20 encompasses at least part of the width of the path in which the object 1 is moved. The visual field of the optical sensor unit 20 is illustrated by the solid lines 22, 23. It is preferred that the visual field of the optical sensor unit 20 encompasses the full width of the path.
The visual field of the optical sensor unit in which the object is visible is illustrated by the dashed lines 26, 27. The cross-sections 30 and 31 of the dashed lines and the conveyor belt 4 provide information about the maximum width of the object in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor unit at the time
of sensing. As can be seen from the figure, the width determination of the object is not very accurate and the higher the object, the more inaccurate is the width determined. Furthermore, the system is not capable of providing any height determination of the object.
In Fig. 2b the position of the optical sensor unit 20 mounted at a support 240 is shown.
In Fig. 3a an optical sensor system according to an embodiment of the present invention is shown schematically. The width and sideways position of the object 1 can be determined by means of computational geometry, such as cross-line calculation. Above e.g. a conveyor belt 4 two optical sensor units 20, 21 are positioned in such a way that the visual field of each optical sensor unit encompasses at least part of the width of the path in which the object 1 is moved. The visual field of the optical sensor unit 20 is illustrated by the solid lines 22, 23 and the visual field of the optical sensor unit 21 is illustrated by the solid lines 24, 25. It is preferred that the visual field of each optical sensor unit encompasses the full width of the path.
The visual field of each of the two optical sensor units, in which the object is visible, is illustrated by the dashed lines 26, 27, 28 and 29. The cross-sections 30, 31, and 32 of the dashed lines provide, in the embodiment shown, information about the maximum height and the maximum width of the object, respectively, in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor units at the time of sensing.
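The cross-line calculation mentioned above amounts to intersecting two sight lines in the sectional plane. A minimal sketch, with a parameterisation chosen purely for illustration (the description does not prescribe one):

```python
import math

def ray_intersection(p1, angle1, p2, angle2):
    """Intersect two sight lines ("cross-line calculation").

    p1, p2 are (x, y) positions of two optical sensor units in the
    sectional plane (x sideways, y height above the belt); angle1 and
    angle2 are the directions of the sight lines, in radians, measured
    from the positive x axis. Returns the (x, y) intersection point.
    """
    d1 = (math.cos(angle1), math.sin(angle1))
    d2 = (math.cos(angle2), math.sin(angle2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("sight lines are parallel")
    # Solve p1 + t*d1 = p2 + s*d2 for t.
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

For example, two sensors mounted at (0, 2) and (2, 2) that both sight the top of an object at (1, 1) report downward angles of -45° and -135°, and the intersection recovers that point.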
In Fig. 3b the positions of the optical sensor units 20 and 21, mounted at a support 240, are shown.
To obtain a more accurate determination of the width and the height, more optical sensor units, such as at least 3 optical sensor units, such as at least 4, 5, 6 or 7 optical sensor units may be provided. The method of the width and the height determination is described further below in a system where three optical sensor units are provided.
Fig. 4 shows a functional diagram of an exemplification of an optical sensor unit comprising an optical sensor and a micro controller.
In a preferred embodiment the optical sensor 33 is a CMOS-based opto sensor, such as a Texas Instruments Line Imager, such as a TSL1401. This optical sensor comprises 128
pixels which measure incident light over an integration/exposure time, and each pixel generates an analog output voltage of 0-2 V. Thus, the output from each pixel is proportional to the exposure time multiplied by the intensity of light incident at the pixel. The analog output voltage from each pixel is provided to an analog-to-digital converter 34, such as a TLC5510. The digital signal is an 8-bit number between 0 and 255, corresponding to a 256 grey-scale resolution. This digital signal is submitted to comparator means comprising a micro controller unit 35, such as a Microchip micro controller, such as a PIC16C77.
A standard lens system is provided at the optical sensor 33 for focusing an image of the conveyor and/or moving object at the optical sensor.
An EEPROM 36 is connected to the micro controller 35 for storing reference image information. The reference image information, which likewise comprises 8-bit signals for each of the 128 pixels, can be stored at a time prior to the actual measurements and read into the micro controller unit at the time of measuring. The reference image information contains information on the intensity of the signals normally received from the conveyor belt when no object is present; different sets of reference image information stored in the EEPROM may, e.g., contain information on the largest and smallest intensity, respectively, of signals received when there is no object on the conveyor.
In the micro controller unit 35, the signals obtained from the optical sensor 33 are compared to the reference images, pixel by pixel. A pixel of the image information having an intensity which differs from the intensity of the corresponding reference image pixel is a pixel sensing part of the object. The output of the micro controller unit 35 is hereby limited to the information of the intensity sensed at each pixel sensing part of the object, the "object image information". Of course the output may further be limited to information on which pixels of the optical sensor unit sense part of the object.
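The pixel-by-pixel comparison against the stored reference information might be sketched as follows, assuming the reference takes the form of per-pixel minimum and maximum intensities as described above (the function name and data layout are illustrative):

```python
def extract_object_pixels(image, ref_min, ref_max):
    """Compare a scan line against stored reference image information.

    `image`, `ref_min` and `ref_max` are sequences of 8-bit intensities
    (0-255), one entry per pixel; `ref_min`/`ref_max` hold the smallest
    and largest intensities normally received from the empty conveyor.
    Returns the "object image information": (pixel index, intensity)
    pairs for every pixel whose intensity falls outside the reference band.
    """
    return [(i, v)
            for i, (v, lo, hi) in enumerate(zip(image, ref_min, ref_max))
            if v < lo or v > hi]
```

Transmitting only these pairs, rather than all 128 pixel values, is what limits the traffic to the processor unit to object-relevant information.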
The output port of the micro controller is a standard RS485 serial port, so that information from the optical sensor unit can be provided to a processor unit.
In the above example, the micro controller was comprised in the optical sensor unit. This limits the information to be transferred from the optical sensor unit to the processor unit, which may be positioned at a distance from the optical sensor. Typically, the distance is 1-10 meters, such as 5-10 meters. It is therefore advantageous to transfer only object relevant
information to the processor unit. Of course the micro controller may be an independent unit transferring information to the processor unit, or the micro controller may form part of the processor unit.
In Fig. 5, a set-up of an interface module connecting four optical sensor units according to Fig. 4 to a processor unit 155 is shown. Preferably, each optical sensor unit 151, 152, 153, 154 provides object image information to the processor unit. In the processor unit the object image information from each of the optical sensor units is processed together with information about the position of the moving means at the time of sensing and together with information concerning the spatial position of the optical sensor units, i.e. the angle at which the visual field of the optical sensor unit encompasses the path of the object and the distance from the optical sensor unit to the conveyor belt. This information is processed in the processor unit according to the procedures which are shown in the following figures.
In Fig. 6, a conveyor belt with a travelling direction 41 is shown. The moving means of the conveyor belt is defined as the moving part of the conveyor. Three optical sensor units 42, 43, and 44 are positioned above the conveyor on a support which is positioned parallel to the surface of the conveyor belt and at a right angle to the travelling direction of the conveyor belt. In this example, the visual fields of all three optical sensor units are identical at the level of the conveyor belt 45 and extend over the entire width of the conveyor belt 46. Thus, at the time of sensing, the optical sensor units 42, 43, and 44 sense the object in the sectional plane of the object coinciding with the plane of the visual field of the optical sensor units and the sensed signals are recorded. At the time of sensing, the position of the moving means is also recorded. All further processing of the image information and the position information in the specific sectional plane is made on the basis of these recordings.
The optical sensor system of Fig. 6 is the system which will be considered in more detail in the following figures.
Fig. 7 shows the visual fields of the optical sensor units 42, 43, and 44 of Fig. 6 at a right angle to the plane defined by the visual fields of the optical sensor units 42, 43, and 44. In this specific embodiment the height 51 of each of the optical sensor units 42, 43, and 44 over the conveyor belt is the same for all three optical sensor units 42, 43, and 44. In the specific embodiment illustrated a maximum height of the objects to be measured is defined, as indicated by the line 52, and the line 53 indicates the width of the conveyor belt.
Fig. 8 shows the visual field for each of the optical sensor units in which the object is visible from the optical sensor unit. These visual fields have been found by comparing the image information to reference information so as to distinguish the visual fields in which a sensing different from the background is recorded. In Fig. 8 this "object visual field" from each optical sensor unit is illustrated by solid lines 61. The object visual fields of the optical sensor units 42, 43, and 44 are defined by angles α, β, and γ, respectively. The intersections 62-68 of the lines 61 define the area in which the object is visible from all the optical sensor units 42, 43, and 44.
In Fig. 9, the width of the object in the particular sectional plane is determined by the largest width of the common area of the object visual fields of the optical sensor units 42, 43, and 44, and the outer limits of the object in the specific sectional plane are found by drawing straight lines connecting the points 62-68. As mentioned above, the distance between the outermost right and the outermost left point defines the width 69 of the object in the specific sectional plane.
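The width determination from the intersection points 62-68 reduces to taking the horizontal extent of the common area. A minimal sketch, with points represented as (x, y) tuples in the sectional plane:

```python
def object_width(intersections):
    """Width of the object in one sectional plane.

    `intersections` is the list of (x, y) intersection points (62-68 in
    Fig. 9) bounding the common area of the object visual fields; the
    width is the distance between the outermost left and outermost
    right points.
    """
    xs = [x for x, _ in intersections]
    return max(xs) - min(xs)
```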
It is seen from Figs. 8 and 9 that the more optical sensor units provided for each sectional plane the more intersection points are obtained and thus the more accurate is the determination of the width of the object in the specific sectional plane.
In Fig. 10, an area limited by the highest possible uppermost point 111 of the object and the lowest possible uppermost point 112 of the object is defined. In the embodiment shown in Fig. 10, wherein the object is positioned on the conveyor belt, this corresponds to the maximum and the minimum possible heights of the object. The maximum possible height of the object in the specific sectional plane is the uppermost point, 62, of the intersection of the lines 61, and the minimum possible height of the object is the second uppermost point, 66, of the intersection of the lines 61. Thus, in the embodiment shown, the maximum possible height is defined by the point 62 and the minimum possible height is defined by a line parallel to the conveyor belt and traversing the point 66. As shown in Fig. 11 this area is a triangle and a second approximated lowest uppermost point of the object in the specific sectional plane, in the following called the "second minimum possible height", will have to be found in the hatched area 90.
In Fig. 12, the altitude line 101 of the triangle is shown and the second minimum possible height of the object is found at a point along the line 101, starting from the intersection 103 of the altitude line 101 and the baseline 102 of the triangle 90.
The correlation between the image information from each of the optical sensor units 42, 43, and 44 is calculated, starting from the point of intersection 103, as shown in Fig. 13. The image information in this point is defined by an angle in the visual field for each optical sensor unit.
As illustrated in Fig. 14 the correlation of the image information of each of the optical sensor units 42, 43, and 44 is calculated in the point of intersection 103 and in a sequence of points along the altitude line 101, the number of calculations determining the accuracy of the second minimum possible height determination of the object. The point where the correlation between the optical sensor units 42, 43, and 44 is best, i.e. the point where the difference between the image information of the three optical sensor units 42, 43, and 44 is smallest, as explained immediately below, is the second minimum possible height 120 of the object.
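The correlation search along the altitude line can be sketched as follows. Here `intensity_at_angle` is a hypothetical callback standing in for a look-up into the recorded image information of each sensor unit; the step count plays the role of the "number of calculations determining the accuracy" mentioned above:

```python
import math

def best_correlation_height(sensors, intensity_at_angle, x, y_lo, y_hi, steps=20):
    """Search along the altitude line for the best-correlating point.

    `sensors` is a list of (sx, sy) sensor positions in the sectional
    plane; `intensity_at_angle(i, angle)` returns the intensity recorded
    by sensor i in direction `angle` (radians). For each candidate point
    (x, y) on the altitude line, the intensities the sensors report for
    that point are compared; the y where they differ least is returned.
    """
    best_y, best_diff = y_lo, float("inf")
    for k in range(steps + 1):
        y = y_lo + (y_hi - y_lo) * k / steps
        vals = [intensity_at_angle(i, math.atan2(y - sy, x - sx))
                for i, (sx, sy) in enumerate(sensors)]
        diff = max(vals) - min(vals)
        if diff < best_diff:
            best_diff, best_y = diff, y
    return best_y
```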
Fig. 15 shows a second hatched area 130, the upper limit of which is the maximum possible height 62, and the lower limit of which is a line 131 traversing the point where the second minimum possible height 120 is found.
At this point an assumption regarding the shape of the object is made. It is assumed that from the line 131 traversing the point of the second minimum possible height, there is a maximum angle to each side of the point 120 within which a part of the object can be expected to be present, and the object is not expected to extend beyond this limit. In Fig. 16, the angle 140 is assumed to be 45°. The intersection 141 of the line 143 forming an angle of 45° with the baseline 131 and the line connecting the points 62 and 63 is a second maximum possible height of the object, and the intersection 142 of the line 144 forming an angle of 45° with the baseline 131 and the line connecting the point 62 and the point 66 is a third maximum possible height of the object.
It is thus seen in Fig. 17 that the second and third approximated highest uppermost points, which in this embodiment correspond to the second 153 and third 154 maximum possible heights of the object, are the vertices of two new triangles 151 and 152, respectively, each with the baseline formed by the line 131 traversing the point 155 of the second minimum possible height of the object.
In Fig. 18 the two new correlation lines 161 and 162 corresponding to the altitudes of the triangles 151 and 152 are shown. The height of the object along the two lines 161 and 162 is then found by correlation of the image information of each of the optical sensor units 42, 43, and 44, and two new minimum possible heights of the object are found as illustrated in Figs. 12-15.
The process is an iterative process which can be terminated when the difference between the minimum possible height and the maximum possible height is less than a predetermined value of 5-50 mm, such as 5-30 mm, such as 10-20 mm, such as 5-20 mm, preferably such as 5-10 mm.
In Fig. 19 the area 171 of the object in the specific sectional plane is shown, together with the maximum width 172 and the maximum height 173.
After having calculated the height and width in a sequence of sectional planes, e.g. measuring the light intensity and processing the image information for at least every 10 cm of object passing, such as at least for every 7 cm, such as at least for every 5 cm, such as at least for every 2 cm, preferably such as at least for every 1 cm of object passing, such as at least for every 0.5 cm, the volume of an object can be calculated by combining the calculated heights and widths of the object with the information on the position of the moving means at the time of intensity measurement. Hereby, the grid formed by these calculated heights and widths and the object position information encloses the object.
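Combining the per-plane results into a volume can be sketched as follows. Treating each sectional area as a bounding rectangle (width times height) is a simplifying assumption of this sketch; the enclosed area 171 of Fig. 19 may be tighter:

```python
def object_volume(sections, slice_spacing_cm):
    """Approximate object volume from per-sectional-plane measurements.

    `sections` is a list of (width_cm, height_cm) results, one per
    sectional plane, sampled every `slice_spacing_cm` of object passing
    (the position of the moving means gives the spacing). Each slice
    contributes width * height * spacing.
    """
    return sum(w * h for w, h in sections) * slice_spacing_cm
```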
It is obvious from the above-illustrated method that there may be more than one set of optical sensor units, each set forming a different angle with the travelling direction of the object, so as to obtain width and/or height information, respectively, for different sectional planes of the object. The combination of information from each of the sectional planes provides a more true image of the object.
The system described above may, according to the fourth aspect of the present invention, be used in conveyor systems as shown in Figs. 20 and 21.
The conveyor systems of Figs. 20 and 21 comprise two conveyors: a receiving conveyor 180 conveying objects in a first direction 182, and a feed conveyor 184 conveying objects in a second direction 186 making an angle 188 with the first conveyor direction 182. Objects 190, 192 being moved at the feed conveyor 184 are transferred to the receiving conveyor 180. An object 190 is transferred from the feed conveyor 184 to the receiving conveyor 180 whenever the object 190 has a geometry so that it can be accommodated in the free space between two adjacent objects (not shown) at the receiving conveyor 180.
The geometry of the object is in Fig. 20 determined by an optical sensor system as described in Fig. 2, using one optical sensor unit 191, and in Fig. 21 determined by an optical sensor system as described in Fig. 3, using two optical sensor units 194, 196. The choice of geometry determination depends on the accuracy required in the specific embodiment. Of course also at least three, four, five, six or seven optical sensor units may be positioned above the conveyor so as to obtain a better accuracy of the geometry determination.
The geometry of the object 190 to be transferred is determined, as are the dimensions of the object in the first direction and the sideways position of the object on the feed conveyor. It is hereby possible to transfer the object 190 at the time and position where the free space between two adjacent objects at the first conveyor belt can accommodate the object 190.
Fig. 22 shows a top view of a system according to another aspect of the invention. The conveyor or sorter system in Fig. 22 is similar to the conveyor systems of Figs. 20 and 21. In the embodiment shown in Fig. 22, the geometry of the object is determined by means of an optical sensor unit a functional diagram of which is shown in Fig. 4. However, in this embodiment the optical sensor unit 220 is positioned at one side of the feed conveyor, the visual field of the optical sensor unit 222 thereby defining a sectional plane which may be substantially parallel to the feed conveyor, and a light source 224 being positioned at the other side of the feed conveyor, so as to form a grid of light. According to the specific setup of the feed conveyor relative to the receiving conveyor, a measuring zone 226 is defined by the visual field of two or more specific photo sensors each having a visual field illustrated by the lines 228 and 230. Preferably, the angle between the lines 228 and 230 is substantially identical to the angle 188 between the travelling direction of the feed conveyor and the travelling direction of the receiving conveyor. When an object is conveyed along the conveyor 184 in the travelling direction 186, the optical sensor senses when the object passes the visual field lines 228 and 230. A processor unit connected to the optical sensor unit determines whether there is an object present in the measuring zone or not, and provides two output signals, one for each photo sensor of the visual field lines 228 and 230, in response hereto.
Fig. 23 shows an embodiment of an optical sensor system according to the first aspect of the present invention. The embodiment shown comprises two optical sensors 231, 232 positioned at each side of the conveyor unit 233, the visual field of the optical sensors thereby defining a sectional plane substantially perpendicular to the conveyor unit. When the conveyor unit is moving in the travelling direction 234, the visual fields of the two sensors on each side of the unit encompass, e.g., the tilting arms 235 and/or the rollers 236 at certain intervals. A processor unit connected to the sensors may provide a signal in case a defect is detected, such a defect being, e.g., a broken or defective tilting arm. The processor unit may further be adapted to, e.g., provide a signal when a tilting arm 235 is in a tilting position, and/or when a wheel or roller 236 is missing or defective. The processor unit may further be adapted to provide a signal in case irregularities in, e.g., driving parts are detected. Thus, if the conveyor system is driven by linear induction motors requiring a certain distance between a moving part on each conveyor unit and the linear induction motors, the processor unit may provide a signal if that distance is not within a predetermined interval. In summary, the sorting system may be monitored for, e.g., breakdowns of the conveyor units, wear, erroneous position of the tilting arm and thereby erroneous sorting, etc.
Additionally, the number of objects on the tray may be determined by the sensors, when the objects pass through the visual field, so as to ensure the trays contain the desired number of objects.
One or more light sources on each side of the conveyor unit may be provided for ensuring sufficient illumination of the parts passing through the visual field of the sensors. Means for preventing interference between the two sensors may be provided. Such means may, e.g., comprise one or more plates or other light barriers. The light barriers may have a pattern which facilitates distinguishing, by means of the comparator, the tilting arm and the wheel, respectively, from the barriers.
The optical sensors may be positioned at any positions whereby the most appropriate visual field for monitoring certain parts of the conveyor system may be provided.
The conveyor system shown in Fig. 24 comprises two conveyors, a feed conveyor 243 conveying objects in a first direction 244 and a receiving conveyor 245 conveying objects in a second direction 240, as well as an optical sensor 241. Information representing the sideways position of the object 242 may be used for controlling the starting time and/or speed of the feed conveyor in order to transfer the object 242 to the receiving conveyor 245 at an appropriate position and at an appropriate time. Thus, the object 242 may be fed onto the receiving conveyor 245 with a certain delay dependent on the sideways position of the object in relation to the surface of the feeding conveyor 243, cf. also EP 0 355 705. In the application shown in Fig. 24, the optical sensor unit 241 is positioned above the feed conveyor belt and is displaced in a sideways position in relation to the centre of the conveyor belt, the visual field of the optical sensor defining a sectional plane. The optical sensor may, within its visual field, sense the bottom edge of the object 246, whereby the processor unit, connected to the sensor unit, may determine the sideways position of the object.
The conveyor unit 250 shown in Fig. 25 comprises a plate 251 positioned beneath the unit and having a barcode which can be detected by the sensor 252. When the conveyor unit is conveyed along the conveyor system in the travelling direction 253, the sensor 252 detects the barcode. A processor unit connected to the sensor may thus identify the unit. The barcode may be printed on the plate 251 or elsewhere on the conveyor unit 250, or it may be provided as grooves in the plate 251.
In an alternative embodiment of the conveyor system, the barcode may be positioned on top of the conveyor unit, whereby the barcode may be read from above by a sensor mounted on the upper side of the conveyor system.
Fig. 26 shows an embodiment comprising two optical sensors 260 and 261. Two conveyor units 262 and 263 are shown, with a barcode 264 and 265, respectively, provided on each of the conveyor units. The connection of the conveyor units in the conveyor track defines a fixed distance between the two barcodes. When reading the first barcode 264 at a first time instant, the sensor 260 sends a first signal to the processor unit. When reading the second barcode
265 at a second time instant, the sensor 261 sends a second signal to the processor unit. The processor unit then calculates the speed or the velocity of the conveyor track from the distance between the two barcodes and the time difference between the first and second time instant.
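The speed calculation from the two barcode readings reduces to the fixed distance divided by the time difference; a minimal sketch:

```python
def track_speed(distance_m, t_first_s, t_second_s):
    """Speed of the conveyor track from two barcode readings.

    `distance_m` is the fixed distance between the two barcodes, given
    by the connection of the conveyor units in the track; the two times
    are the instants at which the barcodes are read.
    """
    dt = t_second_s - t_first_s
    if dt <= 0:
        raise ValueError("second reading must come after the first")
    return distance_m / dt  # speed in m/s
```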
The optical sensor system in Fig. 26 may comprise only one single optical sensor unit, reading both barcodes. Further, the speed or the velocity of the conveyor may be determined by sensing a predetermined edge portion at each of the conveyor units 262, 263, and calculating the speed or the velocity as mentioned above.
The functions described in Figs. 23, 24, 25 and 26 may be performed by one single application such as the one shown in Fig. 26. In Fig. 26, the optical sensor system with two sensor units is capable of monitoring the conveyor system, identifying the conveyor unit, and calculating the speed or the velocity of the chain of conveyor units. Alternatively, these functions may be performed by only one single optical sensor unit.
The unit 270 shown in Fig. 27 comprises the optical means 272 and a light source 271 in the form of two halogen spots. As an alternative to halogen spots, the light source 271 may comprise an LED-type light source or laser diodes, with a lens in front if necessary.
The optical sensors may comprise adjustment means, so as to adjust the visual field of the sensors and thereby provide an adjustable visual field of the sensors. The adjustable area provides for the possibility that the sensor may sense/capture objects being supported by a conveyor unit as well as means mounted to the conveyor unit below its object supporting surface within a single passage of the conveyor unit through the visual field.

Claims

1. A first optical sensor system for incorporation in a conveyor system, the conveyor system comprising a control system for controlling operation of the conveyor system, the optical sensor system comprising:
an optical sensor unit comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which a part of the conveyor system and/or an object moves,
first signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor to a processor unit,
the processor unit comprising:
signal receiving means for receiving the image information from the optical sensor unit,
memory means for storing one or more algorithms which allow for appropriate processing of the image information by the processor unit depending on the operation of the first optical sensor system,
the processor unit being adapted to process the image information which is being captured in the visual field, so as to generate an output signal representing the image of a part of the conveyor system and/or the object,
second signal transmission means for passing said output signal to the control system of the conveyor system,
the optical sensor unit being comprised in an integrated unit, the first optical sensor system being adapted to perform at least two of the following operations:
determining the sideways position of an object being supported and/or conveyed by the conveyor system,
detecting object identification information provided or printed on the object,
capturing a sequence of at least two images, so as to determine therefrom a signal representing the velocity of a part of the conveyor system or the object,
capturing an image of a predetermined part of the conveyor system, so as to survey the condition of said predetermined part of the conveyor system,
detecting whether an object is present within the visual field defined by one or more of the photo detectors,
the optical sensor system being adapted to perform at least one of said operations at a time.
2. A first optical sensor system according to claim 1, wherein the processor unit is comprised in the integrated unit.
3. A first optical sensor system according to claim 1, wherein the conveyor system further comprises a plurality of conveyor units, the first optical sensor system further being adapted to perform the following operation:
- detect conveyor unit identification information provided or printed on at least some of the conveyor units.
4. A first optical sensor system according to any of the preceding claims, wherein the first optical sensor unit is adapted to perform at least three of said operations.
5. A first optical sensor system according to any of the preceding claims, wherein at least part of the surface of the conveyor system has a predetermined pattern, the processor unit of the first optical sensor system being adapted to process the image information, so as to distinguish the surface of the conveyor system from a part of the conveyor system and/or the object.
6. A first optical sensor system according to claim 5, wherein the predetermined pattern has a plurality of mutually contrasting areas, whereby an output signal representing an image of a part of the conveyor system and/or the object may be generated by superimposing the image information and previously stored data representing said contrasting areas.
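The superimposition of claim 6 can be sketched as a pixel-wise comparison of the captured image information against previously stored data representing the conveyor's contrasting pattern: pixels whose intensity deviates from the stored pattern are taken to belong to a part of the conveyor system and/or the object. The function name, the plain-list image representation, and the threshold value are illustrative assumptions, not details given in the patent:

```python
def segment_against_pattern(image, stored_pattern, threshold=30):
    """Return a binary mask that is True wherever the captured intensity
    deviates from the stored contrasting-pattern intensity by more than
    threshold, i.e. wherever something covers the patterned surface."""
    return [
        [abs(px - ref) > threshold for px, ref in zip(img_row, ref_row)]
        for img_row, ref_row in zip(image, stored_pattern)
    ]

# Example: a 1x6 line of photo sensors over a striped (contrasting) belt.
pattern = [[0, 255, 0, 255, 0, 255]]     # stored data for the belt pattern
captured = [[0, 255, 128, 130, 0, 255]]  # an object covers sensors 2 and 3
mask = segment_against_pattern(captured, pattern)
```

In this example the mask is True exactly at the two sensors whose view of the stored pattern is obscured, which is the object image information that the processor unit would go on to process.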
7. A first optical sensor system according to any of the preceding claims, wherein the processor unit is comprised in the control system for controlling operation of the conveyor system.
8. A first optical sensor system according to any of the preceding claims, wherein the integrated unit further comprises a light source for illumination of the visual field.
9. A first optical sensor system according to any of the preceding claims, wherein the array of photo sensors comprises at least 512 photo sensors.
10. A conveyor system comprising a plurality of conveyor units for conveying objects, a control system for controlling operation of the conveyor system, and a first optical sensor system, the optical sensor system comprising:
an optical sensor unit comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which a part of the conveyor system and/or an object moves,
first signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor to a processor unit,
the processor unit comprising:
signal receiving means for receiving the image information from the optical sensor unit,
memory means for storing one or more algorithms which allow for appropriate processing of the image information by the processor unit depending on the operation of the first optical sensor system,
the processor unit being adapted to process the image information which is being captured in the visual field, so as to generate an output signal representing the image of a part of the conveyor system and/or the object,
second signal transmission means for passing said output signal to the control system of the conveyor system,
the optical sensor unit being comprised in an integrated unit, the first optical sensor system being adapted to perform at least two of the following operations:
- determining the sideways position of an object being supported and/or conveyed by the conveyor system,
detecting object identification information provided or printed on the object,
- capturing a sequence of at least two images, so as to determine therefrom a signal representing the velocity of a part of the conveyor system or the object,
capturing an image of a predetermined part of the conveyor system, so as to survey the condition of said predetermined part of the conveyor system,
detecting whether an object is present within the visual field defined by one or more of the photo detectors,
the optical sensor system being adapted to perform at least one of said operations at a time.
11. An optical sensor system for determining the geometry and/or angular position of a moving object, said system comprising:
at least two optical sensor units each comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
means for moving the object in a travelling direction relative to the optical sensor units,
a processor unit comprising:
signal receiving means for receiving the image information from each of the at least two optical sensor units,
means for obtaining information as to the position of the moving means relative to the sensor units,
the processor unit being adapted to process the image information and the moving means position information together with information concerning spatial position of each of the at least two optical sensor units, in a sectional plane corresponding to a moving means position or a sequence of sectional planes corresponding to a sequence of moving means positions so as to generate an output signal representing the geometry and/or the angular position of the object.
12. A system according to claim 11, which comprises a comparator means which is adapted to compare the image information with stored reference information to distinguish object image information from background information and provide object image information for processing by the processor unit.
13. A system according to claim 11 or 12, wherein moving means position information is acquired from a pulse emitting sensor connected to the moving means.
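The pulse emitting sensor of claim 13 behaves like an incremental encoder: counting its pulses recovers the position of the moving means, and hence the position of each sectional plane. A minimal sketch, assuming a fixed belt displacement per pulse; the function names and the calibration constant are illustrative and not taken from the patent:

```python
# Illustrative calibration: belt travel per encoder pulse (an assumed value).
MM_PER_PULSE = 2.5

def belt_position_mm(pulse_count):
    """Position of the moving means along the travelling direction,
    recovered by counting pulses from the pulse emitting sensor."""
    return pulse_count * MM_PER_PULSE

def slice_positions(pulse_counts):
    """Map the pulse count captured with each image line to a sectional-plane
    position, giving the sequence of sectional planes of claim 11."""
    return [belt_position_mm(n) for n in pulse_counts]
```

For example, image lines captured at pulse counts 0, 4, and 8 would correspond to sectional planes at 0 mm, 10 mm, and 20 mm of belt travel under this assumed calibration.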
14. A system according to any of claims 11-13, wherein the processor unit is adapted to process the image information and the moving means position information together with the information concerning the spatial position of the optical sensor units by means of computational geometry for substantially each of the sectional planes to obtain the geometry and/or the angular position of the object.
15. A system according to any of the claims 11-14, wherein the moving means is a moving means of a conveyor.
16. A system according to claim 15, wherein the moving means is a conveyor belt.
17. A system according to claim 16, wherein at least a part of the upper surface of the belt has a pattern which facilitates distinguishing, by means of the comparator means, the belt from the moving object.
18. A system according to any of the claims 11-17, wherein the moving object is an object chosen from the group consisting of parcels, letters, luggage, parcel post packets, totes, spare parts, newspapers, magazines, pharmaceuticals, articles of food, videotapes, magnetic tape cassettes, compact disks, floppy disks, and all other conveyable items, such as items deliverable by post.
19. A system according to any of claims 11-18, wherein the at least two optical sensor units are positioned at a distance from each other along the travelling direction and in which the processor means further comprises synchronization means for synchronization of the image information from each of the at least two optical sensor units and the moving means position information to compensate for the difference in position along the travelling direction relative to the at least two optical sensor units.
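When the two sensor units of claim 19 sit a known distance apart along the travelling direction, the slices they record at the same pulse count image different sectional planes; the synchronization means re-indexes one sensor's slice stream by the offset expressed in pulses so that both streams refer to common sectional planes. The following is a hedged sketch of that re-indexing; the constant-pulse-spacing assumption and all names are illustrative, not taken from the patent:

```python
def synchronize(slices_a, slices_b, offset_pulses):
    """Pair each slice from sensor A with the slice from sensor B that
    imaged the same sectional plane, sensor B being mounted
    offset_pulses downstream of sensor A along the travelling direction."""
    paired = []
    for i, slice_a in enumerate(slices_a):
        j = i + offset_pulses  # B sees plane i only offset_pulses later
        if 0 <= j < len(slices_b):
            paired.append((slice_a, slices_b[j]))
    return paired
```

With an offset of two pulses, slice 0 of sensor A is paired with slice 2 of sensor B, slice 1 with slice 3, and so on; slices for which the other sensor has not yet (or no longer) imaged the plane are dropped.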
20. A method of determining geometry and/or angular position of a moving object, the method comprising:
providing at least two optical sensor units, each comprising an array of photo sensors sensing intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
and signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
moving the object in a travelling direction,
obtaining information of the position of the moving means relative to the sensor units,
transmitting image information representing the intensity of light at pixels of the photo sensor array, to a processor unit,
receiving the image information at the processor unit, and
processing the image information and the moving means position information together with information concerning the spatial position of each of the at least two optical sensor units in a sectional plane or a sequence of sectional planes of the object so as to generate an output signal representing the geometry and/or the angular position of the object.
21. A method according to claim 20, wherein the image information transmitted from the arrays of photo sensors is compared with stored reference information to distinguish object image information from background information and provide object image information, the object image information being the image information processed by the processor unit.
22. A method according to claim 20 or 21, wherein the moving means position information is transmitted from a pulse emitting sensor connected to the moving means.
23. A method according to any of claims 20-22, wherein the image information and the moving means position information together with information concerning the spatial position of each of the at least two optical sensor units is processed in the processor unit
by means of computational geometry for substantially each sectional plane in the sequence of sectional planes to determine the geometry and/or the angular position of the object.
24. A method according to any of claims 20-23, wherein the width of the object in a sectional plane is determined by the steps of:
comparing the object visual fields, the object visual fields being defined as the visual fields of the individual optical sensor units which contain object image information, to extract the common area of the object visual fields for the individual optical sensor units of the system,
the width of the object in each sectional plane being determined as the difference between the outermost left and the outermost right limit of the common area.
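The width determination of claim 24 can be sketched as follows. Here each sensor's "object visual field" in a sectional plane is reduced to a horizontal interval (leftmost, rightmost) in which that sensor sees the object; the common area is the intersection of these intervals, and the width is the distance between its outermost left and outermost right limits. The interval representation is an assumption made for this sketch, not a detail given in the claim:

```python
def object_width(object_fields):
    """object_fields: one (left, right) limit pair per optical sensor unit,
    delimiting where that sensor contains object image information in the
    sectional plane. Returns the width of the common area (0.0 if empty)."""
    left = max(lo for lo, _ in object_fields)   # outermost left limit
    right = min(hi for _, hi in object_fields)  # outermost right limit
    return max(0.0, right - left)
```

For instance, two sensors reporting object intervals (10.0, 50.0) and (15.0, 60.0) yield a common area from 15.0 to 50.0 and hence a width of 35.0; disjoint intervals yield a width of 0.0.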
25. A method according to claim 24, wherein the height of the object is determined by one or by a sequence of photo diodes positioned at predetermined heights, so as to provide a measure of the height of the object.
26. A method according to any of claims 20-24, wherein the at least two optical sensor units are provided in the same sectional plane, and the highest possible uppermost point of the object and the lowest possible uppermost point of the object in a sectional plane are determined by the steps of:
comparing the object visual field of the individual optical sensor units to extract the common area of the object visual fields, the limits of the common area being defined by substantially straight lines connecting points of intersection of the visual fields of the individual optical sensor units within which all of the at least two optical sensor units sense the object,
determining the highest possible uppermost point of the object as the uppermost point of intersection defining the common area, and determining the lowest possible uppermost point of the object as the second uppermost point of intersection defining the common area.
27. A method according to claim 26, wherein a second approximated lowest uppermost point of the object in a sectional plane is determined by:
defining a triangle, the baseline of which is a substantially horizontal line traversing the second uppermost point of intersection, and the other sides of which are the substantially straight lines of the circumference of the common area connecting the uppermost point of intersection with the baseline,
correlating the object image information of each of the at least two optical sensor units substantially at the intersection of the altitude of the triangle and the baseline of the triangle,
correlating the object image information of each of the at least two optical sensor units through a sequence of points substantially along the altitude of the triangle until the uppermost point of intersection is reached,
determining the point having the substantially best correlation among the group of points constituted by the intersection point and the points of the sequence, this point being the second approximated lowest uppermost point.
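The correlation step of claim 27 can be illustrated in very simplified form: candidate points along the altitude of the triangle are scored by how well the image information from the two sensor units agrees at each candidate, and the best-scoring candidate becomes the second approximated lowest uppermost point. The correlation measure used here (negative absolute intensity difference between hypothetical per-height values) and all names are assumptions for illustration only, not the patent's own formulation:

```python
def best_correlated_height(candidates, patch_a, patch_b):
    """candidates: heights to test along the triangle's altitude.
    patch_a[h], patch_b[h]: the intensity each sensor unit's image
    information yields when back-projected to candidate height h.
    Returns the candidate height at which the two sensors agree best."""
    def score(h):
        # Higher score = better agreement between the two sensors.
        return -abs(patch_a[h] - patch_b[h])
    return max(candidates, key=score)
```

If the two sensors' back-projected intensities agree closely only at one candidate height, that height is selected as the approximated uppermost point of the object in the sectional plane.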
28. A method according to claim 27, wherein,
assuming that the object has a shape so that from the baseline comprising the second approximated lowest uppermost point there is a maximum angle to each side of the point within which at least part of the object will occupy space,
third and fourth approximated lowest uppermost points in a sectional plane are determined by the steps of:
defining a second approximated highest uppermost point at one side of the second approximated lowest uppermost point and a third approximated highest uppermost point at the other side of the second approximated lowest uppermost point,
defining lines between the second approximated lowest uppermost point and each of the second and third approximated highest uppermost points,
correlating the object image information of each of the at least two optical sensor units substantially at the intersection of the altitude of each of the two triangles formed by respective parts of the baseline comprising the second approximated lowest uppermost point, the respective lines between the second approximated lowest uppermost point and each of the second and third approximated highest uppermost points, and respective parts of the straight lines of the circumference of the common area connecting the uppermost point of intersection with the baseline,
correlating the object image information of each of the at least two optical sensor units at a sequence of points substantially along each altitude of each of the triangles until the uppermost limit for each of the new triangles is reached,
determining the respective points having the best correlation among the respective group of points constituted by each intersection point and the points of each sequence, these points being the third and fourth approximated lowest uppermost points.
29. A method according to claim 28, wherein the correlating process is an iterative process being repeated until
a first predetermined difference between the respective resulting second approximated highest uppermost point and third and fourth approximated lowest uppermost points is reached, or
a predetermined number of iterations has been performed, or
a second predetermined difference between
the difference between the respective resulting second approximated highest uppermost point and third and fourth approximated lowest uppermost points
of two adjacent iterations is reached.
30. A method according to any of claims 26-29, wherein the volume of the object is determined by the steps of:
determining the circumference of the area in which the object is determined to be present for substantially each sectional plane of the sequence of sectional planes,
integrating the circumference of the area in which the object is determined to be present over the length of the object, whereby the volume is determined.
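The volume determination of claim 30 can be sketched as a discrete integration over the sequence of sectional planes. Here the region in which the object is determined to be present in each plane is represented by its enclosed cross-sectional area, and the integral over the length of the object is approximated as a sum of area times slice-spacing terms; this discretisation, and the use of the enclosed area as the integrand, are assumptions made for the sketch:

```python
def object_volume(cross_section_areas, slice_spacing):
    """cross_section_areas: area of the object region determined in each
    sectional plane of the sequence. slice_spacing: belt travel between
    consecutive sectional planes. Returns the approximated volume."""
    return sum(cross_section_areas) * slice_spacing
```

For example, three sectional planes each enclosing 4.0 square units, spaced 0.5 units apart along the travelling direction, give an approximated volume of 6.0 cubic units.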
31. A method according to any of claims 20-30, wherein the at least two optical sensor units are positioned at a distance from each other along the travelling direction, and wherein the processor means is adapted to
synchronize the object image information, the optical sensor unit spatial position information from each of the at least two optical sensor units, and the moving means position information for compensation of the difference in position along the travelling direction relative to the at least two optical sensor units.
32. A conveyor system for transporting objects and comprising:
a feed conveyor for conveying objects in a first direction,
a receiving conveyor for conveying objects in a second direction,
a control system being connected to an optical sensor system according to any of claims 11-19, said control system being adapted to control the speed of the feed conveyor in response to at least the output signal from the processor unit comprised in the optical sensor system.
33. A conveyor system for transporting objects and comprising:
a feed conveyor for conveying objects in a first direction,
a receiving conveyor for conveying objects in a second direction,
a control system being connected to an optical sensor system adapted to determine the geometry and/or angular position of a moving object, said optical sensor system comprising:
at least one optical sensor unit, comprising:
an array of photo sensors adapted to sense intensity of light emitted or reflected in the visual field of the array of photo sensors, the visual field encompassing a path in which the object moves,
signal transmission means for transmission of image information representing the intensity of light sensed by each photo sensor,
means for moving the object in a travelling direction relative to the optical sensor units,
a processor unit comprising:
signal receiving means for receiving the image information from each of the at least two optical sensor units,
means for obtaining information as to the position of the moving means relative to the sensor units,
the processor unit being adapted to process the image information and the moving means position information together with information concerning spatial position of each of the at least two optical sensor units, in a sectional plane corresponding to a moving means position or a sequence of sectional planes corresponding to a sequence of moving means positions so as to generate an output signal representing the geometry and/or the angular position of the object,
the control system being adapted to control the speed of the feed conveyor in response to at least the output signal from the processor unit comprised in the optical sensor system.
34. A conveyor system according to claim 32 or 33, wherein the first conveying direction forms an acute angle with the second conveying direction.
PCT/DK1999/000180 1998-03-25 1999-03-25 An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object WO1999049277A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU30249/99A AU3024999A (en) 1998-03-25 1999-03-25 An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DK0426/98 1998-03-25
DK42698 1998-03-25
DKPA199801511 1998-11-18
DKPA199801511 1998-11-18

Publications (1)

Publication Number Publication Date
WO1999049277A1 true WO1999049277A1 (en) 1999-09-30

Family

ID=26063981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DK1999/000180 WO1999049277A1 (en) 1998-03-25 1999-03-25 An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object

Country Status (2)

Country Link
AU (1) AU3024999A (en)
WO (1) WO1999049277A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2021762A (en) * 1978-05-17 1979-12-05 British Steel Corp Improvements in determining the dimensions of workpieces
DE3542896A1 (en) * 1985-12-04 1987-06-11 Siemens Ag METHOD FOR GENERATING A SIGNAL REPRESENTING THE CROSS-SECTIONAL SURFACE OF ANY ELLIPTICAL OBJECT
US4693607A (en) * 1983-12-05 1987-09-15 Sunkist Growers Inc. Method and apparatus for optically measuring the volume of generally spherical fruit
EP0295987A1 (en) * 1987-05-29 1988-12-21 T.F.K. Measuring method and device for the lenghtwise camber of a metal plate
EP0851207A1 (en) * 1996-12-31 1998-07-01 Datalogic S.P.A. Process and apparatus for measuring the volume of an object by means of a laser scanner and a CCD detector


Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2807163A1 (en) * 2000-03-30 2001-10-05 Cybernetix Monitoring system for goods on conveyor, uses two cameras at inclined axes taking images for analysis to produce three-dimensional model
DE10028201A1 (en) * 2000-06-09 2001-12-20 Basler Ag Optical inspection for moving objects by line scanning employing bright and dark field illumination of object
WO2002026598A3 (en) * 2000-09-26 2002-07-04 Crisplant As Method and apparatus for orientating articles during feeding to a conveyor with a discharge section
WO2002037419A1 (en) * 2000-10-30 2002-05-10 Mark Peters Apparatus and method for the construction of spatial representations
US7483571B2 (en) 2000-10-30 2009-01-27 Peters Mark W Apparatus and method for the construction of spatial representations
US10905150B2 (en) 2012-06-06 2021-02-02 Creator, Inc. System for dispensing toppings
WO2013184910A1 (en) * 2012-06-06 2013-12-12 Momentum Machines Company System and method for dispensing toppings
US10292415B2 (en) 2012-06-06 2019-05-21 Creator, Inc. System for dispensing toppings
JP2015526063A (en) * 2012-06-06 2015-09-10 モーメンタム マシーンズ カンパニーMomentum Machines Company Topping dispensing system and method
US10219535B2 (en) 2012-06-06 2019-03-05 Creator, Inc. System for dispensing toppings
US9295281B2 (en) 2012-06-06 2016-03-29 Momentum Machines Company System and method for dispensing toppings
US9295282B2 (en) 2012-06-06 2016-03-29 Momentum Machines Company System and method for dispensing toppings
US9326544B2 (en) 2012-06-06 2016-05-03 Momentum Machines Company System and method for dispensing toppings
US9386799B2 (en) 2012-06-06 2016-07-12 Momentum Machines Company System and method for dispensing toppings
US9770049B2 (en) 2012-06-06 2017-09-26 Momentum Machines Company System and method for dispensing toppings
ITMI20130086A1 (en) * 2013-01-22 2014-07-23 Tecnoexamina S P A MEASUREMENT SYSTEM OF LINEAR DIMENSIONS OF A TILE ON THE MOVE ALONG A DEFAULT FEED DIRECTION AND ITS MEASUREMENT METHOD
US10068273B2 (en) 2013-03-13 2018-09-04 Creator, Inc. Method for delivering a custom sandwich to a patron
US11023949B2 (en) 2013-03-13 2021-06-01 Creator, Inc. Method for delivering a custom sandwich to a patron
DE102013113246B4 (en) 2013-04-02 2021-10-21 Hyundai Motor Company Speed measuring device for a conveyor belt
EP2865620A1 (en) * 2013-10-25 2015-04-29 Maschinen- und Stahlbau Julius Lippert GmbH & Co. KG Sorting and commissioning facility with terminal equipment
DE102013111788B4 (en) 2013-10-25 2021-08-12 Lippert Gmbh & Co. Kg Sorting and order picking system with terminal equipment
WO2016023135A1 (en) * 2014-08-13 2016-02-18 Ferag Ag Method and apparatus for gripping and separating out articles
US9845205B2 (en) 2014-08-13 2017-12-19 Ferag Ag Method and device for detecting and segregating piece goods
EP3399275A1 (en) * 2017-05-04 2018-11-07 MULTIVAC Sepp Haggenmüller SE & Co. KG Determining the position and orientation of a conveyed promoted object
US11185105B2 (en) 2018-06-20 2021-11-30 Creator, Inc. System and method for dispensing toppings
CN110907366A (en) * 2018-09-18 2020-03-24 涂层国外知识产权有限公司 Device and method for determining an observation geometry
CN110907366B (en) * 2018-09-18 2024-05-31 艾仕得涂料系统有限责任公司 Device and method for determining an observation geometry
CN115027902A (en) * 2022-05-31 2022-09-09 西门子工厂自动化工程有限公司 Method and device for determining installation position of safety control device
CN115027902B (en) * 2022-05-31 2024-03-15 西门子工厂自动化工程有限公司 Method and device for determining installation position of safety control device

Also Published As

Publication number Publication date
AU3024999A (en) 1999-10-18

Similar Documents

Publication Publication Date Title
WO1999049277A1 (en) An optical sensor system for incorporation in a conveyor system and a method for determining the geometry and/or angular position of a moving object
EP1738136B1 (en) Measuring apparatus and method in a distribution system
US9193534B2 (en) Detection system for installation at a conveyor belt
AU674178B2 (en) Product discrimination system and method therefor
EP0257749B1 (en) Methods and apparatus for monitoring the diffuse reflectivity of a surface
US7492973B2 (en) Apparatus and method for determining whether machine readable information on an item matches the item
EP2272596B1 (en) System and method for dimensioning objects
US5637854A (en) Optical bar code scanner having object detection
US5652432A (en) Cylindrical body inspection apparatus utilizing displacement information and reflected light information
EP1269114B1 (en) Apparatus and method for determining the dimensions of an object utilizing negative imaging
KR20130045350A (en) A checkout counter
JPH10332320A (en) Product scanning device and method
EP0572555A4 (en)
WO1996005477A1 (en) High precision semiconductor component alignment systems
JP4210844B2 (en) Imaging device for inspection / sorting machine with automatic imaging timing detection function
US7199385B2 (en) Method and an apparatus for the detection of objects moved on a conveyor means by means of an optoelectronic sensor
US6102291A (en) Apparatus and process for detecting the presence and encumbrance of an object
FI77390B (en) MOTTAGNINGSANORDNING FOER FLASKOR.
US6341726B1 (en) Apparatus for inspecting elements on transport device
CA1178711A (en) Apparatus and process for scanning and analyzing mail address information
JP2811043B2 (en) Automatic bottle identification method and apparatus, and bottle separation apparatus using this identification apparatus
JP3694590B2 (en) Agricultural product image reading apparatus and sorting apparatus using the same
JP2681513B2 (en) Painted surface inspection equipment for vehicles
JP2511508B2 (en) Paper discrimination device
JPH0857432A (en) Bottle sorting apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AT AU AZ BA BB BG BR BY CA CH CN CU CZ CZ DE DE DK DK EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase