EP3696135B1 - Forklift and system with forklift for the identification of goods - Google Patents


Info

Publication number
EP3696135B1
Authority
EP
European Patent Office
Prior art keywords
camera
goods
forklift truck
image data
forklift
Prior art date
Legal status
Active
Application number
EP20156789.8A
Other languages
German (de)
French (fr)
Other versions
EP3696135A1 (en)
Inventor
Andreas Plettner
Armin Lang
Current Assignee
Technische Universitaet Muenchen
Indyon GmbH
Original Assignee
Technische Universitaet Muenchen
Indyon GmbH
Priority date
Filing date
Publication date
Application filed by Technische Universitaet Muenchen, Indyon GmbH filed Critical Technische Universitaet Muenchen
Publication of EP3696135A1
Application granted
Publication of EP3696135B1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075Constructional features or details
    • B66F9/0755Position control; Position detectors
    • B66F9/20Means for actuating or controlling masts, platforms, or forks
    • B66F9/24Electrical devices or systems

Definitions

  • the invention relates to a forklift truck and a system with a forklift truck for identifying goods to be picked up or already picked up.
  • a forklift truck according to the preamble of claim 1 is known from EP0254192 A2.
  • in the reusable system, bottles are used that can be reused immediately for refilling after cleaning and testing.
  • in direct contrast, the products of the one-way system are broken down into their basic components after being taken back and processed further, for example into glass or plastic granulate.
  • returnable bottles are more environmentally friendly than single-use bottles.
  • the energy and resource consumption for return transport and cleaning is lower for returnable bottles than the additional manufacturing costs for disposable bottles. This applies all the more, the more regional the distribution and the higher the number of refills.
  • a distinction between reusable bottles and one-way bottles in terms of shape and quality is usually only possible with plastic bottles (mostly made of the material PET, polyethylene terephthalate).
  • reusable bottles are characterized by a higher wall thickness and thus stability, since the bottles have to withstand a significantly longer life cycle.
  • the external empties management includes the actual acceptance of empties from customers within the beverage trade, the local or external pre-sorting, and the logistical transfer of the empties to the site of the beverage manufacturer.
  • the internal empties management includes the delivery of empties to the beverage manufacturer itself, the data recording and storage of the goods, cleaning and inspection, as well as the appropriate provision of the empties at a suitable point in production.
  • the used reusable bottles returned by customers are usually pre-sorted at the dealer's location or by special service providers. This pre-sorting is largely done manually in the beverage return points or also automatically by the beverage return machines installed in the retail trade. With these service providers, this pre-sorting is also carried out manually or automatically. These processes of pre-sorting in the external empties management ensure a relatively high degree of purity of the beverage crates, so that it can be assumed that the correct bottles are contained within a crate.
  • the delivery of empties to the beverage manufacturer is usually done in the form of crates on standardized, reusable transport pallets.
  • the amount of empties that has to be processed here is very high for a large beverage manufacturer. For example, around 30,000 trucks with 36 pallets each are processed at the Gerolsteiner mineral spring.
  • the process in the prior art is as follows: the forklift drivers have to unload the truck under very tight deadlines. While the forklift driver takes the pallets on the fork, he usually has to record the number of pallets and the crate/bottle type using a terminal in the forklift. This data then flows directly into the merchandise management system, which provides the data for the subsequent storage and production processes.
  • a further field of application concerns small load carriers (KLTs).
  • These load carriers are used to hold special components that are used, for example, in automobile production.
  • These load carriers are usually made to accommodate very specific items and are therefore located in a circulatory system between the automobile manufacturer and its suppliers.
  • the invention is based on the object of providing a dynamic, mobile and real-time capable quantity and type detection of goods during unloading or loading which reliably detects incorrect sorting.
  • the forklift truck comprises: a first camera system with a first camera which is designed to capture a first side of goods to be picked up; a second camera system with a second camera which is designed to capture a second side of the goods during or after they have been picked up by the forklift truck, the first side differing from the second side; a first sensor system with a first sensor which is designed to detect a distance between the forklift truck and the goods; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; and a transmission unit which is designed to transmit the image data of the goods captured by the first camera and by the second camera in each case to a data processing system for identifying the goods.
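The distance-triggered activation described in the claim above can be sketched as follows. This is a minimal illustration under assumed names: the class, signal string, and threshold value are hypothetical and do not appear in the patent.

```python
# Minimal sketch of the distance-triggered camera activation described
# above. All names and the threshold value are hypothetical.

MIN_DISTANCE_M = 1.5  # assumed minimum-distance threshold in metres

class Controller:
    def __init__(self, min_distance=MIN_DISTANCE_M):
        self.min_distance = min_distance
        self.first_camera_active = False

    def on_distance_reading(self, distance_m):
        """Called with each reading from the first (distance) sensor."""
        if distance_m < self.min_distance and not self.first_camera_active:
            self.first_camera_active = True
            return "FIRST_ACTIVATION_SIGNAL"  # would be sent to the first camera
        return None

ctrl = Controller()
assert ctrl.on_distance_reading(3.0) is None          # still too far away
assert ctrl.on_distance_reading(1.2) == "FIRST_ACTIVATION_SIGNAL"
assert ctrl.on_distance_reading(1.0) is None          # camera already active
```

The same pattern would cover the second activation signal, triggered by the pick-up sensor instead of the distance reading.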
  • the image data of the goods to be picked up, in particular a pallet of empties crates, are generated in real time and transmitted to the data processing system.
  • analysis of the image data has the advantage that the goods can be identified, in particular the type of empties. This allows exact recording of the current stocks of empties and thus a more efficient production process.
  • the first sensor system is based on ultrasound or radar in order to determine the distance from the forklift exactly.
  • the forklift truck comprises a second sensor system with a second sensor which is designed to detect that the forklift truck has picked up the goods.
  • the controller is also designed to output a second activation signal to the second camera when it is determined by the second sensor that goods have been picked up.
  • the second camera system has an extendable device which is designed to bring the second camera into a position which enables the second side of the goods to be detected. A rear side of the goods W can thus be detected.
  • the first camera and the second camera are 3D cameras. In this way, the dimensions of the goods, such as height and width, can be determined.
  • the forklift includes the data processing system. This has the advantage that the image analysis can be carried out on the forklift in real time, and the information about the identified goods can be made available directly to an inventory control system.
  • the first camera system further comprises a first mirror and a second mirror, each with a device which is designed to position the mirrors so that they capture a third side and a fourth side of the goods picked up by the forklift truck. In this way, further sides of the goods can be recorded, which enables more precise identification of empties.
  • the first camera is attached to a mast of the forklift.
  • the second camera system is arranged in one of the forks of the forklift truck. This has the advantage that the second camera can capture a lower side of the goods while the goods are being picked up. For example, it can thus be ensured that no empty crates are missing from a pallet.
  • the second camera is positioned in such a way that it captures the second side of the goods, the second side of the goods corresponding to the rear side facing away from the forklift truck.
  • the retractable device enables the second camera to be brought into different positions in order to ensure complete coverage of the rear side of the goods.
  • the first mirror and the second mirror are each attached to the forklift and, after the goods have been picked up, are positioned by the respective device so that the third and fourth sides of the goods, which correspond to the lateral sides, are reflected, the first mirror and the second mirror being sighted by the first camera after the goods have been picked up.
  • the mirrors are attached to the side of the forklift and can be extended / unfolded using the respective device.
  • the first camera can then also capture the image data of the side part of the goods by means of the extended / unfolded mirror, which improves the goods identification.
  • the data processing system in the forklift is designed to carry out preprocessing of image data that are captured by one or more camera systems of the forklift, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.
  • the data processing system is further designed to carry out an identification of the goods, in particular of empties crate types and a number of empty crates, from the preprocessed image data by means of one or more of the following classifications: a geometric classification based on geometric data that are determined from the image data, or their extracted features; and a texture-based classification based on color data, which are determined from the image data, and their extracted features.
  • the bottle types of the goods W can also be determined.
  • the data processing system is further designed to carry out a statistical estimation and an estimation by machine learning on the basis of the captured image data in order to determine the most likely occupancy of the individual empties crates of the picked-up goods.
  • specific image processing algorithms and machine learning methods such as "K-Nearest Neighbors", neural networks, etc. can be used for this purpose.
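The "K-Nearest Neighbors" method named above can be sketched as follows. This is a generic illustration of the technique, not the patent's implementation: the feature vectors (e.g. crate height, width, dominant hue) and labels are invented toy values.

```python
# Hedged sketch of a K-Nearest-Neighbors classifier of the kind mentioned
# for empties-type identification. Features and labels are invented.
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label). Returns the majority label
    among the k training samples nearest to query (Euclidean distance)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda s: dist(s[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Toy training set: (height_m, width_m, dominant_hue) -> crate type
train = [
    ((0.30, 0.40, 0.1), "crate_type_A"),
    ((0.31, 0.39, 0.1), "crate_type_A"),
    ((0.25, 0.35, 0.8), "crate_type_B"),
    ((0.26, 0.34, 0.9), "crate_type_B"),
]
assert knn_classify(train, (0.29, 0.40, 0.2)) == "crate_type_A"
```

In practice the feature vectors would come from the extracted geometric and texture features described above.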
  • a system comprises a forklift truck with a positioning system which is designed to detect a geographical position of the forklift truck in a warehouse in real time; a first camera system with a first camera which is designed to capture a side of goods to be picked up facing the forklift truck; a first sensor system with a first sensor which is designed to detect a distance between the forklift truck and the goods to be picked up; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; a transmission system which is designed to transmit the image data of the goods captured by the first camera to a data processing system; and a second camera system with a second camera, which is arranged in a warehouse, and which is designed to capture at least one side of the goods facing away from the forklift truck during and after the goods are picked up by the forklift truck, the data processing system being designed to analyze the image data captured by the first camera and the second camera in order to identify the goods.
  • the second camera of the second camera system is attached in the area of a goods unloading point within the warehouse, preferably on the hall ceiling, or attached to a gate of the warehouse which the forklift truck passes with the picked-up goods.
  • the data processing system is provided by the forklift itself.
  • the positioning system of the forklift is also designed to provide the image data captured by the first camera with the geographical position in order to enable the image data captured by the first camera to be subsequently assigned to the image data captured by the second camera, the image data captured by the first camera and the second camera each additionally having a time stamp.
  • the data processing system is designed to identify the forklift truck by analyzing the image data captured by the second camera in order to enable the image data captured by the first camera to be associated with the image data captured by the second camera.
  • the data processing system is designed to carry out preprocessing on the basis of the associated image data of the first camera and the second camera, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.
  • the data processing system is further designed to carry out an identification of the goods, in particular of empty crate types and a number of empty crates, from the preprocessed image data using one or more of the following classifications: (a) a geometric classification based on geometric data determined from the image data or their extracted features, and (b) texture-based classification based on color data determined from the image data and their extracted features.
  • the second camera system comprises a plurality of mirrors which are arranged in the warehouse and which are designed to reflect the sides of the goods facing away from the forklift truck, the first camera also being designed to aim at the mirrors during or after picking up the goods.
  • the second camera system is formed by a drone with a camera.
  • the forklift 10 shown comprises a first camera 12 which detects a first side S1 of a goods W to be picked up, such as a pallet of empty boxes.
  • the field of view of the first camera 12 is indicated in Figure 1A by a dashed line.
  • the first camera 12 is attached to a lifting mast 11 of the forklift 10.
  • the first camera 12 can also be attached to a load carrier 17 of the forklift 10.
  • the forklift truck 10 further comprises a first sensor 16 which determines a distance between the forklift truck 10 and the goods W.
  • the first sensor 16 can determine the distance to the goods W by measuring the transit time of electromagnetic waves or sound waves, which is indicated by the dashed line in Figure 1A.
  • the first sensor 16 comprises an ultrasonic sensor, a radar sensor and a LIDAR (“Light Detection and Ranging”) sensor.
  • the first sensor 16 is attached to the roof of the forklift 10.
  • the first sensor 16 can also be attached to a lifting mast 11 of the forklift 10, or can be attached to one of the forks 13 or to the load carrier 17 of the forklift.
  • a controller 20 comprised by the forklift 10 of the first embodiment, as shown in Figure 1B, can output a first activation signal to the first camera 12 if the distance determined by the first sensor 16 falls below a minimum distance.
  • the first camera 12 is activated by the first activation signal and begins to record image data of the first side S1 of the goods W. For example, the first camera records a front side of the goods W to be picked up facing the forklift truck 10.
  • the forklift truck 10 of the first embodiment comprises a second camera 14 which, during or after the recording, detects further sides of the goods W that differ from the first side S1.
  • the second camera 14 is received in one of the forks 13 of the forklift 10.
  • the second camera 14 can also be attached to the lifting mast 11 of the forklift 10.
  • the forklift truck 10 of the first embodiment can preferably comprise a second sensor 18 which determines the pick-up of the goods W by the forklift truck 10.
  • the second sensor 18 shown is an ultrasonic sensor.
  • the second sensor 18 can comprise further sensors, for example a load sensor, which detects the picking up of the goods W by means of a load indication.
  • the second sensor 18 is received in one of the forks 13 of the forklift 10.
  • the second sensor 18 can also be attached to the lifting mast 11 or to the load carrier 17 of the forklift 10.
  • the controller 20 of the forklift 10 can output a second activation signal to the second camera 14 of the forklift 10 in order to activate the second camera 14.
  • the second camera 14 detects a lower side of the goods W while it is being picked up by the forks 13 of the forklift 10.
  • the second camera records a rear side of the goods W facing away from the forklift truck 10 after the goods have been picked up. This will be described in more detail later with reference to Figures 2A to 2C.
  • the system 100 of the first embodiment shown comprises one or more external cameras 114, which are arranged in a warehouse and which detect at least one side of the goods W facing away from the forklift truck 10 during and after the goods W are picked up by the forklift truck 10.
  • an external camera 114 can preferably be attached to the hall ceiling of the warehouse in order to capture an upper side of the goods W.
  • an external camera 114 can be attached to a lamp post in the warehouse in order to capture a lateral side of the goods W.
  • an external camera 114 can be attached to a passage gate of the warehouse through which the forklift truck with the picked up goods W passes, in order to capture a lateral side of the goods W.
  • an external camera 114 can preferably be housed in a flight-capable drone, which films the forklift 10 during and after the goods W are picked up.
  • a signal with the current geographical coordinates of the forklift 10 can also be output to the drone. The drone then captures the remaining sides of the goods W.
  • an external camera 114 can receive the first or the second activation signal that is output by the controller 20 of the forklift 10 and begin to transmit the image data captured by the external camera 114 to a data processing system 24.
  • the number of cameras and sensors that the forklift 10 and system 100 may include can vary.
  • the forklift 10 can only include one camera and the system 100 can only include one external camera.
  • the forklift 10 can include only one sensor and the system 100 can include one or more sensors.
  • the cameras 12, 14 of the forklift truck 10 and the external cameras 114 of the system 100 are preferably equipped with lighting so that the cameras can achieve correct image data acquisition even in dark surroundings.
  • the cameras are preferably 3D cameras that operate on the time-of-flight principle in order to additionally enable a determination of the depth data of the goods W to be picked up or already picked up.
  • the cameras can also include network cameras that provide digital signals at the output in the form of a video stream that can be transmitted via Internet Protocol (IP).
  • the forklift 10 comprises a transmission unit 22, as shown in Figure 1B, which can transmit the image data captured by the first camera 12 and the second camera 14 of the forklift 10 to a data processing system 24 for the identification of the goods W.
  • the forklift 10 can include a positioning system (not shown) that detects a geographic position of the forklift in the warehouse in real time.
  • the positioning system can comprise a GPS sensor, which makes it possible to determine the geographic coordinates of the forklift truck 10 in real time.
  • the coordinates determined in this way can be transmitted together with the captured image data of the forklift 10 and a time stamp by the transmission unit 22 to the data processing system 24.
  • the data processing system 24 can then, based on the transmitted coordinates and the time stamp, combine the image data captured by the forklift truck 10 with the image data captured by an external camera 114 of the system 100.
  • the forklift truck 10 can be identified by analyzing the image data of an external camera 114 of the system 100, for example via a QR code, in order to enable the image data to be assigned to the image data of the forklift truck 10.
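The association of forklift image data with external-camera image data via coordinates and time stamp, as described above, can be sketched as follows. The record layout and the tolerance values are assumptions made for illustration only.

```python
# Sketch of associating forklift image data with external-camera image
# data by time stamp and geographic position. Field names and tolerances
# (max_dt seconds, max_dist metres) are hypothetical.

def associate(forklift_frames, external_frames, max_dt=2.0, max_dist=3.0):
    """Pair each forklift frame with the external frame closest in time
    whose recorded position lies within max_dist metres."""
    pairs = []
    for f in forklift_frames:
        candidates = [
            e for e in external_frames
            if abs(e["t"] - f["t"]) <= max_dt
            and ((e["x"] - f["x"]) ** 2 + (e["y"] - f["y"]) ** 2) ** 0.5 <= max_dist
        ]
        if candidates:
            best = min(candidates, key=lambda e: abs(e["t"] - f["t"]))
            pairs.append((f["id"], best["id"]))
    return pairs

forklift_frames = [{"id": "F1", "t": 100.0, "x": 10.0, "y": 5.0}]
external_frames = [
    {"id": "E1", "t": 100.5, "x": 11.0, "y": 5.5},   # close in time and space
    {"id": "E2", "t": 100.2, "x": 50.0, "y": 40.0},  # wrong location
]
assert associate(forklift_frames, external_frames) == [("F1", "E1")]
```

Alternatively, as the text notes, the forklift could be identified directly in the external camera's image, for example via a QR code, making the position matching unnecessary.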
  • FIG. 2A shows a perspective detail of the forklift 10 according to the first embodiment, which shows the fork 13 with the second camera 14 before the goods W are picked up.
  • the second camera 14 can preferably be formed with impact-resistant and / or waterproof materials, which protect the second camera 14 from impacts when picking up goods and / or from moisture.
  • the second camera 14 can be arranged in a shock-proof and / or waterproof housing that is received in the fork 13 of the forklift.
  • Figure 2B shows a perspective detail of the forklift 10 according to the first embodiment, which shows the fork 13 with the second camera 14 while the goods W are picked up.
  • the second camera can preferably comprise a 3D camera which uses a time-of-flight (TOF) method to determine the dimensions of the goods W while they are being picked up, as shown in Figure 2B. For example, it can be ensured that no empty crates are missing from a pallet.
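The dimension determination from TOF depth data described above can be sketched as follows. This is a simplified illustration, not the patent's method: it assumes a fixed background plane and a constant metres-per-pixel scale, and the depth values are invented.

```python
# Sketch: deriving goods dimensions from a depth image, in the spirit of
# the TOF measurement described above. Background distance and the
# metres-per-pixel scale are assumed toy values.

def bounding_dims(depth, background=2.0, m_per_px=0.05):
    """depth: 2D list of distances in metres; pixels nearer than the
    background plane are treated as belonging to the goods. Returns
    (width, height) of the goods' bounding box in metres."""
    rows = [i for i, row in enumerate(depth) if any(d < background for d in row)]
    cols = [j for row in depth for j, d in enumerate(row) if d < background]
    height = (max(rows) - min(rows) + 1) * m_per_px
    width = (max(cols) - min(cols) + 1) * m_per_px
    return width, height

depth = [
    [2.0, 2.0, 2.0, 2.0],
    [2.0, 1.0, 1.0, 2.0],   # a 2x2-pixel object closer than the background
    [2.0, 1.0, 1.0, 2.0],
]
w, h = bounding_dims(depth)
assert (w, h) == (0.1, 0.1)
```

A real TOF camera would additionally deliver per-pixel calibration, so the scale would vary with distance rather than being constant.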
  • Figure 2C shows a perspective detail of the forklift 10 according to the first embodiment, showing the fork 13 with the second camera 14 after the goods W have been picked up.
  • the second camera can be connected to an extendable device 30 (not shown).
  • the extendable device comprises a telescopic device. This makes it possible to position the second camera 14 in such a way that the rear side of the goods W facing away from the forklift truck can be detected.
  • the second camera can be attached to the lifting mast 11 of the forklift (not shown).
  • the second camera 14 can be connected to the extendable device 30 so that the second camera 14 is positioned in such a way that it detects an upper side of the goods W or a lateral side of the goods W.
  • the forklift truck 10 can include one or more mirrors.
  • Figure 3A illustrates a perspective view of the forklift 10 showing the first mirror 26 in a first position.
  • Figure 3B illustrates a perspective view of the forklift 10 showing the first mirror 26 in a second position.
  • Figure 3C illustrates a top perspective view of the forklift 10 showing the first mirror 26 and the second mirror 28 in the second position.
  • the first mirror 26, which is connected to an extendible/fold-out device, is attached to the side of the forklift 10 and is in the first position, i.e. in a retracted/folded position.
  • the second mirror 28 is attached on the opposite side of the forklift 10 (not shown).
  • the first mirror 26 and the second mirror 28 can be brought into the second position, that is to say extended / folded out, in order to reflect the lateral sides S3, S4 of the goods W.
  • the first mirror 26 and the second mirror 28 can preferably be extended when it is determined by the second sensor 18 that the goods W are being picked up.
  • the first camera 12 can preferably be rotated by up to 90° in two directions in order to vary its field of view. This enables the first mirror 26 and the second mirror 28 to be sighted, as shown in Figure 3C, in order to capture the image data of the lateral sides of the goods W.
  • the system 100 can also comprise one or more mirrors.
  • one or more mirrors can be fitted in the warehouse in the area of the goods unloading point (not shown) in order to reflect at least one side of the goods W facing away from the forklift truck 10.
  • the first camera 12 of the forklift 10 can then aim at the mirrors in the warehouse to capture the image data.
  • the image data processing is explained in detail below. After the image data of the first camera 12 and the second camera 14 of the forklift 10, as well as of the external cameras 114 of the system 100, have been captured and transmitted to the data processing system 24, data processing takes place.
  • the forklift truck 10 preferably contains the data processing system 24.
  • the data processing system 24 can also be contained in an external processing unit, for example a central computer.
  • the image data captured by the forklift 10 can be combined with the image data captured by the external cameras 114.
  • the image data is preprocessed.
  • sensor-related errors such as noise, overexposure, etc. are preferably filtered out of the image data by machine.
  • the data format is then prepared, for example by converting the image data to gray levels or by transforming the depth data of the goods W to determine the dimensions of the goods W.
  • the data determined by the sensors of the forklift truck 10 and the system 100 are preferably also merged with the image data.
  • a machine segmentation of the captured image data is preferably carried out, as shown in Fig. 4.
  • many methods of automatic segmentation are possible. They are often divided into pixel-oriented, edge-oriented, and region-oriented processes.
  • in addition, there are model-based methods, in which a certain shape of the objects is assumed, and texture-based methods, in which an internal, homogeneous structure of the objects is also taken into account. Different methods can also be combined to achieve better results.
  • features of the captured image data can preferably be extracted. Examples include, but are not limited to, SIFT ("Scale-Invariant Feature Transform"), HOG ("Histogram of Oriented Gradients"), and SURF ("Speeded Up Robust Features").
  • SIFT is an algorithm for the detection and description of local features in images.
  • the detector and the feature descriptions are, within certain limits, invariant to coordinate transformations such as translation, rotation and scaling. They are also robust against lighting variations, image noise and minor geometric deformations of a higher order, such as those caused by projective imaging of an object from different points of view in space.
  • HOG is an algorithm for feature recognition in images that captures the appearance and shape of objects within an image via the distribution of local intensities or edge orientations, even without detailed knowledge of the positions of the edges or corners themselves. For this purpose, the image is divided into sub-regions, and for each sub-region the orientations of all edges are determined and their frequencies are stored as a histogram.
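The per-region orientation histogram described above can be sketched for a single cell as follows. This is a deliberately simplified illustration (no cell normalisation or block grouping, as a full HOG descriptor would have); the toy image is invented.

```python
# Minimal sketch of the HOG idea: accumulate gradient orientations of one
# image sub-region (cell) into a histogram. Simplified for illustration.
import math

def hog_cell(img):
    """Return an 8-bin unsigned-orientation histogram for one cell,
    given as a 2D list of intensities."""
    bins = [0.0] * 8
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            gx = img[y][x + 1] - img[y][x - 1]  # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]  # vertical gradient
            mag = math.hypot(gx, gy)
            if mag > 0:
                angle = math.degrees(math.atan2(gy, gx)) % 180  # unsigned
                bins[min(int(angle / 22.5), 7)] += mag          # 8 bins of 22.5 deg
    return bins

# A vertical edge produces purely horizontal gradients -> orientation ~0 deg,
# so all the gradient magnitude lands in the first bin.
img = [[0, 0, 9, 9]] * 4
hist = hog_cell(img)
assert hist[0] > 0 and sum(hist[1:]) == 0
```

A full descriptor would concatenate the histograms of all cells, normalised over overlapping blocks.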
  • the SURF algorithm is based on the same principles and steps as SIFT. However, the details in each step are different.
  • the algorithm consists of three main parts: identification of points of interest, description of the local neighborhood, and matching.
  • the goods W, in particular the empties types of the picked-up pallet(s), are identified.
  • a geometric classification by means of (possibly segmented) depth data or their extracted features, such as height, depth, width and number of crates, which are obtained from the preprocessed image data, is made possible by comparison with data from a previously created database. This allows the types of empties to be identified, in particular the crate types of the goods W.
  • a texture-based classification using (possibly segmented) color data or their extracted features includes the application and evaluation of statistical methods for computer-aided recognition of patterns in images.
  • this includes the statistical evaluation of the preprocessed image data, for example by means of median color tone determination, as well as machine learning, for example "shallow learning" and "deep learning".
  • an algorithm can be trained with a sufficient amount of real data or their features. This trained algorithm can then be validated and verified using real data and subsequently used to identify empties.
  • the identification of the empty packaging types preferably comprises a combination of the geometric and texture-based classification with merged sensor data.
  • a prediction of the most likely occupancy of the individual empties crates is preferably made by analyzing all visible sides of the goods W, for example two per pallet. This is done using a statistical estimate and/or an estimate using machine learning methods.
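A statistical occupancy estimate of the kind mentioned above can be sketched naively as follows. This is an invented illustration, not the patent's estimator: it simply extrapolates the fill rate observed on the visible sides to the hidden crates.

```python
# Naive sketch of a statistical occupancy estimate: extrapolate the fill
# rate seen on the visible pallet sides to the crates that are hidden.
# All numbers are invented illustration values.

def estimate_occupancy(visible_full, visible_total, crates_total):
    """Assume hidden crates are full at the same rate as visible ones."""
    rate = visible_full / visible_total
    hidden = crates_total - visible_total
    return visible_full + round(rate * hidden)

# 10 of the 12 visible crate positions are full; the pallet holds 24 crates.
assert estimate_occupancy(10, 12, 24) == 20
```

A machine-learning estimator, as the text also allows, would replace the constant-rate assumption with a model trained on real pallet data.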
  • the corresponding results of the image data processing are then transferred to an ERP system via a standardized communication path (for example using a network protocol such as the "Simple Object Access Protocol" (SOAP)) and standardized data formats (for example CSV, XML, JSON-RPC).
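Packaging an identification result in one of the standardized data formats mentioned above can be sketched as follows (JSON here). The field names and values are hypothetical; an actual ERP interface would define its own schema.

```python
# Sketch: serializing an identification result for transfer to an ERP
# system as JSON. Field names and values are hypothetical.
import json

result = {
    "timestamp": "2020-02-12T10:15:00Z",
    "forklift_id": "FL-07",
    "pallet_count": 2,
    "crate_type": "crate_type_A",
    "crate_count": 40,
    "estimated_bottle_occupancy": 0.92,
}

payload = json.dumps(result, sort_keys=True)
assert json.loads(payload)["crate_count"] == 40
```

The same result dictionary could equally be flattened to a CSV row or wrapped in a SOAP envelope, as the text indicates.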

Description

The invention relates to a forklift truck and a system with a forklift truck for identifying goods to be picked up or already picked up. A forklift truck according to the preamble of claim 1 is known from EP0254192 A2.

The handling of empty beverage bottles has been regulated within Germany for some time. In order to save energy and resources, glass and plastic bottles are reused by means of a deposit system. In 2003, a deposit was therefore also introduced for one-way bottles, after the corresponding figures, i.e. the reusable rate, had fallen steadily over the years. This counteracted the high volume of waste.

In the reusable system, bottles are used that can be reused immediately for filling after cleaning and testing. In direct contrast to this is the one-way system, the products of which are broken down into their basic components after being taken back and processed further, for example into glass or plastic granulate.

According to studies by the German Federal Environment Agency (Umweltbundesamt), returnable bottles are more environmentally friendly than single-use bottles. The energy and resource consumption for return transport and cleaning of returnable bottles is lower than the additional manufacturing effort for single-use bottles. This applies all the more, the more regional the distribution and the higher the number of refills.

Distinguishing reusable bottles from one-way bottles by shape and material is usually only possible with plastic bottles (mostly made of PET, polyethylene terephthalate). Here, reusable bottles are characterized by a greater wall thickness, and thus greater stability, than one-way bottles, since they have to withstand a significantly longer life cycle.

In the case of returnable bottles, regardless of the bottle material, so-called empties management poses major challenges for all beverage manufacturers. In principle, empties management can be divided into two sub-areas:

  • External empties management (at the customers of a beverage manufacturer)
  • Internal empties management (at the beverage manufacturer)

External empties management includes the actual acceptance of empties in the beverage trade from customers, the local or external pre-sorting, and the logistical transfer of the empties to the beverage manufacturer's destination.

Internal empties management, in contrast, comprises the delivery of empties to the beverage manufacturer itself, the data-side recording and storage of the goods, cleaning and inspection, as well as the appropriate provision of the empties at a suitable point in production.

In the following, the problem is explained using the empties management process.

The used reusable bottles returned by customers are usually pre-sorted at the retailer's location or by specialized service providers. This pre-sorting is largely done manually at beverage return points, or automatically by the reverse vending machines installed in retail stores. At the service providers, pre-sorting is likewise carried out manually or automatically. These pre-sorting processes in external empties management thus ensure a relatively high sorting purity of the beverage crates, so that it can be assumed that each crate contains the correct associated bottles.

Internal empties management begins once the empty returnable bottles have been transported to the beverage manufacturer.

Empties are usually delivered to the beverage manufacturer in crates on standardized, reusable transport pallets. The amount of empties that has to be processed is very high for a large beverage manufacturer. For example, around 30,000 trucks with 36 pallets each are handled per year at the Gerolsteiner mineral spring.

The process in the prior art is as follows:
The forklift drivers have to unload the truck under very tight time constraints. While the forklift driver takes the pallets onto the forks, he usually also has to record the number of pallets and the crate/bottle type via a terminal in the forklift. This data then flows directly into the merchandise management system, which provides the data for the subsequent storage and production processes.

The recording of the crate/bottle type is highly error-prone. The reasons for this high error rate include the following:

  • Tight time constraints for unloading
  • Several forklifts unload one truck, so drivers must concentrate on forklift traffic
  • Forklifts can (with the appropriate equipment) take several pallets onto the forks at the same time, often 4 pallets, sometimes 8 or up to 12
  • For marketing reasons, there are now very different crate/bottle types, which can easily be confused
  • Poor visibility during unloading
  • Manual entries in menus that are confusing due to the large number of types
  • Varying quality of pre-sorting, i.e. different crate/bottle types on one pallet

Since the input quality of the forklift driver directly influences the expected actual stocks of empties, excessive deviations can lead to direct consequential problems in production. For example, the production of a batch of a product with the corresponding bottle type is started. Because of the incorrect stock figures, the wrong crate/bottle types are initially processed in the production-accompanying logistics, and the incorrect assignment is only recognized during the production process. In addition, the planned crate/bottle quantity may not physically exist at all (discrepancy in inventory), and production is stopped prematurely. The line has to be converted, and the planned product cannot be produced in the required quantity. This causes high costs in the beverage industry, which is already under pressure due to the retail price war. In addition, considerable resources and energy are wasted on the avoidable effort of handling the wrong crates/bottles.

The following example is intended to illustrate the relevance of the existing problem:

With up to 6,000 pallets per day and an error rate of up to 10% (i.e. 600 pallets), this would mean a misallocation of 480,000 bottles (at 800 bottles per pallet). Viewed over several days, this leads to a completely inadequate recording of the actual inventory of empties. It is assumed here that completely single-type pallets with only one crate/bottle type have already been provided in the upstream external empties management. Particularly with a high number of crate/bottle variants, due to extensive product individualization and smaller retailers, it is increasingly impossible to always provide single-type pallets. As a result, the error rate will continue to rise.
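The misallocation estimate above can be checked with a short calculation. This is only an illustrative sketch; the figure of 800 bottles per pallet is an assumption implied by the patent's own numbers (600 pallets corresponding to 480,000 bottles), not a value stated explicitly:

```python
# Worked example of the daily misallocation estimate (illustrative only).
PALLETS_PER_DAY = 6000
ERROR_RATE = 0.10          # up to 10% of pallets recorded incorrectly
BOTTLES_PER_PALLET = 800   # assumption implied by 600 pallets -> 480,000 bottles


def misallocated_bottles(pallets_per_day: int, error_rate: float,
                         bottles_per_pallet: int) -> int:
    """Return the number of bottles assigned to the wrong type per day."""
    wrong_pallets = int(pallets_per_day * error_rate)
    return wrong_pallets * bottles_per_pallet


print(misallocated_bottles(PALLETS_PER_DAY, ERROR_RATE, BOTTLES_PER_PALLET))
# 600 mis-recorded pallets * 800 bottles = 480000 misallocated bottles per day
```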

This situation represents the greatest challenge for beverage manufacturers today.

Empties (load carriers, small load carriers or KLTs) are also frequently used in other sectors, such as the automotive and automotive supplier industry. These load carriers hold special components that are used, for example, in automobile production. They are usually built to accommodate very specific items and therefore circulate in a closed loop between the automobile manufacturer and its suppliers.

These load carriers have a high value because they are built specifically for one product.

For the supplier companies, this means that they have to manage the incoming empties just like the beverage manufacturers, regardless of whether the load carriers belong to the supplier or to the automobile manufacturer. If they do not, they risk bearing the high costs of missing load carriers. Here, too, manual recording of the empties is necessary (i.e. when the load carrier is empty; when full, it carries a barcode label). Fast and reliable recording of empties by means of the presented invention therefore also makes sense in industries that work with individual load carriers for holding components.

Proceeding from this, the invention is based on the object of providing dynamic, mobile, real-time capable quantity and type recording of goods during unloading or loading which reliably detects sorting errors.

According to a first aspect of the present invention, this object is achieved by a forklift truck according to claim 1. The forklift truck comprises: a first camera system with a first camera which is designed to capture a first side of goods to be picked up; a second camera system with a second camera which is designed to capture a second side of the goods while or after they are picked up by the forklift truck, the first side differing from the second side; a first sensor system with a first sensor which is designed to detect a distance between the forklift truck and the goods; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; and a transmission unit which is designed to transmit the image data of the goods captured by the first camera and by the second camera to a data processing system for identifying the goods. With the first and second cameras of the forklift truck, the image data of the goods to be picked up or already picked up, in particular a pallet of empty crates, are generated in real time and transmitted to the data processing system. Analysis of the image data has the advantage that the goods, in particular the type of empties, can be determined. This allows an exact recording of the current stocks of empties and thus a more efficient production process.
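The distance-triggered activation described in this claim can be sketched as follows. All names and the threshold value are hypothetical; the patent does not prescribe any particular implementation:

```python
# Minimal sketch of the distance-triggered camera activation (hypothetical names).
MIN_DISTANCE_M = 1.5  # assumed minimum distance threshold in metres


class Camera:
    """Stand-in for the first camera; tracks whether it has been activated."""
    def __init__(self):
        self.active = False

    def activate(self):
        self.active = True


class Controller:
    """Outputs the first activation signal when the measured distance
    falls below the minimum distance."""
    def __init__(self, first_camera, min_distance=MIN_DISTANCE_M):
        self.first_camera = first_camera
        self.min_distance = min_distance

    def on_distance_reading(self, distance_m: float) -> bool:
        if distance_m < self.min_distance:
            self.first_camera.activate()
            return True
        return False


cam = Camera()
ctrl = Controller(cam)
ctrl.on_distance_reading(3.0)  # too far away: camera stays inactive
ctrl.on_distance_reading(1.2)  # below threshold: activation signal issued
print(cam.active)              # True
```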

According to a further aspect of the present invention, the first sensor system is ultrasound- or radar-based in order to determine the distance of the forklift truck exactly.

According to yet another aspect of the present invention, the forklift truck comprises a second sensor system with a second sensor which is designed to detect that the goods have been picked up by the forklift truck.

According to yet another aspect of the present invention, the controller is further designed to output a second activation signal to the second camera when the second sensor determines that goods have been picked up.

According to yet another aspect of the present invention, the second camera system has an extendable device which is designed to bring the second camera into a position that enables the second side of the goods to be captured. In this way, a rear side of the goods W can be captured.

According to yet another aspect of the present invention, the first camera and the second camera are 3D cameras. In this way, the dimensions of the goods, such as height and width, can be determined.

According to yet another aspect of the present invention, the forklift truck contains the data processing system. This has the advantage that the image analysis can be carried out on the forklift truck in real time and the information about the identified goods can be provided directly to a merchandise management system.

According to yet another aspect of the present invention, the first camera system further comprises a first mirror and a second mirror, each with a device designed to position the mirrors so that they capture a third side and a fourth side of the goods picked up by the forklift truck. In this way, further sides of the goods can be captured, which enables more precise identification of empties.

According to yet another aspect of the present invention, the first camera is attached to a lifting mast of the forklift truck.

According to yet another aspect of the present invention, the second camera system is arranged in one of the forklift truck's forks. This has the advantage that the second camera can capture a lower side of the goods while they are being picked up. For example, it can thus be ensured that no empty crates are missing within a pallet.

According to yet another aspect of the present invention, after the goods have been picked up, the second camera is positioned by the extendable device so that it captures the second side of the goods, the second side corresponding to the rear side facing away from the forklift truck. The extendable device makes it possible to bring the second camera into different positions in order to ensure complete capture of the rear side of the goods.

According to yet another aspect of the present invention, the first mirror and the second mirror are each attached to the forklift truck and, after the goods have been picked up, are positioned by the respective device so that the third and fourth sides of the goods, which correspond to the lateral sides, are reflected, the first camera being aimed at the first mirror and the second mirror after the goods have been picked up. The mirrors are attached to the sides of the forklift truck and can be extended or folded out by the respective device. The first camera can then also capture the image data of the lateral sides of the goods by means of the extended or folded-out mirrors, which improves the identification of the goods.

According to yet another aspect of the present invention, the data processing system in the forklift truck is designed to carry out preprocessing of image data captured by one or more camera systems of the forklift truck, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.
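The preprocessing stages listed above can be sketched as a simple pipeline. This is a non-authoritative illustration: the stage functions are placeholders standing in for the sensor-error filtering, format preparation, sensor data fusion and feature extraction named in the text, operating here on invented toy data rather than real camera frames:

```python
# Sketch of the image preprocessing chain with placeholder stages (toy data).

def filter_sensor_errors(frame):
    # Placeholder for removing sensor-related errors, e.g. dead pixels
    # or invalid depth readings (represented here by None values).
    return [v for v in frame if v is not None]


def prepare_format(frame):
    # Placeholder for converting raw readings into a common data format.
    return [float(v) for v in frame]


def fuse(frames):
    # Placeholder for sensor data fusion: average readings per position.
    return [sum(vals) / len(vals) for vals in zip(*frames)]


def extract_features(frame):
    # Placeholder for feature extraction; real systems would compute
    # geometric and texture descriptors rather than simple statistics.
    return {"mean": sum(frame) / len(frame), "max": max(frame)}


def preprocess(raw_frames):
    cleaned = [prepare_format(filter_sensor_errors(f)) for f in raw_frames]
    fused = fuse(cleaned)
    return extract_features(fused)


features = preprocess([[1, 2, None, 3], [3, 2, 1]])
print(features)  # {'mean': 2.0, 'max': 2.0}
```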

According to yet another aspect of the present invention, the data processing system is further designed to identify the goods, in particular empty crate types and the number of empty crates, from the preprocessed image data by means of one or more of the following classifications: a geometric classification based on geometric data determined from the image data, or their extracted features; a texture-based classification based on color data determined from the image data, and their extracted features. Furthermore, the bottle types of the goods W can also be determined.
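Combining the two classification routes, geometric and texture-based, can be sketched as follows. The crate types, dimensions and colors are invented example data, not values from the patent, and a real system would use far richer descriptors:

```python
# Hypothetical reference data: crate type -> ((height_mm, width_mm), dominant RGB).
CRATE_TYPES = {
    "typeA": ((310, 400), (200, 40, 40)),
    "typeB": ((270, 300), (40, 40, 200)),
}


def geometric_score(dims, ref_dims):
    # Distance between measured and reference dimensions (geometric route).
    return sum(abs(a - b) for a, b in zip(dims, ref_dims))


def texture_score(color, ref_color):
    # Distance between measured and reference dominant color (texture route).
    return sum(abs(a - b) for a, b in zip(color, ref_color))


def classify(dims, color):
    """Pick the crate type minimizing the combined geometric and color distance."""
    return min(CRATE_TYPES,
               key=lambda t: geometric_score(dims, CRATE_TYPES[t][0])
                             + texture_score(color, CRATE_TYPES[t][1]))


print(classify((305, 395), (190, 50, 45)))  # typeA
```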

According to yet another aspect of the present invention, the data processing system is further designed to carry out a statistical estimation and an estimation by machine learning on the basis of the captured image data in order to predict the occupancy of individual empty crates of the picked-up goods. In this way, specific image processing algorithms and machine learning methods (such as k-nearest neighbors, neural networks, etc.) can be combined with one another to ensure reliable detection.
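The mentioned k-nearest-neighbors approach can be sketched with a tiny pure-Python classifier. The 2-D feature vectors and labels are invented for illustration; a real system would feed in features extracted from the preprocessed image data:

```python
# Tiny k-NN sketch for predicting crate occupancy from invented 2-D features
# (e.g. a fill-level score and an edge-density score per crate position).
TRAINING = [
    ((0.90, 0.80), "occupied"),
    ((0.85, 0.90), "occupied"),
    ((0.10, 0.20), "gap"),
    ((0.20, 0.10), "gap"),
]


def knn_predict(x, k=3):
    """Majority vote among the k training samples closest to x."""
    nearest = sorted(TRAINING,
                     key=lambda s: sum((a - b) ** 2 for a, b in zip(x, s[0])))
    votes = [label for _, label in nearest[:k]]
    return max(set(votes), key=votes.count)


print(knn_predict((0.80, 0.75)))  # occupied
```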

According to yet another aspect of the present invention, a system according to claim 13 comprises a forklift truck with: a positioning system which is designed to detect a geographical position of the forklift truck in a warehouse in real time; a first camera system with a first camera which is designed to capture a side of the goods to be picked up facing the forklift truck; a first sensor system with a first sensor which is designed to detect a distance between the forklift truck and goods to be picked up; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; a transmission system which is designed to transmit the image data of the goods captured by the first camera to a data processing system; and a second camera system with a second camera which is arranged in a warehouse and is designed to capture at least one side of the goods facing away from the forklift truck during and after the goods are picked up by the forklift truck, the data processing system being designed to analyze the image data captured by the first camera and the second camera in order to identify the goods. This makes it possible to merge two concepts, namely an external and a frontal image capture of the goods. By combining the image data captured by the first camera of the forklift truck with the image data of the second camera of the system, the goods, in particular types of empties, can be identified.

According to yet another aspect of the present invention, the second camera of the second camera system is attached in the area of a goods unloading point within the warehouse, preferably on the hall ceiling, or at a transit gate of the warehouse through which the forklift truck passes with the picked-up goods.

According to yet another aspect of the present invention, the data processing system is provided by the forklift truck itself.

According to yet another aspect of the present invention, the positioning system of the forklift truck is further designed to tag the image data captured by the first camera with the geographical position, in order to enable a later association of the image data captured by the first camera with the image data captured by the second camera, the image data captured by the first camera and by the second camera each additionally carrying a time stamp.

According to yet another aspect of the present invention, the data processing system is designed to identify the forklift truck by analyzing the image data captured by the second camera, in order to enable the image data captured by the first camera to be associated with the image data captured by the second camera.

According to yet another aspect of the present invention, the data processing system is designed to carry out preprocessing on the basis of the mutually associated image data of the first camera and the second camera, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.

According to yet another aspect of the present invention, the data processing system is further designed to identify the goods from the preprocessed image data, in particular the types of empty crates and the number of empty crates, by means of one or more of the following classifications: (a) a geometric classification based on geometric data determined from the image data, or on features extracted from them, and (b) a texture-based classification based on color data determined from the image data and on features extracted from them.
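
The two classification routes named above can be sketched as a simple nearest-match search against a crate catalogue. This is an illustrative assumption only: the catalogue entries, the hue feature, and the weighting are invented for the example and are not taken from the patent.

```python
# Hedged sketch of (a) geometric and (b) texture-based classification.
# The crate catalogue and feature values are invented for illustration.
CATALOGUE = {
    "crate_A": {"dims": (0.40, 0.30, 0.30), "hue": 0.33},  # greenish crate
    "crate_B": {"dims": (0.35, 0.27, 0.26), "hue": 0.08},  # reddish crate
}

def geometric_score(dims, ref):
    """Sum of absolute deviations of the measured outer dimensions."""
    return sum(abs(a - b) for a, b in zip(dims, ref))

def classify(dims, mean_hue, w_geo=1.0, w_tex=1.0):
    """Combine the geometric and the texture-based (color) classification
    into one score and return the best-matching crate type."""
    best, best_score = None, float("inf")
    for name, ref in CATALOGUE.items():
        score = (w_geo * geometric_score(dims, ref["dims"])
                 + w_tex * abs(mean_hue - ref["hue"]))
        if score < best_score:
            best, best_score = name, score
    return best

label = classify((0.39, 0.31, 0.30), 0.30)  # near crate_A in size and colour
```

In practice the features would come from the preprocessed image data (segmented regions and their color statistics); the sketch only shows how the two classification signals can be combined.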

According to yet another aspect of the present invention, the second camera system comprises a plurality of mirrors which are arranged in the warehouse and which are designed to capture the sides of the goods facing away from the forklift truck, the first camera further being designed to aim at the mirrors during or after the pick-up of the goods.

According to yet another aspect of the present invention, the second camera system is formed by a drone equipped with a camera.

Further advantages and details of the invention are explained in more detail with reference to the exemplary embodiments shown in the schematic figures.

  • Fig. 1A is a perspective view showing a system with a forklift truck according to first embodiments.
  • Fig. 1B is a perspective view showing the forklift truck according to the first embodiments in detail.
  • Fig. 2A is a perspective detail showing a fork of the forklift truck according to the first embodiments before picking up goods.
  • Fig. 2B is a perspective detail showing the fork of the forklift truck according to the first embodiments while picking up goods.
  • Fig. 2C is a perspective detail showing the fork of the forklift truck according to the first embodiments after picking up goods.
  • Fig. 3A is a perspective side view showing the forklift truck according to the first embodiments with a first mirror in a first position.
  • Fig. 3B is a perspective side view showing the forklift truck according to the first embodiments with the first mirror in a second position.
  • Fig. 3C is a perspective top view showing the forklift truck according to the first embodiments with the first mirror and a second mirror in the second position.
  • Fig. 4 illustrates an example of image data processing, showing the segmentation of a captured image.

Detailed description

In the following, a system 100 with a forklift truck 10 according to first embodiments is described in more detail with reference to Fig. 1A and Fig. 1B. The forklift truck 10 shown in Fig. 1A and Fig. 1B comprises a first camera 12 which captures a first side S1 of goods W to be picked up, such as a pallet of empty crates. The field of view of the first camera 12 is indicated in Fig. 1A by a dashed line. In the example shown in Fig. 1A and Fig. 1B, the first camera 12 is attached to a lifting mast 11 of the forklift truck 10. In a further (not shown) example of the first embodiments, the first camera 12 can also be attached to a load carrier 17 of the forklift truck 10.

The forklift truck 10 further comprises a first sensor 16 which determines the distance between the forklift truck 10 and the goods W. For example, the first sensor 16 can determine the distance to the goods W by measuring the transit time of electromagnetic waves or sound waves, as indicated by the dashed line in Fig. 1A. The first sensor 16 comprises an ultrasonic sensor, a radar sensor, or a LIDAR ("Light Detection and Ranging") sensor. In the example shown in Fig. 1A and Fig. 1B, the first sensor 16 is attached to the roof of the forklift truck 10. In further (not shown) examples of the first embodiments, the first sensor 16 can also be attached to the lifting mast 11 of the forklift truck 10, to one of the forks 13, or to the load carrier 17 of the forklift truck.

A controller 20 of the first embodiments, comprised by the forklift truck 10 as shown in Fig. 1B, can output a first activation signal to the first camera 12 when the distance determined by the first sensor 16 falls below a minimum distance. The first camera 12 is activated by the first activation signal and begins to record image data of the first side S1 of the goods W. For example, the first camera captures the front side of the goods W to be picked up, facing the forklift truck 10.
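
The distance-triggered activation described above can be sketched as a small control loop. This is a minimal sketch under stated assumptions: the class name `Controller`, the threshold `MIN_DISTANCE_M`, and the signal string are illustrative, not part of the patent.

```python
# Minimal sketch (assumed names) of the controller logic: the first
# activation signal is issued once the measured distance to the goods
# falls below a configurable minimum distance, and only once.
MIN_DISTANCE_M = 1.5  # assumed threshold in metres

class Controller:
    def __init__(self, min_distance=MIN_DISTANCE_M):
        self.min_distance = min_distance
        self.first_camera_active = False

    def on_distance_reading(self, distance_m):
        """Called for every reading of the first sensor (16)."""
        if distance_m < self.min_distance and not self.first_camera_active:
            self.first_camera_active = True
            return "FIRST_ACTIVATION_SIGNAL"  # wakes the first camera (12)
        return None

ctrl = Controller()
signals = [ctrl.on_distance_reading(d) for d in (5.0, 3.2, 1.4, 0.9)]
# the signal is emitted exactly once, at the first reading below 1.5 m
```

The latch (`first_camera_active`) reflects the behaviour described in the text: once activated, the camera keeps recording and no further activation signal is needed while the truck approaches the goods.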

In addition to the first camera 12, the forklift truck 10 of the first embodiments comprises a second camera 14 which, during or after pick-up, captures further sides of the goods W that differ from the first side S1. In the example of the first embodiments shown in Fig. 1A and Fig. 1B, the second camera 14 is integrated into one of the forks 13 of the forklift truck 10. In a further (not shown) example of the first embodiments, the second camera 14 can also be attached to the lifting mast 11 of the forklift truck 10.

The forklift truck 10 of the first embodiments can preferably comprise a second sensor 18 which determines that the goods W have been picked up by the forklift truck 10. For example, the second sensor 18 shown in Fig. 1B comprises an ultrasonic sensor. The second sensor 18 can comprise further sensors, such as a load sensor which detects the pick-up of the goods W by means of a load indication. In the example shown in Fig. 1B, the second sensor 18 is integrated into one of the forks 13 of the forklift truck 10. In a further (not shown) example of the first embodiments, the second sensor 18 can also be attached to the lifting mast 11 or to the load carrier 17 of the forklift truck 10.

Preferably, when the second sensor 18 determines that the goods W have been picked up, the controller 20 of the forklift truck 10 can output a second activation signal to the second camera 14 of the forklift truck 10 in order to activate it. For example, the second camera 14 captures the lower side of the goods W while they are being picked up by the forks 13 of the forklift truck 10. In a further example, the second camera captures the rear side of the goods W, facing away from the forklift truck 10, after they have been picked up. This is described in more detail later with reference to Fig. 2A to Fig. 2C.

Furthermore, the movement of the forklift truck 10 in the area of a goods unloading point within the warehouse offers the possibility of capturing image data of the sides of the goods W facing away from the forklift truck 10 with a small number of external cameras 114. The system 100 of the first embodiments shown in Fig. 1A can therefore comprise one or more external cameras 114 which are arranged in a warehouse and which capture at least one side of the goods W facing away from the forklift truck 10 during and after the goods W are picked up by the forklift truck 10.

Preferably, an external camera 114 can be mounted on a hall ceiling of the warehouse in order to capture the upper side of the goods W. In a further (not shown) example, an external camera 114 can be mounted on a lamp post in the warehouse in order to capture a lateral side of the goods W. In yet another (not shown) example, an external camera 114 can be mounted at a gate of the warehouse through which the forklift truck passes with the picked-up goods W, in order to capture a lateral side of the goods W.

Preferably, an external camera 114 can be carried by a flight-capable drone which films the forklift truck 10 during and after the pick-up of the goods W. For example, when the controller outputs the first activation signal, a signal with the current geographical coordinates of the forklift truck 10 can also be output to the drone. The drone then captures the remaining sides of the goods W.

In a further example according to the first embodiments, an external camera 114 can receive the first or the second activation signal output by the controller 20 of the forklift truck 10 and thereupon begin to transmit the image data captured by the external camera 114 to a data processing system 24.

The number of cameras and sensors that the forklift truck 10 and the system 100 comprise can vary. In a (not shown) example of second embodiments, the forklift truck 10 can comprise only one camera and the system 100 only one external camera. In a further (not shown) example of the second embodiments, the forklift truck 10 can comprise only one sensor and the system 100 one or more sensors.

Preferably, the cameras 12, 14 of the forklift truck 10 and the external cameras 114 of the system 100 are equipped with illumination, so that the cameras can achieve correct image data acquisition through sufficient lighting even in dark environments.

Preferably, the cameras are 3D cameras operating on the time-of-flight principle, in order to additionally enable the depth data of the goods W to be determined. The cameras can also comprise network cameras which provide digital output signals in the form of a video stream that can be transmitted via the Internet Protocol (IP).

Furthermore, the forklift truck 10 comprises a transmission unit 22, as shown in Fig. 1B, which can transmit the image data captured by the first camera 12 and the second camera 14 of the forklift truck 10 to a data processing system 24 for the identification of the goods W.

In addition, the forklift truck 10 can comprise a positioning system (not shown) which detects the geographical position of the forklift truck in the warehouse in real time. For example, the positioning system can comprise a GPS sensor which makes it possible to determine the geographical coordinates of the forklift truck 10 in real time. The coordinates determined in this way can be transmitted to the data processing system 24 by the transmission unit 22, together with the captured image data of the forklift truck 10 and a time stamp. Based on the transmitted coordinates and the time stamp, the data processing system 24 can then combine the image data captured by the forklift truck 10 with the image data captured by an external camera 114 of the system 100.
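
The combination of truck-side and external frames by coordinates and time stamp can be sketched as a proximity match in time and space. The frame layout, the thresholds `max_dt_s` and `max_dist_m`, and the field names are illustrative assumptions.

```python
# Hedged sketch (assumed data shapes and thresholds): associate a frame
# from the forklift's first camera with an external-camera frame by
# requiring the time stamps and geographical positions to be close.
import math

def associate(truck_frames, external_frames, max_dt_s=2.0, max_dist_m=5.0):
    """Return (truck_id, external_id) pairs that match in time and space.

    Each frame is a dict with 'id', 't' (seconds) and 'pos'
    ((x, y) in metres, warehouse coordinates).
    """
    pairs = []
    for tf in truck_frames:
        for ef in external_frames:
            dt = abs(tf["t"] - ef["t"])
            dist = math.dist(tf["pos"], ef["pos"])
            if dt <= max_dt_s and dist <= max_dist_m:
                pairs.append((tf["id"], ef["id"]))
    return pairs

truck = [{"id": "T1", "t": 100.0, "pos": (10.0, 4.0)}]
ext = [{"id": "E1", "t": 100.5, "pos": (12.0, 4.5)},
       {"id": "E2", "t": 160.0, "pos": (10.0, 4.0)}]
matches = associate(truck, ext)  # only E1 is close in both time and space
```

The second external frame is rejected because its time stamp differs by a minute even though the position matches, which is exactly why the patent pairs the geographical position with an additional time stamp.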

Alternatively, or in addition to the positioning system of the forklift truck 10, the forklift truck 10 can be identified by analyzing the image data of an external camera 114 of the system 100, for example via a QR code, in order to enable those image data to be associated with the image data of the forklift truck 10.
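
Once the forklift identity has been read from an external frame (for example from a decoded QR code on the truck), the association reduces to grouping external frames per truck. The data shapes below are assumptions for illustration; the QR decoding itself would be done by a dedicated library and is not shown.

```python
# Sketch (assumed data shapes): group external-camera frames by the
# forklift ID decoded from the image, so each group can be matched to
# that truck's own first-camera images.
from collections import defaultdict

def group_by_truck(external_frames):
    """external_frames: list of dicts with 'truck_id' (decoded QR payload)
    and 'image' (opaque image handle)."""
    groups = defaultdict(list)
    for frame in external_frames:
        groups[frame["truck_id"]].append(frame["image"])
    return dict(groups)

frames = [{"truck_id": "FL-10", "image": "img_a"},
          {"truck_id": "FL-07", "image": "img_b"},
          {"truck_id": "FL-10", "image": "img_c"}]
per_truck = group_by_truck(frames)
```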

With reference to Fig. 2A to Fig. 2C, the acquisition of the image data by the second camera 14 of the forklift truck 10 according to the first embodiments is described in more detail. Fig. 2A shows a perspective detail of the forklift truck 10 according to the first embodiments, showing the fork 13 with the second camera 14 before the goods W are picked up. The second camera 14 can preferably be constructed with shock-resistant and/or waterproof materials which protect it from impacts during the pick-up of goods and/or from moisture. For example, the second camera 14 can be arranged in a shock-resistant and/or waterproof housing which is integrated into the fork 13 of the forklift truck.

Fig. 2B shows a perspective detail of the forklift truck 10 according to the first embodiments, showing the fork 13 with the second camera 14 while the goods W are being picked up. Preferably, the second camera can comprise a 3D camera which determines the dimensions of the goods W during their pick-up by means of a time-of-flight (TOF) method, as shown in Fig. 2B. In this way it can be ensured, for example, that no empty crates are missing from a pallet.
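
The completeness check enabled by the TOF dimensions can be sketched as a plausibility test: from the measured outer dimensions of the load, estimate how many crates the stack should contain and compare with the count obtained elsewhere. The crate dimensions in `CRATE` are an invented example, not values from the patent.

```python
# Sketch (assumed crate dimensions) of the TOF plausibility check:
# estimate the crate count implied by the measured load dimensions
# and flag pallets where crates are missing.
CRATE = (0.40, 0.30, 0.30)  # assumed crate length/width/height in metres

def expected_crates(load_dims, crate=CRATE):
    l, w, h = load_dims
    cl, cw, ch = crate
    # round() absorbs small measurement and floating-point errors
    return round(l / cl) * round(w / cw) * round(h / ch)

def pallet_complete(load_dims, counted, crate=CRATE):
    """True if the crate count from image analysis matches the count
    implied by the measured load dimensions."""
    return counted == expected_crates(load_dims, crate)

# a 1.2 m x 0.9 m x 0.6 m stack should hold 3 * 3 * 2 = 18 crates
ok = pallet_complete((1.2, 0.9, 0.6), 18)
short = pallet_complete((1.2, 0.9, 0.6), 17)
```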

Fig. 2C shows a perspective detail of the forklift truck 10 according to the first embodiments, showing the fork 13 with the second camera 14 after the goods W have been picked up. Preferably, the second camera can be connected to an extendable device 30 (not shown). The extendable device comprises a telescopic device. This makes it possible to position the second camera 14 in such a way that the rear side of the goods W, facing away from the forklift truck, can be captured.

In a further example of the first embodiments, the second camera can be attached to the lifting mast 11 of the forklift truck (not shown). In this case, the second camera 14 can be connected to the extendable device 30, so that the second camera 14 is positioned in such a way that it captures the upper side of the goods W or a lateral side of the goods W.

As an alternative or in addition to the cameras comprised by the forklift truck 10, the forklift truck 10 can comprise one or more mirrors.

With reference to Fig. 3A to Fig. 3C, the acquisition of the image data according to the first embodiments with a first mirror 26 and a second mirror 28 attached to the forklift truck 10 is described in more detail. Fig. 3A shows a perspective view of the forklift truck 10 with the first mirror 26 in a first position. Fig. 3B shows a perspective view of the forklift truck 10 with the first mirror 26 in a second position. Fig. 3C shows a perspective top view of the forklift truck 10 with the first mirror 26 and the second mirror 28 in the second position.

In Fig. 3A, the first mirror 26, which is connected to an extendable/fold-out device, is attached to the side of the forklift truck 10 and is in the first position, that is, in a retracted/folded-in position. Likewise, the second mirror 28 is attached to the opposite side of the forklift truck 10 (not shown). After the goods W have been picked up by the forklift truck 10, the first mirror 26 and the second mirror 28 can be brought into the second position, that is, extended/folded out, in order to reflect the lateral sides S3, S4 of the goods W. This is shown in Fig. 3B. Preferably, the first mirror 26 and the second mirror 28 can be extended when the second sensor 18 determines that the goods W have been picked up.

Preferably, the first camera 12 can be rotated by up to 90° in two directions in order to vary its field of view. This makes it possible to aim at the first mirror 26 and the second mirror 28, as shown in Fig. 3C, in order to capture the image data of the lateral sides of the goods W.

In a further example according to the first embodiments, the system 100 can also comprise one or more mirrors. For example, as an alternative or in addition to the external cameras 114, one or more mirrors can be mounted in the warehouse in the area of the goods unloading point (not shown) in order to reflect at least one side of the goods W facing away from the forklift truck 10. In one example, the first camera 12 of the forklift truck 10 can then aim at the mirror in the warehouse to capture the image data.

Image data processing is explained in detail below. After the image data from the first camera 12 and the second camera 14 of the forklift truck 10, as well as from the external cameras 114 of the system 100, have been captured and transmitted to the data processing system 24, data processing takes place.

Preferably, the forklift truck 10 contains the data processing system 24. The data processing system 24 can also be contained in an external computing unit, for example a central computer.

Using the coordinates of the forklift truck 10 in the warehouse determined by its positioning system, the image data captured by the forklift truck 10 can be combined with the image data captured by the external cameras 114. In a first step, the image data are preprocessed.

In the preprocessing, sensor-related errors such as noise, overexposure, etc. are preferably filtered out of the image data automatically. The data format is then prepared, for example by converting the image data to greyscale or by transforming the depth data of the goods W in order to determine the dimensions of the goods W. Preferably, the data determined by the sensors of the forklift truck 10 and of the system 100 are also fused with the image data.
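The preprocessing steps named above (clipping overexposed pixels, greyscale conversion, deriving rough dimensions from depth data) could be sketched as follows; the pinhole pixel-pitch model, the background-plane heuristic and all thresholds are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def preprocess(rgb, depth, pixel_pitch_m):
    """Illustrative preprocessing, assuming an RGB image (H, W, 3, uint8)
    and a registered depth map in metres (H, W, float)."""
    # Clip overexposed pixels, then convert to greyscale (BT.601 weights).
    rgb = np.clip(rgb, 0, 250)
    grey = np.rint(0.299 * rgb[..., 0] + 0.587 * rgb[..., 1]
                   + 0.114 * rgb[..., 2]).astype(np.uint8)
    # Rough object dimensions: pixel extent of everything closer than the
    # background plane, scaled by distance and the (assumed) pixel pitch.
    mask = depth < np.median(depth)          # foreground = closer pixels
    ys, xs = np.nonzero(mask)
    z = float(depth[mask].mean())            # mean object distance in metres
    width_m = (xs.max() - xs.min()) * pixel_pitch_m * z
    height_m = (ys.max() - ys.min()) * pixel_pitch_m * z
    return grey, (width_m, height_m, z)
```

The greyscale image would feed the texture-based path and the derived dimensions the geometric path described later in the text.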

A machine segmentation of the captured image data is preferably carried out, as shown in Fig. 4. Many automatic segmentation methods are possible. They are commonly divided into pixel-, edge- and region-oriented methods. In addition, a distinction is made between model-based methods, which assume a certain shape of the objects, and texture-based methods, which also take an internal homogeneous structure of the objects into account. Different methods can also be combined to achieve better results.
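As one concrete instance of the region-oriented methods mentioned, a minimal 4-connected component labelling over a binary foreground mask might look like this (pure NumPy with a BFS flood fill, purely illustrative; production systems would use an optimised two-pass labelling):

```python
import numpy as np
from collections import deque

def segment_regions(mask):
    """Label 4-connected foreground regions of a boolean mask.
    Returns (label image, number of regions found)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue                      # pixel already belongs to a region
        current += 1
        queue = deque([(y, x)])
        labels[y, x] = current
        while queue:                      # breadth-first flood fill
            cy, cx = queue.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                           (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    queue.append((ny, nx))
    return labels, current
```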

Preferably, features can be extracted from the captured image data. Examples include, but are not limited to, SIFT ("Scale-Invariant Feature Transform"), HOG ("Histogram of Oriented Gradients") and SURF ("Speeded Up Robust Features").

SIFT is an algorithm for detecting and describing local features in images. The detector and the feature descriptors are, within certain limits, invariant to coordinate transformations such as translation, rotation and scaling. They are also robust against lighting variation, image noise and minor higher-order geometric deformation, such as arises from the projective imaging of an object from different viewpoints in space.

Similar to SIFT, HOG is a feature-extraction algorithm that represents the appearance and shape of objects within an image, even without detailed knowledge of the positions of edges or corners, through the distribution of local intensity and the arrangement of edges. To this end, the image is divided into sub-regions; for each sub-region, the orientations of all edges are determined and their counts are stored as a histogram.

The SURF algorithm is based on the same principles and steps as SIFT, but the details of each step differ. The algorithm consists of three main parts: interest point detection, description of the local neighbourhood, and matching.
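To make the HOG description above concrete, a gradient-orientation histogram for a single cell can be sketched in plain NumPy; this is a minimal sketch that omits the cell grids and block normalisation of full HOG implementations:

```python
import numpy as np

def hog_cell(grey, bins=9):
    """Gradient-orientation histogram of one image cell (unsigned
    orientations in [0, 180) degrees, magnitude-weighted, L2-normalised)."""
    gx = np.zeros(grey.shape, dtype=float)
    gy = np.zeros(grey.shape, dtype=float)
    # Central differences for the interior pixels.
    gx[:, 1:-1] = grey[:, 2:].astype(float) - grey[:, :-2]
    gy[1:-1, :] = grey[2:, :].astype(float) - grey[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
    return hist / (np.linalg.norm(hist) + 1e-9)
```

For a purely vertical edge, for example, all gradient energy falls into the first (horizontal-gradient) orientation bin.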

In a second step, the goods W, in particular the empties types of the picked-up pallet(s), are identified based on one of the procedures described below.

A geometric classification by means of (optionally segmented) depth data, or features extracted from them, such as the height, depth, width and number of crates obtained from the preprocessed image data, is made possible by comparison with data from a previously created database. This allows the empties types, in particular the crate types of the goods W, to be identified.
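A minimal sketch of such a database comparison, matching measured dimensions against a reference table by nearest neighbour; the crate names and dimension values are invented for illustration and do not come from the patent:

```python
# Hypothetical reference database: crate type -> (height, width, depth) in m.
CRATE_DB = {
    "crate_20x0.5L": (0.30, 0.40, 0.27),
    "crate_24x0.33L": (0.27, 0.43, 0.29),
}

def classify_geometry(dims, db=CRATE_DB):
    """Return the crate type whose reference dimensions have the smallest
    Euclidean distance to the measured dimensions."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(dims, ref)) ** 0.5
    return min(db, key=lambda name: dist(db[name]))
```

A real system would additionally threshold the best-match distance to reject unknown load types instead of always returning the nearest entry.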

A texture-based classification by means of (optionally segmented) colour data, or features extracted from them, comprises the application and evaluation of statistical methods for the computer-aided recognition of patterns in images. Through statistical evaluation of the preprocessed image data (for example by determining the median hue) and the use of machine learning (for example "shallow learning" or "deep learning"), an algorithm can be trained with a sufficient amount of real data or features derived from it. This trained algorithm can then be validated and verified against real data and subsequently used for empties recognition.
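The median-hue statistic mentioned as an example of such a statistical evaluation could be computed as follows, using Python's standard colorsys module (a sketch; real pipelines would vectorise this over whole image regions):

```python
import colorsys

def median_hue(pixels):
    """Median hue (in [0, 1)) of an iterable of (R, G, B) pixels, each
    channel in 0..255 -- a simple colour feature for classification."""
    hues = sorted(colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
                  for r, g, b in pixels)
    n = len(hues)
    if n % 2:
        return hues[n // 2]
    return (hues[n // 2 - 1] + hues[n // 2]) / 2
```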

Preferably, the identification of the empties types comprises a combination of the geometric and texture-based classifications with fused sensor data.

Preferably, a prediction of the most likely occupancy of the individual empties crates is made by analysing all visible sides of the goods W, for example two per pallet. This is done using a statistical estimate and/or an estimate based on machine learning methods.
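One simple statistical estimate of the kind described, pooling the bottle-slot observations from all visible sides, might look like this; the Laplace-smoothed frequency is an illustrative choice, not the patent's method:

```python
def estimate_occupancy(visible_slots):
    """Estimate the occupancy probability of a crate's slots from the
    slots visible on each side (1 = bottle seen, 0 = empty slot).
    Laplace smoothing keeps the estimate away from 0 and 1 when only a
    few slots are visible."""
    occupied = sum(sum(side) for side in visible_slots)
    total = sum(len(side) for side in visible_slots)
    return (occupied + 1) / (total + 2)
```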

In the last step, the corresponding results of the image data processing are transmitted to an enterprise resource planning system (WWS) via a standardised communication path (for example a network protocol such as the Simple Object Access Protocol, SOAP) and in standardised data formats (for example CSV, XML, JSON-RPC). A production planning and control system (PPS) can thus work with high-quality empties data.
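A sketch of packaging a recognition result in two of the standardised formats named, a JSON-RPC-style envelope and CSV, using only the Python standard library; the field and method names are assumptions for illustration, not defined by the patent:

```python
import csv
import io
import json

def export_results(result):
    """Serialise a recognition result dict as a JSON-RPC 2.0 request
    body and as a one-row CSV document; returns (json_text, csv_text)."""
    rpc = json.dumps({"jsonrpc": "2.0", "method": "reportEmpties",
                      "params": result, "id": 1})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(result))
    writer.writeheader()
    writer.writerow(result)
    return rpc, buf.getvalue()
```

Either payload could then be posted over the transmission unit's network link to the WWS endpoint.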

Various embodiments have been described in detail above. It is clear to a person skilled in the art that particular features of particular embodiments can be combined with features of other embodiments.

Claims (15)

  1. A forklift truck (10), comprising:
    a first camera system with a first camera (12) configured to capture a first side (S1) of goods (W) to be picked up;
    a second camera system with a second camera (14) configured to capture a second side of the goods while or after picking them up with the forklift, the first side being different from the second side;
    characterized by
    a first sensor system with a first sensor (16) configured to detect a distance of the forklift truck from the goods;
    a controller (20) configured to output a first enabling signal to the first camera when the distance detected by the first sensor is less than a minimum distance; and
    a transmission unit (22) configured to transmit the image data of the goods captured by the first camera and by the second camera, respectively, to a data processing system (24) for identifying the goods.
  2. The forklift truck according to claim 1, wherein the first sensor system is an ultrasonic or radar system.
  3. The forklift truck according to any one of claims 1 to 2, further comprising:
    a second sensor system with a second sensor (18) configured to detect that the forklift truck is picking up the goods.
  4. The forklift truck according to claim 3, wherein the controller is further configured to output a second enabling signal to the second camera when it is determined by the second sensor that goods have been picked up.
  5. The forklift truck according to any one of claims 1 to 4, wherein the second camera system includes an extendible device (30) configured to move the second camera to a position enabling capturing of the second side of the goods.
  6. The forklift truck according to any one of claims 1 to 5, wherein the first camera and the second camera are 3D cameras capable of detecting the dimensions of the goods and/or wherein the forklift truck includes the data processing system.
  7. The forklift truck according to any one of claims 1 to 6, wherein the first camera system further comprises a first mirror (26) and a second mirror (28) each having a device (32) configured to position the mirrors for capturing a third side (S3) and a fourth side (S4) of the goods picked up by the forklift truck.
  8. The forklift truck according to any one of claims 1 to 7, wherein the first camera is mounted to a mast (11) of the forklift truck, or
    wherein the second camera system is arranged in one of the forks (13, 15) of the forklift truck and wherein the second camera captures the lower side of the goods while the goods are picked up.
  9. The forklift truck according to claim 5 and claim 8, wherein the second camera is positioned by the extendible device so as to capture the second side of the goods after the goods have been picked up and wherein the second side of the goods corresponds to the rear side facing away from the forklift truck.
  10. The forklift truck according to any one of claims 1 to 6, 8 or 9 and according to claim 7,
    wherein the first mirror and the second mirror are respectively mounted to the forklift truck and are positioned by the respective device after the goods have been picked up such that the third and fourth sides of the goods corresponding to the lateral sides are reflected, wherein the first mirror and the second mirror are aimed at by the first camera after the goods have been picked up.
  11. The forklift truck according to any one of claims 1 to 10 having a data processing system, wherein the data processing system is configured to perform preprocessing of image data captured by one or more camera systems of the forklift truck, wherein the preprocessing comprises filtering out of sensor-related errors, preparing the data format of the image data, merging of the sensor data, extracting features, and machine segmentation, wherein preferably the data processing system is further configured to perform an identification of the goods, in particular of types of crates with empty bottles as well as of a number of the crates with empty bottles, using the preprocessed image data by means of one or more of the following classifications:
    (a) a geometric classification using geometric data determined from the image data or their extracted features,
    (b) a texture-based classification using colour data determined from the image data or their extracted features.
  12. The forklift truck according to claim 11, wherein the data processing system is further configured to perform a statistical estimation and an estimation by means of machine learning based on the captured image data to determine a prediction of the occupancy of individual crates with empty bottles of the picked-up goods.
  13. A system (100), comprising:
    a forklift truck (110) having:
    a positioning system configured to detect a geographical position of the forklift truck in a warehouse in real time;
    a first camera system with a first camera (112) configured to capture a side of goods (W) to be picked up facing the forklift truck;
    a first sensor system with a first sensor configured to detect a distance of the forklift truck from goods to be picked up;
    a controller configured to output a first enabling signal to the first camera when the distance detected by the first sensor is less than a minimum distance;
    a transmission system configured to transmit the image data of the goods captured by the first camera to a data processing system;
    a second camera system with a second camera (114) located in a warehouse and configured to capture at least one side of the goods facing away from the forklift truck during and after picking up the goods with the forklift truck,
    wherein the data processing system is configured to analyze the image data captured by the first camera and the second camera to thereby identify the goods.
  14. The system according to claim 13, wherein
    the second camera of the second camera system is installed in an area for unloading goods within the warehouse, preferably on the ceiling; or
    the second camera of the second camera system is installed at a passage gate of the warehouse which the forklift truck with the picked-up goods passes.
  15. The system according to any one of claims 13 or 14, wherein the data processing system is provided by the forklift truck itself, or
    wherein the positioning system of the forklift truck is further configured to provide the image data captured by the first camera with the geographical position, thereby enabling a subsequent association of the image data captured by the first camera with the image data captured by the second camera, wherein the image data captured by the first camera and the image data captured by the second camera each additionally comprise a time stamp, or
    wherein the data processing system is configured to identify the forklift truck by analyzing the image data captured by the second camera to enable association of the image data captured by the first camera with the image data captured by the second camera, or
    wherein the data processing system is configured to perform preprocessing on the basis of the associated image data of the first camera and the second camera, the preprocessing comprising filtering out sensor-related errors, preparing the data format of the image data, merging of sensor data, extracting features and machine segmentation, and/or
    wherein the data processing system is further configured to perform an identification of the goods, in particular of types of crates with empty bottles as well as of a number of the crates with empty bottles, using the preprocessed image data by means of one or more of the following classifications:
    (a) a geometric classification using geometric data determined from the image data or their extracted features, and
    (b) a texture-based classification using colour data determined from the image data or their extracted features or
    wherein the second camera system comprises a plurality of mirrors located in the warehouse and configured to capture the sides of the goods facing away from the forklift truck, the first camera being further configured to be aimed at the mirrors while or after picking up the goods, or
    wherein the second camera system is constituted by a drone with a camera.
EP20156789.8A 2019-02-15 2020-02-12 Forklift and system with forklift for the identification of goods Active EP3696135B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102019202076.3A DE102019202076A1 (en) 2019-02-15 2019-02-15 FORKLIFT AND SYSTEM WITH FORKLIFT FOR IDENTIFICATION OF GOODS

Publications (2)

Publication Number Publication Date
EP3696135A1 EP3696135A1 (en) 2020-08-19
EP3696135B1 true EP3696135B1 (en) 2021-12-29

Family

ID=69571899

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20156789.8A Active EP3696135B1 (en) 2019-02-15 2020-02-12 Forklift and system with forklift for the identification of goods

Country Status (2)

Country Link
EP (1) EP3696135B1 (en)
DE (1) DE102019202076A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020211892A1 (en) 2020-09-23 2022-03-24 Robert Bosch Gesellschaft mit beschränkter Haftung Method for object recognition, computer program, storage medium, object recognition device and surveillance arrangement
DE102021108146A1 (en) 2021-03-31 2022-10-06 Bayerische Motoren Werke Aktiengesellschaft Method and device for unloading a vehicle
GB2620859A (en) * 2021-10-05 2024-01-24 Wrs Solutions Ltd Goods monitoring system and method of use thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3624486A1 (en) * 1986-07-19 1988-01-28 Thomas Enkelmann Computer METHOD AND DEVICE FOR REDUCING THE ACTIVE AND PASSIVE HAZARD IN VEHICLES AND / OR CONVEYOR DEVICES
EP2184254B1 (en) * 2008-11-11 2013-01-09 Deutsche Post AG Forklift truck with a guidance and collision warning device
US8965561B2 (en) * 2013-03-15 2015-02-24 Cybernet Systems Corporation Automated warehousing using robotic forklifts
KR20150000317U (en) * 2013-07-12 2015-01-21 현대중공업 주식회사 Forklift truck

Also Published As

Publication number Publication date
EP3696135A1 (en) 2020-08-19
DE102019202076A1 (en) 2020-08-20


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200212

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210721

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502020000480

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1458554

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220115

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220329

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220329

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220330

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220429

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220429

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502020000480

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220212

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220212

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220228

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230222

Year of fee payment: 4

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230222

Year of fee payment: 4

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230222

Year of fee payment: 4

Ref country code: IT

Payment date: 20230228

Year of fee payment: 4

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230228

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230228

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20240222

Year of fee payment: 5