EP3696135A1 - Forklift truck and system with a forklift truck for identifying goods - Google Patents

Forklift truck and system with a forklift truck for identifying goods

Info

Publication number
EP3696135A1
Authority
EP
European Patent Office
Prior art keywords
camera
goods
forklift
image data
designed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP20156789.8A
Other languages
German (de)
English (en)
Other versions
EP3696135B1 (fr)
Inventor
Andreas Plettner
Armin Lang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Technische Universitaet Muenchen
Indyon GmbH
Original Assignee
Technische Universitaet Muenchen
Indyon GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Technische Universitaet Muenchen, Indyon GmbH filed Critical Technische Universitaet Muenchen
Publication of EP3696135A1
Application granted
Publication of EP3696135B1
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075Constructional features or details
    • B66F9/0755Position control; Position detectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075Constructional features or details
    • B66F9/20Means for actuating or controlling masts, platforms, or forks
    • B66F9/24Electrical devices or systems

Definitions

  • the invention relates to a forklift truck and a system with a forklift truck for identifying goods that are to be picked up or have already been picked up.
  • In the returnable system, bottles are used that can be reused for filling immediately after cleaning and inspection.
  • In the one-way system, by contrast, the products are broken down into their basic components after being taken back and are processed further, for example into glass or plastic granulate.
  • returnable bottles are more environmentally friendly than single-use bottles.
  • the energy and resource consumption for return transport and cleaning is lower for returnable bottles than the additional manufacturing costs for disposable bottles. This applies all the more, the more regional the distribution and the higher the number of refills.
  • Distinguishing reusable bottles from non-returnable bottles by shape and quality is usually only possible in the case of plastic bottles (mostly made of the material PET, polyethylene terephthalate).
  • Such reusable bottles are characterized by a greater wall thickness and thus greater stability, since the bottles have to withstand a significantly longer life cycle.
  • the external empties management includes the actual acceptance of empties within the beverage trade from the customers, the local or external pre-sorting, and the logistical link or the transfer of the empties to the beverage manufacturer's destination.
  • internal empties management includes the delivery of empties to the beverage manufacturer itself, the data recording and storage of the goods, cleaning and inspection, as well as the appropriate provision of empties at a suitable point in production.
  • the used reusable bottles returned by the customer are usually pre-sorted at the dealer's location or by special service providers. This pre-sorting is mostly done manually in the beverage return points or also automatically by the beverage return machines installed in the retail trade. With these service providers, this pre-sorting is also carried out manually or automatically. These pre-sorting processes in the external empties management ensure a relatively high degree of purity of the beverage crates, so that it can be assumed that the right bottles are contained within a crate.
  • the delivery of empties to the beverage manufacturer is usually done in box form on standardized, reusable transport pallets.
  • the amount of empties that has to be processed here is very high for a large beverage manufacturer. For example, around 30,000 trucks with 36 pallets each are processed at the Gerolsteiner mineral spring.
  • the process in the prior art is as follows: The forklift drivers have to unload the truck with very short deadlines. While the forklift driver takes the pallets on the fork, he usually has to record the number of pallets and the crate / bottle type using a terminal in the forklift. This data then flows directly into the merchandise management system, which provides the data for the subsequent storage and production processes.
  • Another example of such goods is small load carriers (KLTs).
  • These load carriers are used to hold special components that are used, for example, in automobile production.
  • These load carriers are usually made to accommodate very specific items and are therefore located in a circulatory system between the automobile manufacturer and its suppliers.
  • the invention is based on the object of providing a dynamic, mobile, and real-time capable quantity and type detection of goods during unloading or loading, which reliably detects incorrect sorting.
  • a forklift truck comprises: a first camera system with a first camera which is designed to capture a first side of goods to be picked up; a second camera system with a second camera which is designed to capture a second side of the goods during or after they have been picked up by the forklift truck, the first side differing from the second side; a first sensor system with a first sensor which is designed to detect a distance between the forklift truck and the goods; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; and a transmission unit which is designed to transmit the image data of the goods captured by the first camera and by the second camera to a data processing system for identifying the goods.
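The trigger behavior in the bullet above (the controller emits a first activation signal once the measured distance falls below a minimum distance) can be pictured with a minimal sketch. The class name, the callback and the 1.5 m threshold are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ForkliftController:
    """Minimal sketch: activate the first camera once the distance reported
    by the first sensor drops below a configurable minimum distance."""
    min_distance_m: float = 1.5                      # assumed threshold
    on_activate_camera1: Callable[[], None] = lambda: None
    _camera1_active: bool = field(default=False, init=False)

    def on_distance_reading(self, distance_m: float) -> None:
        # The first sensor (ultrasound/radar) reports the current distance.
        if distance_m < self.min_distance_m and not self._camera1_active:
            self._camera1_active = True
            self.on_activate_camera1()               # "first activation signal"

# Usage: simulate the forklift approaching a pallet.
controller = ForkliftController(
    on_activate_camera1=lambda: print("camera 1: start capturing side S1"))
for reading in [4.2, 3.1, 2.0, 1.4, 0.9]:
    controller.on_distance_reading(reading)
```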
  • the image data of the goods to be picked up or already picked up, in particular a pallet of empty crates, are generated in real time and transferred to the data processing system.
  • Analysis of the image data has the advantage that the goods, in particular the type of the empties, can be determined. This allows precise recording of the current stocks of empties and thus a more efficient production process.
  • the first sensor system is based on ultrasound or radar in order to determine the distance between the forklift truck and the goods precisely.
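Ultrasound and radar rangefinders of this kind typically derive the distance from the round-trip time of the emitted pulse, distance = speed x time / 2. A small worked example; the echo times are assumed sample values, the propagation speeds are the usual physical constants.

```python
def distance_from_echo(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance = propagation speed * round-trip time / 2 (out and back)."""
    return wave_speed_m_s * round_trip_s / 2.0

# Ultrasound (speed of sound in air ~343 m/s) and radar (speed of light).
print(distance_from_echo(0.01, 343.0))          # ~1.7 m
print(distance_from_echo(10e-9, 299_792_458))   # ~1.5 m
```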
  • the forklift truck comprises a second sensor system with a second sensor which is designed to detect that the forklift truck has picked up the goods.
  • the controller is further designed to output a second activation signal to the second camera when it is determined by the second sensor that goods have been picked up.
  • the second camera system has an extendable device which is designed to bring the second camera into a position which enables the second side of the goods to be detected. A rear side of the goods W can thus be detected.
  • the first camera and the second camera are 3D cameras. In this way, the dimensions of the goods such as height and width can be determined.
  • the forklift includes the data processing system. This has the advantage that the image analysis on the forklift truck can be carried out in real time and the information about the identified goods can be provided directly to an inventory management system.
  • the first camera system further comprises a first mirror and a second mirror, each with a device which is designed to position the mirrors in such a way that they capture a third side and a fourth side of the goods picked up by the forklift truck. In this way, other sides of the goods can be recorded, which enables more precise identification of the empties.
  • the first camera is attached to a mast of the forklift.
  • the second camera system is arranged in one of the forks of the forklift truck. This has the advantage that the second camera can capture a lower side of the goods while the goods are being picked up. In this way it can be ensured, for example, that no empty crates are missing from a pallet.
  • the second camera is positioned in such a way that it captures the second side of the goods, the second side of the goods corresponding to the rear side facing away from the forklift truck.
  • the extendable device enables the second camera to be brought into different positions in order to ensure complete coverage of the rear side of the goods.
  • the first mirror and the second mirror are each attached to the forklift truck and, after the goods have been picked up, are positioned by the respective device so that they reflect the third and fourth sides of the goods, which correspond to the lateral sides; the first mirror and the second mirror are then sighted by the first camera after the goods have been picked up.
  • the mirrors are attached to the side of the forklift and can be extended / unfolded using the respective device.
  • the first camera can then also capture the image data of the lateral sides of the goods by means of the extended/unfolded mirrors, which improves the identification of the goods.
  • the data processing system in the forklift truck is designed to carry out preprocessing of the image data captured by one or more camera systems of the forklift truck, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.
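The preprocessing chain named in the bullet above could, in a very reduced form, look like the following numpy sketch; the depth limits, the gray-level conversion and the median-based segmentation rule are illustrative assumptions, not the patent's method.

```python
import numpy as np

def preprocess(frame_rgb: np.ndarray, depth_m: np.ndarray) -> dict:
    """Sketch: filter sensor errors, prepare the data format, segment roughly."""
    # 1. Filter sensor-related errors: mask implausible depth readings.
    depth = np.where((depth_m > 0.1) & (depth_m < 10.0), depth_m, np.nan)

    # 2. Prepare the data format: RGB -> gray levels in [0, 1].
    gray = frame_rgb.astype(np.float32).mean(axis=2) / 255.0

    # 3. Crude segmentation: pixels closer than the median depth are treated
    #    as belonging to the goods (illustrative rule only).
    mask = depth < np.nanmedian(depth)

    return {"gray": gray, "depth": depth, "mask": mask}

# Usage with synthetic data standing in for real camera output.
rgb = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
depth = np.random.uniform(0.5, 5.0, (120, 160))
print(preprocess(rgb, depth)["mask"].mean())   # fraction of "goods" pixels
```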
  • the data processing system is also designed to identify the goods, in particular the empty-crate types and the number of empty crates, from the preprocessed image data by means of one or more of the following classifications: (a) a geometric classification based on geometric data determined from the image data, or their extracted features; (b) a texture-based classification based on color data determined from the image data, and their extracted features.
  • the bottle types of the goods W can also be determined.
  • the data processing system is further designed to carry out a statistical estimation and an estimation by machine learning on the basis of the captured image data in order to determine a prediction of the occupancy of the individual empty crates of the picked-up goods.
  • specific image processing algorithms and machine learning methods, such as "K-Nearest Neighbors" or neural networks, can be used for this purpose.
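As one concrete instance of the machine-learning methods mentioned above, a k-nearest-neighbours classifier on simple color histograms, sketched with scikit-learn; the two crate classes and all training images are synthetic placeholders.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def color_histogram(image_rgb: np.ndarray, bins: int = 8) -> np.ndarray:
    """Concatenated per-channel histogram as a simple color/texture feature."""
    feats = [np.histogram(image_rgb[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    hist = np.concatenate(feats).astype(float)
    return hist / hist.sum()

# Synthetic training set: two hypothetical crate types with different hues.
rng = np.random.default_rng(0)
reddish = [rng.integers(0, 256, (32, 32, 3)) * [1.0, 0.3, 0.3] for _ in range(20)]
bluish  = [rng.integers(0, 256, (32, 32, 3)) * [0.3, 0.3, 1.0] for _ in range(20)]
X = [color_histogram(img.astype(np.uint8)) for img in reddish + bluish]
y = ["crate_type_A"] * 20 + ["crate_type_B"] * 20

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
query = color_histogram(
    (rng.integers(0, 256, (32, 32, 3)) * [1.0, 0.3, 0.3]).astype(np.uint8))
print(clf.predict([query]))   # expected: ['crate_type_A']
```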
  • a system comprises a forklift truck with a positioning system which is designed to detect a geographical position of the forklift truck in a warehouse in real time; a first camera system with a first camera which is designed to capture a side of the goods to be picked up facing the forklift truck; a first sensor system with a first sensor which is designed to detect a distance of the forklift truck from the goods to be picked up; a controller which is designed to output a first activation signal to the first camera when the distance detected by the first sensor falls below a minimum distance; a transmission system which is designed to transmit the image data of the goods captured by the first camera to a data processing system; and a second camera system with a second camera which is arranged in a warehouse and which is designed to capture at least one side of the goods facing away from the forklift truck during and after the goods are picked up by the forklift truck, the data processing system being designed to analyze the image data captured by the first camera and the second camera in order to identify the goods.
  • the second camera of the second camera system is attached in an area of a goods unloading point within the warehouse, preferably on the hall ceiling; or attached to a gate of the warehouse through which the forklift truck with the picked up goods passes.
  • the data processing system is provided by the forklift itself.
  • the positioning system of the forklift truck is also designed to provide the image data captured by the first camera with the geographical position in order to subsequently enable the image data captured by the first camera to be assigned to the image data captured by the second camera, the image data captured by the first camera and the second camera each additionally carrying a time stamp.
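The association of forklift captures with external-camera captures via geographical position and time stamp could be sketched as follows; the 2 s and 3 m tolerances and all identifiers are illustrative assumptions.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Capture:
    camera_id: str
    timestamp_s: float
    x_m: float          # position in warehouse coordinates
    y_m: float

def associate(forklift_shots, external_shots, max_dt_s=2.0, max_dist_m=3.0):
    """Pair each forklift capture with external captures taken at roughly the
    same time and place (tolerances are illustrative assumptions)."""
    pairs = []
    for f in forklift_shots:
        for e in external_shots:
            if (abs(f.timestamp_s - e.timestamp_s) <= max_dt_s
                    and hypot(f.x_m - e.x_m, f.y_m - e.y_m) <= max_dist_m):
                pairs.append((f.camera_id, e.camera_id))
    return pairs

forklift = [Capture("forklift_cam_1", 100.0, 12.0, 4.0)]
external = [Capture("gate_cam", 100.8, 13.5, 4.2),
            Capture("ceiling_cam", 250.0, 40.0, 9.0)]
print(associate(forklift, external))   # [('forklift_cam_1', 'gate_cam')]
```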
  • the data processing system is designed to identify the forklift truck by analyzing the image data captured by the second camera in order to enable the image data captured by the first camera to be associated with the image data captured by the second camera.
  • the data processing system is designed to carry out preprocessing on the basis of the associated image data of the first camera and the second camera, the preprocessing comprising filtering of sensor-related errors, preparation of the data format of the image data, sensor data fusion, feature extraction, and machine segmentation.
  • the data processing system is further designed to carry out an identification of the goods, in particular of empty crate types and a number of empty crates, from the preprocessed image data by one or more of the following classifications: (a) a geometric classification based on geometric data determined from the image data, or their extracted features, and (b) texture-based classification based on color data determined from the image data and their extracted features.
  • the second camera system comprises a plurality of mirrors which are arranged in the warehouse and which are designed to reflect the sides of the goods facing away from the forklift truck, the first camera also being designed to aim at the mirrors during or after the picking up of the goods.
  • the second camera system is formed by a drone with a camera.
  • the forklift 10 shown can preferably include a first camera 12 which captures a first side S1 of goods W to be picked up, such as a pallet of empty crates.
  • the field of view of the first camera 12 is indicated in Figure 1A by a dashed line.
  • the first camera 12 is attached to a lifting mast 11 of the forklift 10.
  • the first camera 12 can also be attached to a load carrier 17 of the forklift 10.
  • the forklift 10 can include a first sensor 16 which determines a distance between the forklift 10 and the goods W.
  • the first sensor 16 can determine the distance to the goods W by measuring the transit time of electromagnetic waves or sound waves, which is indicated by the dashed line in Figure 1A.
  • the first sensor 16 comprises an ultrasonic sensor, a radar sensor or a LIDAR (“Light Detection and Ranging”) sensor.
  • the first sensor 16 is attached to the roof of the forklift 10.
  • the first sensor 16 can also be attached to a lifting mast 11 of the forklift 10, or can be attached in one of the forks 13 or on the load carrier 17 of the forklift.
  • a controller 20 comprised by the forklift 10 of the first embodiment, as shown in Figure 1B, can output a first activation signal to the first camera 12 if the distance determined by the first sensor 16 falls below a minimum distance.
  • the first camera 12 is activated by the first activation signal and begins to record image data of the first side S1 of the goods W. For example, the first camera records a front side of the goods W to be picked up facing the forklift truck 10.
  • the forklift truck 10 of the first embodiment can include a second camera 14 which, during or after the pick-up, captures further sides of the goods W that differ from the first side S1.
  • the second camera 14 is received in one of the forks 13 of the forklift 10.
  • the second camera 14 can also be attached to the lifting mast 11 of the forklift 10.
  • the forklift 10 of the first embodiment can preferably comprise a second sensor 18 which determines the pick-up of the goods W by the forklift 10.
  • the second sensor 18 shown in Figure 1B is an ultrasonic sensor.
  • the second sensor 18 can include further sensors, such as a load sensor, which detects the picking up of the goods W by means of a load indication.
  • the second sensor 18 is received in one of the forks 13 of the fork lift truck 10.
  • the second sensor 18 can also be attached to the lifting mast 11 or to the load carrier 17 of the forklift 10.
  • the controller 20 of the forklift truck 10 can preferably output a second activation signal to the second camera 14 of the forklift truck 10 in order to activate the second camera 14.
  • the second camera 14 detects a lower side of the goods W while it is being picked up by the forks 13 of the forklift 10.
  • after the goods have been picked up, the second camera records a rear side of the goods W facing away from the forklift truck 10. This is described in more detail below with reference to Figures 2A to 2C.
  • the system 100 of the first embodiment shown comprises one or more external cameras 114 which are arranged in a warehouse and which detect at least one side of the goods W facing away from the forklift truck 10 during and after the goods W are picked up by the forklift truck 10.
  • An external camera 114 can preferably be attached to the hall ceiling of the warehouse in order to capture an upper side of the goods W.
  • an external camera 114 can be attached to a lamp post in the warehouse in order to capture a lateral side of the goods W.
  • an external camera 114 can be attached to a gate of the warehouse through which the forklift truck passes the goods W that have been picked up, in order to capture a lateral side of the goods W.
  • An external camera 114 can preferably be accommodated in a flight-capable drone that films the forklift 10 during and after the goods W are picked up. For example, if the controller outputs the first activation signal, a signal with the current geographical coordinates of the forklift 10 can also be output to the drone. The drone then captures the remaining sides of the goods W.
  • an external camera 114 can receive the first or the second activation signal that is output by the controller 20 of the forklift 10 and begin to transmit the image data captured by the external camera 114 to a data processing system 24.
  • the number of cameras and sensors that the forklift 10 and system 100 may include can vary.
  • the forklift 10 can only include one camera and the system 100 can only include one external camera.
  • the forklift 10 can include only one sensor and the system 100 can include one or more sensors.
  • the cameras 12, 14 of the forklift 10 and the external cameras 114 of the system 100 are preferably equipped with lighting, so that the cameras can achieve correct image data acquisition even in dark surroundings.
  • the cameras are preferably 3D cameras that operate on the light transit time (time-of-flight) principle in order to additionally enable the depth data of the goods W to be picked up or already picked up to be determined.
  • the cameras can also include network cameras that provide digital signals at the output in the form of a video stream that can be transmitted via Internet Protocol (IP).
  • the forklift 10 can include a transmission unit 22, as shown in Figure 1B, which can transmit the image data captured by the first camera 12 and the second camera 14 of the forklift 10 to a data processing system 24 for identifying the goods W.
  • the forklift 10 may include a positioning system (not shown) that detects a geographic position of the forklift in the warehouse in real time.
  • the positioning system can include a GPS sensor that enables the geographic coordinates of the forklift truck 10 to be determined in real time.
  • the coordinates determined in this way can be transmitted to the data processing system 24 by the transmission unit 22 together with the captured image data of the forklift 10 and a time stamp.
  • the data processing system 24 can then, based on the transmitted coordinates and the time stamp, combine the image data captured by the forklift 10 with the image data captured by an external camera 114 of the system 100.
  • the forklift truck 10 can be identified by analyzing the image data of an external camera 114 of the system 100, for example via a QR code, in order to enable the image data to be assigned to the image data of the forklift truck 10.
  • FIG. 2A shows a perspective detail of the forklift 10 according to the first embodiment, which shows the fork 13 with the second camera 14 before the goods W are picked up.
  • the second camera 14 can preferably be formed with impact-resistant and / or water-resistant materials, which protect the second camera 14 from impacts when picking up goods and / or from moisture.
  • the second camera 14 can be arranged in a shock-proof and / or water-proof housing that is received in the fork 13 of the forklift truck.
  • Figure 2B shows a perspective detail of the forklift 10 according to the first embodiment, which shows the fork 13 with the second camera 14 while the goods W are picked up.
  • the second camera can preferably comprise a 3D camera which uses a time-of-flight (TOF) method to determine the dimensions of the goods W while they are being picked up, as shown in Figure 2B. For example, it can thus be ensured that no empty crates are missing from a pallet.
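Using the depth data of such a time-of-flight camera to check a pallet layer for missing crates can, in the simplest case, amount to checking whether every expected crate position returns a plausible height; the grid layout, the expected height and the tolerance below are illustrative assumptions.

```python
import numpy as np

def count_missing_crates(height_map_m: np.ndarray, rows: int, cols: int,
                         expected_height_m: float, tol_m: float = 0.05) -> int:
    """Split the height map into a rows x cols grid of crate positions and
    count the cells whose median height deviates too much from the expected
    crate height."""
    h, w = height_map_m.shape
    missing = 0
    for r in range(rows):
        for c in range(cols):
            cell = height_map_m[r * h // rows:(r + 1) * h // rows,
                                c * w // cols:(c + 1) * w // cols]
            if abs(np.median(cell) - expected_height_m) > tol_m:
                missing += 1
    return missing

# Synthetic 2x4 pallet layer: one position left empty (height ~0 instead of 0.3 m).
heights = np.full((80, 160), 0.30)
heights[0:40, 0:40] = 0.02
print(count_missing_crates(heights, rows=2, cols=4, expected_height_m=0.30))  # 1
```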
  • Figure 2C shows a perspective detail of the forklift 10 according to the first embodiment, which shows the fork 13 with the second camera 14 after the goods W have been picked up.
  • the second camera can preferably be connected to an extendable device 30 (not shown).
  • the extendable device comprises a telescopic device. This makes it possible to position the second camera 14 in such a way that the rear side of the goods W facing away from the forklift truck can be captured.
  • the second camera can be attached to the lifting mast 11 of the forklift (not shown).
  • the second camera 14 can be connected to the extendable device 30 so that it is positioned in such a way that it captures an upper side of the goods W or a side of the goods W.
  • the forklift truck 10 can include one or more mirrors.
  • Figure 3A illustrates a perspective view of the forklift 10 showing the first mirror 26 in a first position.
  • Figure 3B illustrates a perspective view of the forklift 10 showing the first mirror 26 in a second position.
  • Figure 3C illustrates a top perspective view of the forklift truck 10 showing the first mirror 26 and the second mirror 28 in the second position.
  • the first mirror 26, which is connected to an extendable/fold-out device, is attached to the side of the forklift 10 and is in the first position, i.e. in a retracted/folded-in position.
  • the second mirror 28 is also attached on the opposite side of the forklift 10 (not shown).
  • the first mirror 26 and the second mirror 28 can be brought into the second position, that is to say extended / folded out, in order to reflect the lateral sides S3, S4 of the goods W.
  • the first mirror 26 and the second mirror 28 can preferably be extended when it is determined by the second sensor 18 that the goods W are being picked up.
  • the first camera 12 is preferably rotatable by up to 90° in two directions in order to vary its field of view. This enables the first mirror 26 and the second mirror 28 to be aimed at, as shown in Figure 3C, in order to acquire the image data of the lateral sides of the goods W.
  • the system 100 can also comprise one or more mirrors.
  • one or more mirrors can be attached in the warehouse in the area of the goods unloading point (not shown) in order to reflect at least one side of the goods W facing away from the forklift truck 10.
  • the first camera 12 of the forklift 10 can then aim at the mirrors in the warehouse to capture the image data.
  • the image data processing is explained in detail below. After the image data of the first camera 12 and the second camera 14 of the forklift 10, as well as of the external cameras 114 of the system 100, have been captured and transmitted to the data processing system 24, the data processing takes place.
  • the forklift truck 10 preferably contains the data processing system 24.
  • the data processing system 24 can also be contained in an external processing unit, for example a central computer.
  • the image data captured by the forklift 10 can be combined with the image data captured by the external cameras 114.
  • the image data is preprocessed.
  • sensor-related errors such as noise, overexposure, etc. are preferably filtered out of the image data automatically.
  • the data format is then prepared, for example by converting the image data to gray levels or by transforming the depth data of the goods W to determine the dimensions of the goods W.
  • the data determined by the sensors of the forklift truck 10 and the system 100 are preferably also merged with the image data.
  • a machine segmentation of the captured image data is preferably carried out, as shown in Figure 4.
  • Many methods of automatic segmentation are possible. Basically, they are often divided into pixel-, edge- and region-oriented methods.
  • In addition, there are model-based methods, in which a certain shape of the objects is assumed, and texture-based methods, in which an internal homogeneous structure of the objects is also taken into account. Different methods can also be combined to achieve better results.
  • Features of the captured image data can preferably be extracted. Examples include, but are not limited to, SIFT ("Scale-Invariant Feature Transform"), HOG (“Histogram of Oriented Gradients”), and SURF ("Speeded Up Robust Features ").
  • SIFT is an algorithm for the detection and description of local features in images.
  • the detector and the feature descriptors are, within certain limits, invariant to coordinate transformations such as translation, rotation and scaling. They are also robust against variations in lighting, image noise and minor geometric deformations of higher order, such as those caused by the projective imaging of an object from different viewpoints in space.
  • HOG is an algorithm for feature recognition in images that represents the appearance and shape of objects within an image, even without detailed knowledge of the positions of edges or corners, by the distribution of the local intensity gradients or the arrangement of the edges. For this purpose, the image is broken down into sub-areas; for each sub-area, the orientations of all edges are determined and their frequencies are stored as a histogram.
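A compact numpy sketch of the HOG idea just described (gradient orientations histogrammed per sub-area); the cell size and bin count are assumptions, and a production system would rather rely on an existing HOG implementation.

```python
import numpy as np

def hog_like_features(gray: np.ndarray, cell: int = 8, bins: int = 9) -> np.ndarray:
    """Histogram of gradient orientations per cell, weighted by magnitude."""
    gy, gx = np.gradient(gray.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.degrees(np.arctan2(gy, gx)) % 180.0   # unsigned gradients

    h, w = gray.shape
    feats = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            ori = orientation[r:r + cell, c:c + cell].ravel()
            mag = magnitude[r:r + cell, c:c + cell].ravel()
            hist, _ = np.histogram(ori, bins=bins, range=(0, 180), weights=mag)
            feats.append(hist)
    return np.concatenate(feats)

gray = np.random.rand(32, 32)
print(hog_like_features(gray).shape)   # 16 cells * 9 bins = (144,)
```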
  • the SURF algorithm is based on the same principles and steps as SIFT. However, the details in each step are different.
  • the algorithm consists of three main parts: identification of points of interest, description of the local neighborhood, and matching.
  • the goods W, in particular the empties types of the pallet(s) picked up, are identified.
  • a geometric classification by means of (possibly segmented) depth data or their extracted features, such as height, depth, width and number of crates, which are obtained from the preprocessed image data, is made possible by comparison with data from a previously created database. This allows the types of empties, in particular the crate types of the goods W, to be identified.
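Such a comparison with a previously created database can be as simple as a nearest-neighbour lookup on the measured dimensions; the crate types and the dimensions in the table below are made-up placeholders.

```python
# Hypothetical reference database of crate types: (height, width, depth) in metres.
CRATE_DB = {
    "crate_0_5l_x20": (0.31, 0.40, 0.30),
    "crate_0_7l_x12": (0.36, 0.40, 0.30),
    "crate_1_0l_x6":  (0.34, 0.30, 0.20),
}

def classify_geometry(measured):
    """Return the crate type whose stored (height, width, depth) is closest
    to the measured dimensions in the Euclidean sense."""
    def dist(ref):
        return sum((m - r) ** 2 for m, r in zip(measured, ref)) ** 0.5
    return min(CRATE_DB, key=lambda name: dist(CRATE_DB[name]))

print(classify_geometry((0.355, 0.41, 0.29)))   # -> crate_0_7l_x12
```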
  • a texture-based classification using (possibly segmented) color data or their extracted features includes the application and evaluation of statistical methods for computer-aided recognition of patterns in images.
  • the statistical evaluation of the preprocessed image data can be carried out, for example, by means of a median color tone determination.
  • the machine learning methods include, for example, "shallow learning" and "deep learning".
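The median color tone determination mentioned above can be read as taking the median hue over the (segmented) crate pixels; a minimal sketch using the standard RGB-to-HSV conversion follows.

```python
import colorsys
import numpy as np

def median_hue(pixels_rgb: np.ndarray) -> float:
    """Median hue (0..1) of an N x 3 array of RGB pixels with values in 0..255."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in pixels_rgb]
    return float(np.median(hues))

# Mostly-green pixels with a few red outliers: the median stays near green.
pixels = np.array([[30, 200, 40]] * 50 + [[220, 20, 20]] * 5)
print(round(median_hue(pixels), 2))   # ~0.34
```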
  • an algorithm can be trained with a sufficient amount of real data or their features. This trained algorithm can then be validated and verified using real data and subsequently used to identify empties.
  • the identification of the empty packaging types preferably comprises a combination of the geometric and texture-based classification with merged sensor data.
  • a prediction of the most likely occupancy of the individual empties crates is preferably made by analyzing all visible sides of the goods W, for example two per pallet. This is done using a statistical estimate and/or an estimate based on machine learning methods.
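A very reduced stand-in for such an estimate: count the crate types seen on the visible faces and extrapolate their proportions to all positions of the pallet. The numbers are assumptions, and a real system would combine this with the trained models described above.

```python
from collections import Counter

def estimate_occupancy(visible_crates, total_positions):
    """Scale the type proportions observed on the visible sides up to the
    full pallet; hidden positions are assumed to follow the same mix."""
    counts = Counter(visible_crates)
    seen = sum(counts.values())
    return {t: round(c / seen * total_positions) for t, c in counts.items()}

# Two visible faces show 10 crates: 8 of type A, 2 of type B; the pallet holds 40.
print(estimate_occupancy(["A"] * 8 + ["B"] * 2, total_positions=40))
# -> {'A': 32, 'B': 8}
```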
  • the corresponding results of the image data processing are then sent to an ERP system using a standardized communication path (for example a network protocol such as the "Simple Object Access Protocol" (SOAP)) and standardized data formats (for example CSV, XML, JSON-RPC).
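The handover to the ERP system could, for the JSON case, look like the payload below; the field names and the endpoint URL are hypothetical, and SOAP, CSV or XML transports are equally possible according to the text above.

```python
import json
from urllib import request

result = {
    "forklift_id": "FL-07",               # hypothetical identifiers and fields
    "timestamp": "2020-02-12T08:15:00Z",
    "pallets": 1,
    "crate_type": "crate_0_7l_x12",
    "crate_count": 40,
    "estimated_bottle_occupancy": {"brand_x_0_7l": 0.95},
}

payload = json.dumps(result).encode("utf-8")
req = request.Request(
    "http://erp.example.invalid/api/empties",   # placeholder URL, not a real endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req) would transmit the record; it is left out here because
# the endpoint above is only a placeholder.
print(payload.decode("utf-8"))
```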

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Structural Engineering (AREA)
  • Civil Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Forklifts And Lifting Vehicles (AREA)
EP20156789.8A 2019-02-15 2020-02-12 Forklift truck and system with a forklift truck for identifying goods Active EP3696135B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102019202076.3A DE102019202076A1 (de) 2019-02-15 2019-02-15 Gabelstapler und system mit gabelstapler zur identifikation einer ware

Publications (2)

Publication Number Publication Date
EP3696135A1 true EP3696135A1 (fr) 2020-08-19
EP3696135B1 EP3696135B1 (fr) 2021-12-29

Family

ID=69571899

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20156789.8A Active EP3696135B1 (fr) 2019-02-15 2020-02-12 Chariot élévateur et système pourvu de chariot élévateur permettant d'identifier une marchandise

Country Status (2)

Country Link
EP (1) EP3696135B1 (fr)
DE (1) DE102019202076A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020211892A1 (de) 2020-09-23 2022-03-24 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Objekterkennung, Computerprogramm, Speichermedium, Objekterkennungseinrichtung und Überwachungsanordnung
WO2023057749A1 (fr) * 2021-10-05 2023-04-13 Wrs Solutions Ltd Système de surveillance et/ou d'inventaire de marchandises et son procédé d'utilisation

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021108146A1 (de) 2021-03-31 2022-10-06 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zum Entladen eines Fahrzeugs

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0254192A2 (fr) * 1986-07-19 1988-01-27 T.E.C. Computer Gmbh Procédé et dispositif pour piloter des véhicules et/ou accroître la sécurité active et passive des conducteurs
US20110234389A1 (en) * 2008-11-11 2011-09-29 Deutsche Post Ag Guidance and collision warning device for forklift trucks
US20140277691A1 (en) * 2013-03-15 2014-09-18 Cybernet Systems Corporation Automated warehousing using robotic forklifts
KR20150000317U (ko) * 2013-07-12 2015-01-21 현대중공업 주식회사 지게차

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DATABASE WPI Week 201511, Derwent World Patents Index; AN 2015-095248, XP002799589 *

Also Published As

Publication number Publication date
EP3696135B1 (fr) 2021-12-29
DE102019202076A1 (de) 2020-08-20

Similar Documents

Publication Publication Date Title
DE102018006765B4 (de) Verfahren und system(e) für das management von frachtfahrzeugen
EP3696135B1 (fr) Chariot élévateur et système pourvu de chariot élévateur permettant d'identifier une marchandise
EP2439487B1 (fr) Dispositif de mesure de volume pour objets mobiles
EP2384489B1 (fr) Procédé et système de planification d'itinéraire d'un véhicule de transport
DE112017006429T5 (de) Verfahren, Systeme und Vorrichtungen zum Segmentieren von Objekten
EP3177889B1 (fr) Dispositif et procédé de détermination du volume d'un objet déplacé par un chariot de manutention au sol
EP2888060B1 (fr) Procédé et dispositif de transport des objets rectangulaires
DE102016107767A1 (de) Bildanalysesystem und -verfahren zur automatisierten Gutidentifizierung
DE102013002554A1 (de) Verfahren zur Erkennung von Objekten in einem Lager und/oder zur räumlichen Orientierung in einem Lager
DE10107208A1 (de) Vorrichtung und Verfahren zur Ortung von Behältern und Inhalten von Behältern mittels Hochfrequenz-Etiketten
EP3651091A1 (fr) Système de gestion de stockage à détermination de position des produits de stockage et des zones de stockage associées
EP3071340B1 (fr) Procédé et dispositif de tri d'objets
DE102018006764B4 (de) Verfahren und system(e) für das management von frachtfahrzeugen
EP3211611A1 (fr) Procede de detection de vide assistee par ordinateur d'un recipient de transport et dispositif de detection de vide assistee par ordinateur d'un recipient de transport
DE102020124613A1 (de) Verbesserte auswahl eines objekts von interesse für neuronale netzwerksysteme an verkaufspunken
DE102020117545A1 (de) System und methode zur identifizierung von objekten in einem zusammengesetzten objekt
DE112019006132T5 (de) Verfahren, system und vorrichtung zur hilfsetikett-erkennung und -zuordnung
EP3977225B1 (fr) Procédé de fabrication d'une carte d'environnement pour l'utilisation lors de la navigation autonome d'un robot mobile
DE112022000929T5 (de) Automatisierte einheitslast-abfertigungsverfahren und -systeme
DE102021104352A1 (de) Transporter-segmentierung für datenerfassungssystem
DE102020213566A1 (de) Verfahren und Computersystem zur Objekterkennung oder Objektregistrierung basierend auf einer Bildklassifizierung
DE102018117541B4 (de) Verfahren zur Überwachung des Beladungszustandes von Nutzfahrzeugen oder Wechselaufbauten für Nutzfahrzeuge
DE112017008146T5 (de) Verfahren und vorrichtung zum detektieren und zum erkennen von graphischen zeichendarstellungen in bilddaten unter verwendung symmetrisch angeordneter leerer flächen
DE112019005299T5 (de) Verfahren zur Erkennung einer Falschplatzierung von Paketen in falschen Anhängern unter Verwendung einer Anhängerüberwachungseinheit
DE102007055704A1 (de) Vorrichtung und Verfahren zum Zählen von Tieren beim Transport

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200212

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210721

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502020000480

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1458554

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220115

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220329

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220329

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220330

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220429

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220429

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502020000480

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220212

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220212

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220228

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230222

Year of fee payment: 4

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20230228

Year of fee payment: 4

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230228

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230228

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20240222

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211229

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240226

Year of fee payment: 5

Ref country code: GB

Payment date: 20240221

Year of fee payment: 5