WO2022132239A1 - Method, system, and apparatus for warehouse management by detecting damaged cargo

Method, system, and apparatus for warehouse management by detecting damaged cargo

Info

Publication number
WO2022132239A1
WO2022132239A1 (application PCT/US2021/045017)
Authority
WO
WIPO (PCT)
Prior art keywords
cargo
warehouse
damaged
images
image
Prior art date
Application number
PCT/US2021/045017
Other languages
English (en)
Inventor
Taegyu Lim
Byungsoo Kim
Original Assignee
Motion2Ai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motion2Ai filed Critical Motion2Ai
Publication of WO2022132239A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means

Definitions

  • Embodiments described herein are generally related to methods, systems, and apparatus for managing warehouses by detecting damaged cargos.
  • embodiments of the present disclosure relate to inventive and unconventional methods, systems, and apparatus which may acquire image data from one or more sensors, detect cargos from the image data, detect damages of the cargos, and provide information related to the damages, to increase efficiency and safety of warehouse environments.
  • automated warehouse systems can automate the storage and retrieval of boxes, goods, and pallets in a warehouse.
  • Automated warehouse systems can include conveyors designed for transporting boxes, goods, and pallets to specific warehouse locations, and racking systems for storing and retrieving the boxes, goods, and pallets.
  • damage may occur due to collisions with other objects.
  • the damaged cargo can cause customer dissatisfaction when it is delivered to the customer.
  • the damaged cargo should be promptly found and replaced in the logistics processing stage before it is delivered to the customer.
  • the damaged cargo increases the overall logistics processing time and the loss of the logistics warehouse. Further, additional workforce and time may be required to inspect the damaged logistics.
  • Artificial intelligence (AI) employing machine learning algorithms and neural network models can be used in a variety of image and video processing tasks. For instance, in video processing and analytics applications using AI, each frame of a video is fed to an AI system. The AI system typically repeats a similar set of computations on each frame of the video, and outputs associated analytics. Such an AI system can be used to improve the accuracy of analysis for images captured by low-cost image sensors mounted on working vehicles located in a warehouse.
  • the present disclosure relates to methods, systems, and apparatus, that use the above-mentioned AI system and machine learning technology to analyze images acquired from cameras in real time to detect all cargoes present in the images, and to classify damaged cargoes and damaged parts, for managing a warehouse.
  • a method for managing a warehouse comprises acquiring at least one image of inside of the warehouse, identifying at least one cargo in the acquired image, determining whether the identified cargo has been damaged, and when the identified cargo has been damaged, displaying at least one first graphic object indicating a damaged status of the damaged cargo.
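The four claimed steps (acquiring an image, identifying cargo, determining damage, displaying a graphic object) can be sketched as a simple pipeline. The following Python sketch is illustrative only and is not part of the disclosure: the function names, the dictionary-based stand-in for detector output, and the text overlay are all assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Cargo:
    cargo_id: int
    damaged: bool

def identify_cargos(image: List[dict]) -> List[Cargo]:
    """S20: stand-in for the trained detector; here the 'image' is
    already a list of annotated regions (an assumption for brevity)."""
    return [Cargo(r["id"], r["damaged"]) for r in image]

def is_damaged(cargo: Cargo) -> bool:
    """S30: stand-in for the damage classifier."""
    return cargo.damaged

def damage_overlay(cargo: Cargo) -> str:
    """S40: the 'first graphic object' indicating the damaged status,
    reduced to a text label for illustration."""
    return f"cargo {cargo.cargo_id}: DAMAGED"

def manage_warehouse(images: List[List[dict]]) -> List[str]:
    """Run the S10-S40 pipeline over all acquired images."""
    return [damage_overlay(c)
            for image in images          # S10: acquired images
            for c in identify_cargos(image)
            if is_damaged(c)]
```

In a real embodiment, `identify_cargos` and `is_damaged` would wrap the trained image data analysis model described below.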
  • FIG. 1 illustrates a conventional vehicle used for moving cargos in a warehouse.
  • FIG. 2 illustrates an environment for managing a warehouse, in accordance with some embodiments.
  • FIG. 3 is a flowchart of a method for managing a warehouse, in accordance with some embodiments.
  • FIG. 4 shows an architecture for managing a warehouse, in accordance with some embodiments.
  • FIG. 5 shows a detailed architecture for managing a warehouse, in accordance with some embodiments.
  • FIG. 6 shows a detailed architecture for managing a warehouse, in accordance with some other embodiments.
  • FIG. 7 illustrates an environment for managing a warehouse, in accordance with some embodiments.
  • FIG. 8 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • FIG. 9 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • FIG. 10 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • FIG. 11 A illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • FIG. 11B illustrates a flowchart of a detailed process for managing a warehouse, in accordance with some other embodiments.
  • FIG. 12 shows a flowchart for performing a machine learning process, in accordance with some embodiments.
  • FIGS. 13A and 13B show screen images for managing a warehouse, in accordance with some embodiments.
  • FIGS. 14A and 14B show different screen images for managing a warehouse, in accordance with some other embodiments.
  • FIGS. 15A and 15B show different screen images for managing a warehouse, in accordance with some other embodiments.
  • FIG. 1 illustrates a conventional vehicle used for moving cargo in a warehouse.
  • the term “cargo” may be used to refer to an object that can contain or encompass another object. That is, the term “cargo” may refer to a box, a container, a unit, etc.
  • a forklift as shown in FIG. 1 can be used as a vehicle for moving cargo in a warehouse.
  • the vehicle for moving cargo in a warehouse is not limited to a forklift. It is understood that the term “vehicle” used in the present application refers to any vehicle that can be used in the warehouse.
  • the forklift is a powered industrial truck used to lift and move materials over short distances and is known for its ability to handle a wide variety of warehouse jobs.
  • some sensors may be mounted on the forklift, to control the forklift in an efficient and safe manner.
  • 3D LiDAR sensors can be mounted for localization and mapping, collision avoidance, and load engagement
  • draw-wire encoders can be mounted for measuring mast height for positioning
  • Hall-effect sensors can be mounted for measuring mast angle
  • pressure sensors can be mounted for load weight measurement
  • an SIL3/PLe-rated functional safety encoder can be mounted for ground speed and direction control.
  • FIG. 2 illustrates an environment for managing a warehouse, in accordance with some embodiments.
  • the present disclosure relates to a system for automatically inspecting cargoes using image sensors (e.g., camera sensors) in the warehouse for logistics.
  • whether the cargo is damaged or not can be automatically detected and determined in real time from the data acquired by the image sensors, using machine learning technology.
  • the image sensors 100, 200 and 300 may be mounted on a vehicle 10 by mobile devices or by vehicle-attached devices, as shown in FIG. 2.
  • the image sensors may be mounted on a fixed structure in the warehouse, as shown in FIGS. 10 and 11A. Images can be acquired by such image sensors in real time, and thus, a cargo can be detected automatically in real time, and a cargo damage can also be detected automatically in real time, from the images acquired by the image sensors.
  • Cargo detection and damage detection can be performed through an image-based machine learning model or algorithm, and AI technology.
  • AI modules for operations related to cargo detection and damage detection can be processed by edge computing devices or by server-equivalent computing devices, for example, by processors mounted on mobile devices other than vehicles, processors mounted on vehicles, or processors configured in server computers networked to the image sensors. That is, the entity performing the cargo detection and damage detection may vary according to embodiments. Further, the results of cargo detection and damage detection can be provided to users of the warehouse management system 20.
  • vehicle 10 uses image sensors 100, 200, 300 attached to the vehicle, to capture images of cargo, perform cargo detection and damage detection operations by using processors in the device attached to the vehicle, and send the processing results to warehouse management system 20.
  • FIG. 3 is a flowchart of a method for managing a warehouse, in accordance with some embodiments.
  • the flowchart shown in FIG. 3 may be implemented by the environments shown in FIG. 2.
  • the flowchart may begin at step S10 by acquiring at least one image of inside of the warehouse.
  • the images of cargos located inside the warehouse can be captured by camera sensors mounted on a vehicle located in the warehouse.
  • the warehouse management system 20 acquires image data including the images captured by camera sensors mounted on the vehicle 10.
  • the images include still images and/or video images of the loading area of the vehicle 10.
  • the images may be real-time images.
  • the flowchart may then proceed to step S20 by identifying at least one cargo in the acquired image.
  • the identifying operation identifies all of the cargos in the acquired image. In some other embodiments, the identifying operation selectively identifies some of the cargos in the acquired image. For example, in the image, only box-type cargos may be identified. In some embodiments, the identifying operation uses a plurality of images to identify cargos.
  • the identification of cargos is performed by using an image data analysis model trained by machine learning.
  • the image data analysis is performed by an artificial intelligence (AI) system, such as a machine learning module or a neural network module, which is trained to classify frames or slices of frames as being critical or non-critical.
  • the AI system is mounted on one or more of the vehicle 10, the warehouse management system 20, and the mobile device.
  • the image data analysis model is trained, programmed or otherwise configured to differentiate between frames or portions/slices of frames that need to be processed by a computation engine.
  • raw video image data or raw still image data is provided to the AI system to train the image data analysis model.
  • the AI system loads the image data, applies the image data analysis model such as a deep learning model, and generates a trained image data analysis model.
  • the image data analysis model is trained to detect objects with varying dimensions, and derive position, angle, velocity, orientation, skew, alignment, and distances between objects, using key-point model predictions and Kalman filtering.
  • Kalman filters allow for the incorporation of temporal information into time-independent deep learning models for robust object tracking with assignment of identifiers.
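As a rough illustration of how Kalman filtering can add temporal information to per-frame detections, the following minimal one-dimensional constant-velocity filter smooths a sequence of position measurements. This is a generic textbook filter, not the disclosed model; the state layout, noise values, and pure-Python matrix arithmetic are assumptions for illustration.

```python
class KalmanTracker1D:
    """1-D constant-velocity Kalman filter: state is (position, velocity)."""

    def __init__(self, x0, q=1e-3, r=0.25):
        self.x, self.v = x0, 0.0            # position and velocity estimate
        self.p = [[1.0, 0.0], [0.0, 1.0]]   # 2x2 state covariance
        self.q, self.r = q, r               # process / measurement noise

    def step(self, z, dt=1.0):
        # Predict with the constant-velocity motion model x' = x + v*dt
        self.x += self.v * dt
        p = self.p
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + self.q
        # Update with the new per-frame detection z (measurement of position)
        s = p00 + self.r                    # innovation covariance
        k0, k1 = p00 / s, p10 / s           # Kalman gains
        y = z - self.x                      # innovation
        self.x += k0 * y
        self.v += k1 * y
        self.p = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x
```

Fed a steadily moving detection, the filter learns the object's velocity and can keep its identifier assigned through brief detection dropouts.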
  • the AI system mounted on one or more of the vehicle 10, the warehouse management system 20, and the mobile device comprises a computation engine employing various AI and computer vision analytics techniques that are relatively computationally intensive to provide a robust set of analytics for image frames processed by that computation engine.
  • the computation engine comprises a trained machine learning module and/or a neural network module, and further employs one or more computer vision algorithms to generate accurate analytics data.
  • the vehicle 10 may be equipped with a global positioning system (GPS) sensor, an inertial measurement unit (IMU) system, or both that records the location of the vehicle 10 (e.g., one or more of the latitude, longitude, and/or altitude) and its speed and acceleration in every direction.
  • the GPS information may be used to geotag the location of the detected objects in the captured images and then store the data for integration with a mapping application.
  • the system that performs the image data analysis may analyze the data and image information in real-time or near real-time, identify the objects in the captured images, and generate a geographic information system (GIS) map of the objects using the information collected from the GPS sensor and the IMU system.
  • a machine-learning algorithm detects an object, such as a cargo or a box or a container, in the images captured by the vehicle-mounted cameras.
  • the software may calculate the location of each object. Then, using the calculated location of the object, and based on the GPS location of the vehicle when each frame of the video was taken, the exact location (e.g., latitude, longitude, and/or altitude) of each detected object is calculated and marked on a GIS map.
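One way to compute the exact coordinates of a detected object from the vehicle's GPS fix is a small-offset ("flat-earth") approximation: offset the vehicle's latitude/longitude by the estimated range and bearing to the object. The sketch below is an illustrative assumption, not the disclosed method; the function name and the approximation itself are not from the specification.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def geotag(lat_deg, lon_deg, range_m, bearing_deg):
    """Return the (lat, lon) of a point range_m metres from the vehicle
    along bearing_deg (0 = north, 90 = east), using a flat-earth
    approximation valid for short warehouse-scale distances."""
    b = math.radians(bearing_deg)
    d_north = range_m * math.cos(b)
    d_east = range_m * math.sin(b)
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

The resulting coordinates can then be marked on the GIS map together with the picture of the detected object.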
  • the location information and pictures of each detected object may be used in the displaying step S40.
  • An object representation associated with each image can be used to determine whether a type of classification of the image corresponds to a category and includes particular attributes, or types of attributes, for which a model (e.g., logistic regression, neural network, or another machine learning algorithm) can be trained.
  • the different object representations can include, for example, objects of different sizes, dimensions, and shape (e.g., cube, brick, etc.) and the characteristics (e.g., size, dimension, shape) of the different object representations can influence how objects appear when engraved in or otherwise within or on a surface of an object.
  • the flowchart may then proceed to step S30 by determining whether the identified cargo has been damaged.
  • the aforementioned AI-related technologies and machine learning technologies can be used for determining whether the identified cargo has been damaged.
  • the determination operation detects all of the damages of the cargos. In some other embodiments, the determination operation selectively identifies some of the damages of the cargos. For example, only the damages that have a size greater than a threshold size can be detected. Also, the detected damages can be classified according to types of damages. The types of damages may include water damage, permanent damage, and non-permanent damage. In some embodiments, the determination operation uses a plurality of images of the same cargo to detect damages.
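The selective detection and classification just described can be sketched as a threshold filter followed by bucketing by damage type. This is an illustrative sketch; the damage-record format (dictionaries with `area_px` and `type` keys) and the threshold value are assumptions, not from the disclosure.

```python
# Damage types named in the specification.
DAMAGE_TYPES = {"water", "permanent", "non-permanent"}

def filter_damages(damages, min_area_px=100):
    """Keep only damages at least min_area_px in size with a known type
    (the 'threshold size' selective detection described above)."""
    return [d for d in damages
            if d["area_px"] >= min_area_px and d["type"] in DAMAGE_TYPES]

def classify_by_type(damages):
    """Group the retained damages by damage type."""
    buckets = {t: [] for t in DAMAGE_TYPES}
    for d in damages:
        buckets[d["type"]].append(d)
    return buckets
```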
  • this method displays at least one first graphic object indicating a damaged status of the damaged cargo.
  • this method identifies at least one damaged portion of the damaged cargo, and displays at least one second graphic object indicating the identified damaged portion. Examples of the first graphic object and the second graphic object are shown in FIGS. 13A-15B.
  • a vehicle that has a mounted image sensor, such as a camera or a set of cameras, and does not have additional expensive sensors may be used for efficient and safe warehouse management. That is, the invention disclosed in the present application provides a cost-effective management method, system, and apparatus. Further, this technology may also improve the centralization of warehouse management, which improves security. Such warehouse management can be implemented in real-time or near real-time by using a high-speed wireless communication scheme, such as Wi-Fi™ communications.
  • FIG. 4 shows an architecture for managing a warehouse, in accordance with some embodiments.
  • An environment for managing the warehouse may be implemented by a training module 11 and a processing module 12.
  • the training module 11 is configured to perform a machine learning for training an image data analysis model by using training data
  • the processing module 12 is configured to perform an image analysis by using the trained image data analysis model.
  • the training module 11 performs the above training with an off-line status
  • the processing module 12 performs the image analysis with an online status.
  • the training module 11 is included in the warehouse management system 20, which has a high-performance processor having sufficient processing power for machine learning, and the processing module 12 is included in a device mounted on the vehicle 10, along with the image sensors 100, 200 and 300, so that determining the damaged cargo can be performed more quickly. With this configuration, processing resources may be efficiently managed.
  • the high-performance processor of the training module 11 may be a set of high-end parallelizable GPUs capable of performing ten tera (10¹²) calculations per second, and the processor of the processing module 12 may be an edge computer having a small processor such as an ARM processor, or an embedded board having eight, 16, or more calculation chips.
  • the processing module 12 may use an ARM Cortex-A53 processor for real-time processing.
  • the processing module 12 is included in the warehouse management system 20.
  • FIG. 5 shows a detailed architecture for managing a warehouse, in accordance with some embodiments.
  • An environment for managing the warehouse may be implemented by a training module 11 and a processing module 12. Further, the training module 11 may be implemented by a visual feature extractor 110 and a damaged box detector 111, and the processing module 12 may also be implemented by a visual feature extractor 120 and a damaged box detector 121.
  • the training module 11 receives training image data.
  • the training image data includes images and labeling data related to the images.
  • the visual feature extractor 110 extracts visual features of objects shown in images included in the training image data.
  • the damaged box detector 111 detects damages of the objects shown in the images included in the training image data.
  • the training module 11 may perform training of an image data analysis model by using training data. This detection operation may be repeated a predetermined number of times. This detection operation may be repeated until a predetermined amount of training result data is accumulated. This detection operation may be repeated until the detection accuracy reaches a predetermined rate.
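The three stopping criteria above (fixed number of repetitions, accumulated amount of training results, target accuracy) can be combined in a simple training loop. The sketch below is illustrative; `train_one_round`, its return values, and the default thresholds are hypothetical stand-ins, not part of the disclosure.

```python
def train_until(train_one_round, max_rounds=100,
                min_results=1000, target_accuracy=0.95):
    """Repeat one training round of the damaged-box detector until any
    stopping criterion is met: max_rounds reached, min_results training
    results accumulated, or target_accuracy achieved.

    train_one_round() is assumed to return (n_new_results, accuracy)."""
    accumulated = 0
    accuracy = 0.0
    round_no = 0
    for round_no in range(1, max_rounds + 1):
        n_results, accuracy = train_one_round()
        accumulated += n_results
        if accumulated >= min_results or accuracy >= target_accuracy:
            break
    return round_no, accumulated, accuracy
```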
  • the processing module 12 receives image/video data from the sensors mounted on the vehicle 10.
  • the image/video data includes still images, video images, gray-scale images, and color images, which are captured by the camera sensors mounted on the vehicle 10.
  • the visual feature extractor 120 extracts visual features of objects shown in images included in the received image/video data.
  • the damaged box detector 121 detects damages of the objects shown in the images included in the received image/video data.
  • the damaged box detector 121 outputs labels of the damaged cargo, and shows the location of the damaged cargo by using two or more dots on a display screen. By performing the above processes, the processing module 12 may perform a determination of damaged cargos.
  • FIG. 6 shows a detailed architecture for managing a warehouse, in accordance with some other embodiments.
  • An environment for managing the warehouse may be implemented by a sensor input receiver 201, a cargo detector 202, a damaged cargo detector 203, and an alarm signal generator 204.
  • the sensor input receiver 201 receives image/video data from at least one of the sensor mounted on a mobile device 101, the sensor mounted on a vehicle 102, and the sensor mounted on a fixed structure 103 located inside of the warehouse. Then, the cargo detector 202 detects cargos shown in the images included in the received image/video data. Then, the damaged cargo detector 203 detects damages on the detected cargos in the images included in the received image/video. Then, the alarm signal generator 204 transmits an alarm signal or an alarm message to at least one of a mobile device 301, a vehicle-equipped device 302, a fixed structure 303 and the warehouse management system 20. In some embodiments, the alarm signal may be a signal for outputting an alarm sound, an alarm text, or an alarm vibration.
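The FIG. 6 chain (receiver 201 → cargo detector 202 → damaged cargo detector 203 → alarm signal generator 204 fanning out to the listed targets) can be sketched as follows. This is an illustrative sketch only; the detector callables and the alarm payload format are assumptions, not the disclosed implementation.

```python
def alarm_pipeline(frames, detect_cargos, detect_damage,
                   targets=("mobile", "vehicle", "fixed", "wms")):
    """Run received frames (201) through the cargo detector (202) and the
    damaged cargo detector (203); for each damaged cargo, generate one
    alarm per target device (204)."""
    alarms = []
    for frame in frames:                      # 201: received image/video data
        for cargo in detect_cargos(frame):    # 202: detect cargos in frame
            if detect_damage(cargo):          # 203: detect damage on cargo
                for target in targets:        # 204: fan out the alarm signal
                    alarms.append({"to": target,
                                   "cargo": cargo,
                                   "kind": "sound/text/vibration"})
    return alarms
```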
  • FIG. 7 illustrates an environment for managing a warehouse, in accordance with some embodiments.
  • the environment includes one or more mobile devices 101 and 104 having image sensor(s), an image storage (illustrated as a box under the label “Captured Images”), a server 30, and a warehouse management system 20.
  • the one or more mobile devices 101 and 104 are smartphones or handheld devices, equipped with the image sensor(s) (e.g., camera(s)).
  • the image sensors of the one or more mobile devices 101 and 104 capture images of cargos at one or more locations “A” and “B” in the warehouse, and transmit the captured images to the image storage.
  • the image sensors of the one or more mobile devices 101 and 104 include any type of visual or optical sensors, such as cameras, ultraviolet (UV) sensors, laser rangefinders (e.g., light detection and ranging (LIDAR)), infrared (IR) sensors, electro- optical/infrared (EO/IR) sensors, and so forth.
  • the image sensors of the one or more mobile devices 101 and 104 capture a plurality of images of each cargo with a plurality of viewing angles (e.g., at one or more locations “A” and “B” in the warehouse).
  • one image sensor of one mobile device may capture a plurality of images of each cargo with a plurality of viewing angles by moving around the cargo.
  • the plurality of images have an overlapped portion, and a damaged portion of a cargo or a box may be identified by checking the overlapped portion of the plurality of images to increase an accuracy of the damage detection.
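The multi-view check above, where a damage is confirmed through the overlapped portion of several images, can be sketched as a consensus over overlapping detections. The sketch below is an assumption for illustration: it represents damage regions as axis-aligned boxes (x1, y1, x2, y2) already registered into a shared coordinate frame, which the disclosure does not specify.

```python
def intersect(a, b):
    """Return the overlapped portion of boxes a and b, or None."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

def confirmed_damages(detections_per_image):
    """Keep only damage regions corroborated by an overlapping detection
    in at least one other image (view), to increase detection accuracy."""
    confirmed = []
    for i, dets_i in enumerate(detections_per_image):
        for d in dets_i:
            for j, dets_j in enumerate(detections_per_image):
                if i != j and any(intersect(d, e) for e in dets_j):
                    confirmed.append(d)
                    break
    return confirmed
```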
  • the image storage collects images captured by the image sensors of the one or more mobile devices 101 and 104.
  • the image storage is combined with the image sensors of the one or more mobile devices 101 and 104, and in some other embodiments, the image storage is separately located from the image sensors of the one or more mobile devices 101 and 104 and connected with them wirelessly. Further, the image storage transmits the collected images to the server 30 (“1”). In some embodiments, the image storage is separately located from the server 30 and connected with the server 30 wirelessly.
  • the image storage is random access memory (RAM) in accordance with a Joint Electron Devices Engineering Council (JEDEC) design such as the DDR or mobile DDR standards (e.g., LPDDR, LPDDR2, LPDDR3, or LPDDR4).
  • the image storage may be implemented via a solid-state disk drive (SSDD); via flash memory cards, such as SD cards, microSD cards, xD picture cards, and the like, and USB flash drives; or via a hard disk drive (HDD).
  • any number of new technologies may be used for the image storage in addition to, or instead of, the technologies described, such as resistance change memories, phase change memories, holographic memories, or chemical memories, among others.
  • the wireless connection between the image storage and the image sensors of the one or more mobile devices 101 and 104, or the wireless connection between the image storage and the server 30 uses any number of frequencies and protocols, such as 2.4 Gigahertz (GHz) transmissions under the IEEE 802.15.4 standard, using the Bluetooth® low energy (BLE) standard, as defined by the Bluetooth® Special Interest Group, or the ZigBee® standard, among others. Any number of radios, configured for a particular wireless communication protocol, may be also used. For example, a WLAN unit may be used to implement Wi-Fi™ communications in accordance with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard. Also, wireless wide area communications, e.g., according to a cellular or other wireless wide area protocol, may occur via a WWAN unit.
  • the server 30 performs the aforementioned operations for identifying cargos in the images and for determining whether the identified cargo has been damaged (“2”).
  • the server 30 includes any system that has a processor, such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • an image processing algorithm is used to determine whether there is a cargo or not, and whether there is a damage or not.
  • the image processing algorithm is executed to determine whether any cargo exists in the image.
  • the image processing algorithm determines whether there is any damage in the one or more cargos.
  • the server 30 is configured inside the warehouse management system 20.
  • the warehouse management system 20 receives, from the server 30, via a wired connection or a wireless connection, the results of damage determination, which includes results of the image processing performed by the server 30 (“4-1”).
  • the wireless connection is also implemented like the aforementioned wireless connection between the image storage and the image sensors of the mobile devices, or the wireless connection between the image storage and the server 30.
  • the one or more mobile devices 101 and 104 receives, from the server 30, via a wired connection or a wireless connection, the results of damage determination (“3”).
  • the warehouse management system 20 in some embodiments acquires information on cargos in the warehouse, from the one or more mobile devices 101 and 104 (“4-2”). With such information, the warehouse management system 20 may identify which cargo should be removed or should not be delivered to a customer. Thus, more efficient warehouse management would be available.
  • the warehouse management system 20 that has received the information on cargos in the warehouse may operate for evaluating each vehicle or driver by checking the moving route of the cargos. For example, if a particular vehicle/driver is frequently involved in the moving routes of damaged cargos, such a particular vehicle/driver can be set as a vehicle/driver to watch in detail.
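The vehicle/driver evaluation described above amounts to counting how often each vehicle or driver appears on the moving routes of damaged cargos and flagging frequent offenders. The sketch below is an illustrative assumption; the event record format and watch threshold are not specified in the disclosure.

```python
from collections import Counter

def drivers_to_watch(damage_events, threshold=3):
    """damage_events: iterable of (cargo_id, driver_id) pairs, one per
    damaged cargo whose moving route involved that driver/vehicle.
    Returns the sorted drivers involved at least `threshold` times."""
    counts = Counter(driver for _, driver in damage_events)
    return sorted(d for d, n in counts.items() if n >= threshold)
```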
  • FIG. 8 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • the environment includes a vehicle 10 having an image sensor, an image storage (illustrated as a box under the label “Captured Images”), a server 30, a warehouse management system 20, and a processing unit (illustrated as a box above the label “Edge Computer”).
  • the vehicle 10 is a forklift equipped with the image sensor (e.g., camera).
  • the image sensor captures images of the vehicle 10’s loading area and transmits the captured images to the image storage.
  • the image sensor includes any type of visual or optical sensors, such as cameras, ultraviolet (UV) sensors, laser rangefinders (e.g., light detection and ranging (LIDAR)), infrared (IR) sensors, electro- optical/infrared (EO/IR) sensors, and so forth.
  • the vehicle 10 includes a plate configured to be rotated; the vehicle loads a cargo on the plate, rotates the plate, and captures, while the cargo is rotating with the rotating plate, a plurality of images of the cargo by using a camera sensor mounted on the vehicle 10.
  • the plurality of images include images of a plurality of surfaces of the cargo.
  • the image storage collects images captured by the image sensor of the vehicle 10. In some embodiments, the image storage is combined with the image sensor of the vehicle 10, and in some other embodiments, the image storage is separately located from the image sensor of the vehicle 10 and connected with the image sensor of the vehicle 10 wirelessly. Further, the image storage transmits the collected images to the server 30 (“1”). In some embodiments, the image storage is separately located from the server 30 and connected with the server 30 wirelessly.
  • the wireless connection between the image storage and the image sensor of the vehicle 10, or the wireless connection between the image storage and the server 30 uses any number of frequencies and protocols, such as 2.4 Gigahertz (GHz) transmissions under the IEEE 802.15.4 standard, using the Bluetooth® low energy (BLE) standard, as defined by the Bluetooth® Special Interest Group, or the ZigBee® standard, among others. Any number of radios, configured for a particular wireless communication protocol, may be also used.
  • a WLAN unit may be used to implement Wi-Fi™ communications in accordance with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard.
  • wireless wide area communications e.g., according to a cellular or other wireless wide area protocol, may occur via a WWAN unit.
  • the server 30 performs the aforementioned operations for identifying cargos in the images and for determining whether the identified cargo has been damaged (“2”).
  • the server 30 includes any system that has a processor, such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • the processing unit performs the aforementioned operations for identifying cargos in the images and for determining whether the identified cargo has been damaged (“1-1”).
  • the image storage is combined with the server 30, shown together as the processing unit.
  • the processing unit performs the above-mentioned operations of the image storage and the server 30.
  • the processing unit includes any system that has a processor, such as, for example, an edge computer including a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • the processing unit is attached to the vehicle 10, along with the image sensor. In some other embodiments, the processing unit is configured inside the warehouse management system 20.
  • the warehouse management system 20 receives, from the server 30 or the processing unit, via a wired connection or a wireless connection, the results of damage determination, which includes results of the image processing performed by the server 30 or the processing unit (“4-1” and “2-1”).
  • the wireless connection is implemented in the same manner as the aforementioned wireless connection between the image storage and the image sensor of the vehicle 10, or the wireless connection between the image storage and the server 30.
  • the vehicle 10 receives, from the server 30, via a wired connection or a wireless connection, the results of damage determination (“3-1”), and then, checks the accuracy of the results of damage determination and transmits the checked results of damage determination to the WMS 20 (“4-3”).
  • the warehouse management system 20 acquires information on cargos in the warehouse. With such information, the warehouse management system 20 may identify which cargo should be removed or should not be delivered to a customer. Thus, more efficient warehouse management would be available.
  • FIG. 9 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • The embodiments shown in FIG. 9 are similar to the embodiments shown in FIG. 8 except that the vehicle 10 has a plurality of image sensors 100, 200 and 300 mounted on the vehicle 10.
  • By using the plurality of image sensors 100, 200 and 300, a plurality of images of the same cargo may be captured from different points of view, and by using the plurality of images, the accuracy of the operations for identifying cargos in the images and for determining whether the identified cargo has been damaged can be increased.
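  • The multi-view idea above can be sketched, purely as an illustration, as follows: each camera view yields a damage probability, and the mean score drives a single decision. The function name and the 0.5 threshold are assumptions for this sketch, not taken from the disclosure.

```python
# Minimal sketch (assumed, not the patented method) of fusing per-view
# damage probabilities from several cameras into one decision.

def fuse_damage_scores(view_scores, threshold=0.5):
    """Average per-view damage probabilities and apply a threshold."""
    if not view_scores:
        raise ValueError("at least one view is required")
    mean_score = sum(view_scores) / len(view_scores)
    return mean_score, mean_score >= threshold

# Three views of the same cargo; two cameras see the damage clearly.
score, damaged = fuse_damage_scores([0.9, 0.8, 0.4])
```

  Averaging across views makes the decision less sensitive to a single occluded or poorly lit camera, which is one plausible reason multiple sensors increase accuracy.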
  • FIG. 10 illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • The embodiments shown in FIG. 10 are similar to the embodiments shown in FIG. 9 except that a plurality of sensors 400, 500, and 600, which are mounted on a fixed structure located in the warehouse, are used for capturing images of cargos.
  • the fixed structure can be a gate-shaped structure. Accordingly, when the vehicle moving cargos passes through the gate-shaped structure, the plurality of sensors 400, 500, and 600 may capture images of the cargos.
  • FIG. 11 A illustrates an environment for managing a warehouse, in accordance with some other embodiments.
  • an edge computer included in a network server 40 may receive captured images (“2-2”), which are collected by a plurality of sensors 400, 500, and 600, which are mounted on a fixed structure located in the warehouse (“1-2”). Then, the edge computer may perform the aforementioned operations for identifying cargos in the images and for determining whether the identified cargo has been damaged (“3-2”) and may transmit the operation results to the warehouse management system 20 (“5-2”). In some embodiments, the edge computer may transmit the operation results to the vehicle 10 (“3-3”). The vehicle checks the accuracy of the operation results, and then transmits the checked results to the WMS 20 (“5-1”).
  • FIG. 11B illustrates a flowchart of a detailed process for managing a warehouse, in accordance with some other embodiments.
  • the detailed process for managing the warehouse includes collecting captured images, recognizing an object from the collected images, performing an auditing process by edge computing, detecting a damaged box, and generating an alarm signal.
  • the edge computer may be mounted on the vehicle 10 and may include any system that has a processor, such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • the edge computer mounted on the vehicle 10 may perform the operations shown in FIG. 11B, which include collecting captured images, recognizing an object, auditing processing, detecting damaged cargo, and generating an alarm signal.
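  • The staged flow of FIG. 11B can be sketched as a plain pipeline. Every stage below is a stand-in for illustration only: the real system would run trained recognition and damage-detection models, and all function and field names here are assumptions.

```python
# Illustrative sketch of the FIG. 11B edge-computing flow: collect
# captured images, recognize objects, audit, detect damaged cargo,
# and generate an alarm signal. Stage logic is a deliberate stand-in.

def collect_images(frames):
    # Drop frames the sensors failed to capture.
    return [f for f in frames if f is not None]

def recognize_objects(images):
    # Stand-in recognizer: assume each image shows one cargo box.
    return [{"image": name, "label": "box"} for name in images]

def audit(objects):
    # Stand-in audit step: keep only recognized boxes.
    return [o for o in objects if o["label"] == "box"]

def detect_damaged(objects):
    # Stand-in detector: a filename tagged "dent" counts as damaged.
    return [o for o in objects if "dent" in o["image"]]

def run_pipeline(frames):
    damaged = detect_damaged(audit(recognize_objects(collect_images(frames))))
    # Generate an alarm signal whenever any damaged cargo is found.
    return {"damaged": damaged, "alarm": bool(damaged)}

result = run_pipeline(["box_ok.jpg", None, "box_dent.jpg"])
```

  Running each stage on the edge computer keeps image data local to the vehicle or gate and sends only the small result payload to the WMS.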
  • FIG. 12 shows a flowchart for performing a machine learning process, in accordance with some embodiments.
  • the AI algorithm presented in this disclosure uses the Deep Learning method, which can be divided into two parts: forward and backward.
  • the forward pass takes the training image as input and calculates the probability value, from which the loss is calculated against the ground truth. This loss value indicates how well the currently learned model reflects the actual value.
  • in the backward pass, the back-propagation algorithm is used to adjust the weights based on the loss. These steps can be repeated to complete the learning.
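  • The forward/backward cycle above can be shown at its smallest scale. The disclosure trains a deep network; the one-parameter model y = w·x below is only an assumed toy that makes the forward (prediction and loss) and backward (gradient and weight update) steps explicit.

```python
# Toy sketch of the forward/backward training cycle using plain
# gradient descent on a one-parameter linear model y = w * x.

def train(samples, lr=0.1, epochs=100):
    w = 0.0
    for _ in range(epochs):
        for x, y_true in samples:
            # Forward: compute the prediction and the squared loss.
            y_pred = w * x
            loss = (y_pred - y_true) ** 2
            # Backward: gradient of the loss w.r.t. w, then update.
            grad = 2 * (y_pred - y_true) * x
            w -= lr * grad
    return w

# Learn y = 2x from two samples; w converges toward 2.
w = train([(1.0, 2.0), (2.0, 4.0)])
```

  Repeating the two steps drives the loss toward zero, which is exactly the "repeat to complete the learning" loop described above, just with one weight instead of millions.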
  • FIGS. 13A-14B show images showing screen images for managing a warehouse, in accordance with some embodiments.
  • On the images of cargo shown in FIGS. 13A and 14A, information on whether the cargo is damaged or not can be displayed as overlay boxes, as shown in FIGS. 13B and 14B.
  • the overlay box may be accompanied by overlay texts that include “Damaged” or “Undamaged”. Such an overlay box can be considered as a first graphic object mentioned above, which indicates a damaged status of the damaged cargo.
  • the overlay texts may be displayed in languages other than English. The overlay texts may vary depending on the type of work vehicle or cargo.
  • an icon can be displayed to indicate the damaged status of the cargo.
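  • Building such an overlay can be sketched as data assembly before rendering. The field names, colors, and localized strings below are assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical sketch of assembling the overlay annotation for one
# detected cargo: a bounding box plus "Damaged"/"Undamaged" text.

def make_overlay(bbox, damaged, language="en"):
    labels = {
        "en": {True: "Damaged", False: "Undamaged"},
        "ko": {True: "파손", False: "정상"},  # localized overlay text
    }
    return {
        "bbox": bbox,                        # (x, y, width, height)
        "text": labels[language][damaged],
        "color": "red" if damaged else "green",
    }

overlay = make_overlay((40, 60, 120, 90), damaged=True)
```

  A renderer would then draw the box and text on the camera frame; keeping the annotation as plain data makes it easy to swap languages or icons without touching the detection code.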
  • the warehouse management system can more accurately check, in real time, whether the cargo in the vehicle is damaged, and the warehouse can be controlled accordingly.
  • FIGS. 15A and 15B show images showing different screen images for managing a warehouse, in accordance with some other embodiments.
  • the damaged status of the cargo can be more specifically displayed.
  • the damaged status of a damaged cargo or box can be illustrated using a line color or line shape different from that used for a normal cargo or box.
  • a damaged portion or damaged portions can be emphasized by a different line color or a thicker line so that a user of the warehouse management system can easily identify where the damaged portion is.
  • Such a differently displayed graphic object can be considered as a second graphic object mentioned above, which indicates the identified damaged portion.
  • the colors of the overlay box indicating the damaged cargos can be differentiated according to sizes of the damages and/or seriousness levels of the damages.
  • a plurality of damaged portions of the damaged cargo may be identified, and the plurality of damaged portions may be classified according to damage type categories (e.g., categories of damage level, such as a soft damage or a serious damage, or categories of damage characteristic, such as a recoverable damage or an irrecoverable damage).
  • at least one third graphic object indicating the identified damaged portions may be displayed using texts, different colors, or different line shapes to state the classified damage type categories.
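  • One simple way to realize the category-to-style display is a lookup table. The category names follow the examples in the text (soft/serious, recoverable/irrecoverable); the colors and line shapes are assumptions for this sketch.

```python
# Illustrative mapping from damage-type categories to display styles
# for the third graphic objects; the styles themselves are assumed.

CATEGORY_STYLES = {
    ("soft", "recoverable"):      {"color": "yellow", "line": "dashed"},
    ("soft", "irrecoverable"):    {"color": "orange", "line": "dashed"},
    ("serious", "recoverable"):   {"color": "orange", "line": "solid"},
    ("serious", "irrecoverable"): {"color": "red",    "line": "solid"},
}

def style_for(level, characteristic):
    """Return the display style for a classified damaged portion."""
    return CATEGORY_STYLES[(level, characteristic)]

style = style_for("serious", "irrecoverable")
```

  Keeping the mapping in one table lets an operator re-tune colors for severity levels without changing the classifier.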
  • Embodiments described in the present disclosure can be implemented in digital electronic circuitry, in tangibly-embodied computer software or firmware, in computer hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Embodiments described in the present disclosure can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible non-transitory storage medium for execution by, or to control the operation of, data processing apparatus.
  • the computer storage medium can be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
  • the program instructions can be encoded on an artificially-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus.
  • the term “processor” or “processing unit” refers to data processing hardware and encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can also be, or further include, special purpose logic circuitry, e.g., an FPGA (field- programmable gate array) or an ASIC (application-specific integrated circuit).
  • the apparatus can optionally include, in addition to hardware, code that creates an execution environment for computer programs, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a computer program, which may also be referred to or described as a program, software, a software application, a module, a software module, a script, or code, can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a digital computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data, e.g., one or more scripts stored in a markup language document, in a single file dedicated to the program in question, or in multiple coordinated files, e.g., files that store one or more modules, sub-programs, or portions of code.
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a data communication network.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA or an ASIC, or by a combination of special purpose logic circuitry and one or more programmed computers.
  • For a system of one or more digital computers to be “configured to” perform particular operations or actions means that the system has installed on it software, firmware, hardware, or a combination of them that in operation causes the system to perform the operations or actions.
  • one or more computer programs to be configured to perform particular operations or actions means that the one or more programs include instructions that, when executed by digital data processing apparatus, cause the apparatus to perform the operations or actions.
  • Digital computers suitable for the execution of a computer program can be based on general or special purpose microprocessors or both, or any other kind of central processing unit.
  • a central processing unit will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a central processing unit for performing or executing instructions and one or more memory devices for storing instructions and data.
  • the central processing unit and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • a digital computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices.
  • Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • Control of the various systems described in this specification, or portions of them, can be implemented in a computer program product that includes instructions that are stored on one or more non-transitory machine-readable storage media, and that are executable on one or more digital processing devices.
  • the systems described in this specification, or portions of them, can each be implemented as an apparatus, method, or electronic system that may include one or more digital processing devices and memory to store executable instructions to perform the operations described in this specification.
  • Other examples of the present disclosure may include any number and combination of machine-learning models having any number and combination of characteristics.
  • the machine-learning model(s) can be trained in a supervised, semi-supervised, or unsupervised manner, or any combination of these.
  • the machine-learning model(s) can be implemented using a single computing device or multiple computing devices.
  • Implementing some examples of the present disclosure at least in part by using machine-learning models can reduce the total number of processing iterations, time, memory, electrical power, or any combination of these consumed by a computing device when analyzing data.
  • a neural network may more readily identify patterns in data than other approaches. This may enable the neural network to analyze the data using fewer processing cycles and less memory than other approaches, while obtaining a similar or greater level of accuracy.
  • Some machine-learning approaches may be more efficiently and speedily executed and processed with machine-learning specific processors (e.g., not a generic CPU). Such processors may also provide an energy savings when compared to generic CPUs.
  • some of these processors can include a graphical processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), an artificial intelligence (AI) accelerator, a neural computing core, a neural computing engine, a neural processing unit, a purpose-built chip architecture for deep learning, and/or some other machine-learning specific processor that implements a machine learning approach or one or more neural networks using semiconductor (e.g., silicon (Si), gallium arsenide (GaAs)) devices.
  • processors may also be employed in heterogeneous computing architectures with a number of and a variety of different types of cores, engines, nodes, and/or layers to achieve various energy efficiencies, processing speed improvements, data communication speed improvements, and/or data efficiency targets and improvements throughout various parts of the system when compared to a homogeneous computing architecture that employs CPUs for general purpose computing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Quality & Reliability (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Accounting & Taxation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to methods, systems, and apparatuses for managing a warehouse based on a load factor, which receive image data acquired by at least one sensor mounted on a vehicle located in the warehouse, analyze the received image data using an image data analysis model trained by machine learning, determine a loading status and a loading level of the vehicle based on the analyzed image data, and control the warehouse management system according to the determined loading status and loading level of the vehicle.
PCT/US2021/045017 2020-12-16 2021-08-06 Method, system and apparatus for warehouse management by detecting damaged cargo WO2022132239A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063126523P 2020-12-16 2020-12-16
US63/126,523 2020-12-16

Publications (1)

Publication Number Publication Date
WO2022132239A1 true WO2022132239A1 (fr) 2022-06-23

Family

ID=82057955

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2021/045017 WO2022132239A1 (fr) 2020-12-16 2021-08-06 Method, system and apparatus for warehouse management by detecting damaged cargo
PCT/US2021/044989 WO2022132238A1 (fr) 2020-12-16 2021-08-06 Method, system and apparatus for warehouse management based on a load factor

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2021/044989 WO2022132238A1 (fr) 2020-12-16 2021-08-06 Method, system and apparatus for warehouse management based on a load factor

Country Status (1)

Country Link
WO (2) WO2022132239A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102613341B1 (ko) * 2023-06-27 2023-12-14 한국철도기술연구원 Cargo recognition method and apparatus improved through deep-learning-based training data augmentation

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022116397A1 (de) * 2022-06-30 2024-01-04 Still Gesellschaft Mit Beschränkter Haftung Automatic detection of a loading state of a load carrier
DE102022116398A1 (de) * 2022-06-30 2024-01-04 Still Gesellschaft Mit Beschränkter Haftung Automatic localization of a load carrier

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7116814B2 (en) * 2002-09-27 2006-10-03 Chunghwa Telecom Co., Ltd. Image-based container defects detector
US20150254600A1 (en) * 2014-03-05 2015-09-10 Wipro Limited System and method for real time assessment of cargo handling
US20170262717A1 (en) * 2015-04-08 2017-09-14 Spireon, Inc. Camera array system and method to detect a load status of a semi-trailer truck
WO2017178712A1 (fr) * 2016-04-15 2017-10-19 Conexbird Oy Procédé, logiciel et appareil pour l'inspection de cargaisons
US9826213B1 (en) * 2015-09-22 2017-11-21 X Development Llc Generating an image-based identifier for a stretch wrapped loaded pallet based on images captured in association with application of stretch wrap to the loaded pallet

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6009572B2 (ja) * 2012-09-21 2016-10-19 日立建機株式会社 Travel management device for a transport vehicle
GB2541898B (en) * 2015-09-02 2018-09-19 Jaguar Land Rover Ltd A monitoring system for use on a vehicle
KR101815583B1 (ko) * 2016-02-29 2018-01-05 성균관대학교 산학협력단 Logistics loading assistance system using a 3D camera and a big data platform, and loading assistance method thereof
KR102187446B1 (ko) * 2019-10-28 2020-12-08 동명대학교산학협력단 Virtual reality and augmented reality-based in-vehicle cargo loading system


Also Published As

Publication number Publication date
WO2022132238A1 (fr) 2022-06-23

Similar Documents

Publication Publication Date Title
CN111461107B (zh) Material handling method, apparatus and system for identifying a region of interest
WO2022132239A1 (fr) Procédé, système et appareil de gestion d'entrepôt par détection d'une cargaison endommagée
US11526973B2 (en) Predictive parcel damage identification, analysis, and mitigation
US11752936B2 (en) Industrial vehicle feedback system
US11783568B2 (en) Object classification using extra-regional context
US10366502B1 (en) Vehicle heading prediction neural network
KR20200125731A (ko) 객체 검출 및 특성화를 위한 뉴럴 네트워크들
US20200109963A1 (en) Selectively Forgoing Actions Based on Fullness Level of Containers
US11567197B2 (en) Automated object detection in a dusty environment
US10630944B2 (en) Method and system for door status detection and alert generation
US20210009365A1 (en) Logistics Operation Environment Mapping for Autonomous Vehicles
CN114901514A (zh) Improved asset delivery system
US20210056492A1 (en) Providing information based on detection of actions that are undesired to waste collection workers
CN111582778B (zh) Method, apparatus, device and storage medium for measuring cargo stacking in an operation yard
JP7421925B2 (ja) Information processing device, information processing method, and program
Pradeep et al. Automatic railway detection and tracking inspecting system
Hamieh et al. LiDAR and Camera-Based Convolutional Neural Network Detection for Autonomous Driving
Gunal Data collection inside industrial facilities with autonomous drones
US20210027051A1 (en) Selectively Forgoing Actions Based on Presence of People in a Vicinity of Containers
Hernandes et al. GISA: A Brazilian platform for autonomous cars trials
Khalid et al. Machine Vision-Based Conveyor and Structural Health Monitoring Robot for Industrial Application Using Deep Learning
Caldana et al. Comparison of Pallet Detection and Location Using COTS Sensors and AI Based Applications
Katsamenis et al. Real time road defect monitoring from UAV visual data sources
ABDELHAK et al. An Image Processing Approach for Real-Time Safety Assessment of Autonomous Drone Delivery
Kirci et al. EuroPallet Detection with RGB-D Camera Based on Deep Learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21907361

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.10.2023)