WO2021258195A1 - Image-based system and method for managing shipping containers by edge computing - Google Patents

Image-based system and method for managing shipping containers by edge computing

Info

Publication number
WO2021258195A1
Authority
WO
WIPO (PCT)
Prior art keywords
container
cameras
shipping
images
yard
Prior art date
Application number
PCT/CA2021/050849
Other languages
English (en)
Inventor
Jennifer IVENS
Bruce IVENS
Original Assignee
Canscan Softwares And Technologies Inc.
Priority date
Filing date
Publication date
Application filed by Canscan Softwares And Technologies Inc. filed Critical Canscan Softwares And Technologies Inc.
Publication of WO2021258195A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B61RAILWAYS
    • B61LGUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L25/00Recording or indicating positions or identities of vehicles or trains or setting of track apparatus
    • B61L25/02Indicating or recording positions or identities of vehicles or trains
    • B61L25/04Indicating or recording train identities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Definitions

  • the present invention generally relates to the field of cargo shipping containers, and more particularly to an image-based system and method for managing transport and handling of shipping containers at terminals.
  • TOS Terminal Operating Systems
  • Known approaches include handheld OCR devices used by terminal operators, or radio-frequency tagging of containers, to track the containers within terminals.
  • existing systems are expensive to implement, poorly integrated with existing TOS, and do not allow real-time tracking of the intermodal movement of shipping containers.
  • Existing systems also fail to provide feedback, such as warnings and alerts, to operators and/or to handling equipment, in case of mishandled shipping containers.
  • an image-based automated method for tracking and managing shipping containers in a terminal yard.
  • the method comprises receiving a transit shipment plan for shipping containers arriving soon or already at the terminal yard.
  • the method also comprises receiving images from at least one of yard security cameras and traffic circulation cameras.
  • the images are associated with yard-camera-position coordinates.
  • the method may also comprise receiving additional images from cameras mounted onto container handlers. At least some of the images received show shipping containers provided with container codes.
  • the method comprises detecting, using machine-learning algorithms, the container codes from the images received from the yard security cameras, the traffic circulation cameras and from the container handler cameras.
  • the detected container codes are associated with the position coordinates.
  • the coordinates can correspond to the yard-camera-position and/or the container-handler-position coordinates.
  • the method may comprise deriving, from the yard-camera-position and/or the container-handler-position coordinates, in real time, the respective positions of the shipping containers imaged by the cameras.
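Container identification codes of the kind recognized above are standardized by ISO 6346 (four owner/equipment letters, six serial digits, one check digit), so a recognized code can be sanity-checked before it is associated with position coordinates. A minimal validation sketch in Python; the function names are illustrative and not part of the patent:

```python
# ISO 6346 letter values: multiples of 11 (11, 22, 33) are skipped.
LETTER_VALUES = dict(zip(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
    [10, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 23, 24, 25, 26,
     27, 28, 29, 30, 31, 32, 34, 35, 36, 37, 38],
))

def check_digit(code: str) -> int:
    """Compute the ISO 6346 check digit over the first 10 characters."""
    total = sum(
        (LETTER_VALUES[ch] if ch.isalpha() else int(ch)) * (2 ** i)
        for i, ch in enumerate(code[:10].upper())
    )
    return total % 11 % 10

def is_valid_container_code(code: str) -> bool:
    """True if an 11-character code (e.g. 'CSQU3054383') self-validates."""
    code = code.replace(" ", "").upper()
    return (len(code) == 11 and code[10].isdigit()
            and check_digit(code) == int(code[10]))
```

Rejecting codes that fail the check digit filters out most single-character OCR misreads before a code is logged or compared with the plan.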
  • Transit shipment instructions are sent to operator-interfaces of the container handlers, according to the transit shipment plan.
  • the positions of the shipping containers and the container codes are compared with the transit shipment plan. Discrepancies between the planned positions of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined, are identified and reported. Alerts and/or warning messages can be sent to one of the operator-interfaces when a discrepancy has been identified.
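The comparison and alerting steps above can be sketched as follows; the data shapes, coordinate units and the 15 m tolerance are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class PlannedMove:
    container_code: str
    x: float  # planned yard coordinate, metres (illustrative units)
    y: float

def find_discrepancies(plan, observed, tolerance_m=15.0):
    """Compare observed container positions (code -> (x, y)) against the
    transit shipment plan and return alert/warning messages destined for
    the operator interfaces."""
    alerts = []
    for code, (x, y) in observed.items():
        move = plan.get(code)
        if move is None:
            alerts.append(f"ALERT: container {code} is not in the transit shipment plan")
            continue
        dist = hypot(x - move.x, y - move.y)
        if dist > tolerance_m:
            alerts.append(f"WARNING: container {code} is {dist:.0f} m from its planned position")
    return alerts
```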
  • some of the images received show railcars provided with railcar codes.
  • the railcars transport shipping containers that need to be loaded onto transport trucks by the container handlers.
  • the method further comprises the detection, using the machine-learning algorithms, of railcar codes and the identification of discrepancies between the planned position of a given container on a railcar, as per the transit shipment plan, and the actual position of said shipping container relative to the railcar, as previously determined.
  • images showing railcars provided with railcar codes are associated with a main rail track or with one of several spur rail tracks.
  • the position of the railcars relative to the main and spur rail tracks of the terminal yard are compared with the transit shipment plan, and discrepancies between the planned position of a railcar, as per the transit shipment plan, and its actual position are identified and reported.
  • track identification codes and dimensions of the railcars imaged by the cameras are detected, using machine-learning algorithms.
  • records associating railcar identification codes, rail track identification codes, the position of a given railcar on the main or spur rail tracks, the dimensions of a given railcar, and shipping container codes are continuously logged, in real time.
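A continuously logged record of this kind might look like the following sketch; the field names are hypothetical, and an in-memory list stands in for the terminal database:

```python
import time

def log_record(db, railcar_code, track_code, railcar_position_m,
               railcar_length_m, container_codes):
    """Append one timestamped record associating a railcar identification
    code, a rail track identification code, the railcar's position on the
    track, its dimensions, and the container codes it carries."""
    record = {
        "ts": time.time(),
        "railcar_code": railcar_code,
        "track_code": track_code,
        "railcar_position_m": railcar_position_m,
        "railcar_length_m": railcar_length_m,
        "container_codes": list(container_codes),
    }
    db.append(record)
    return record
```

A TOS application can then scan `db` to verify, confirm and report container movements against the plan.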
  • the Terminal Operation System application can access the records to verify, confirm and report shipping container movements, in real time, versus the transit shipment plan.
  • the images from the cameras mounted onto the container handlers are sent to Edge Processing Devices and the detection of the container codes is performed locally, by the Edge Processing Devices.
  • image processing is performed remotely, through Cloud-Based Processing Servers, via the Edge Processing Devices.
  • a combination of local and cloud-based processing is also possible.
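The split between local and cloud-based processing can be decided per frame. A minimal routing heuristic, offered as an illustrative sketch (the queue-depth policy is an assumption, not the patent's specified behaviour):

```python
def route_inference(link_up, edge_queue_depth, max_queue=8):
    """Choose where code recognition runs for a frame: locally on the
    Edge Processing Device, or on the Cloud-Based Processing Servers."""
    if not link_up:
        return "edge"   # no connectivity: must process locally
    if edge_queue_depth >= max_queue:
        return "cloud"  # local inference backlog: offload to the cloud
    return "edge"
```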
  • the container handlers may comprise one or more intermodal container cranes, such as Rubber Tyre Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes, and one or more mobile intermodal container handlers, such as stackers and lifter vehicles.
  • the method comprises sending digital instructions to an on-machine controller such as a Programmable Logic Controller (PLC) of the container handlers to control operation thereof, for example to prevent clamping, lifting or moving a given shipping container determined as incorrectly selected, following the comparison of the shipping container’s position with the transit shipment plan, based on the discrepancies identified between the planned position of the shipping container being handled, as per the transit shipment plan, and the actual position of said shipping container.
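The interlock behaviour described above can be sketched as a pure decision function; the message fields are illustrative, since a real integration would map the enable flag onto the PLC's own protocol (e.g. discrete I/O or fieldbus registers):

```python
def interlock_decision(recognized_code, assigned_code):
    """Decide whether the container handler's PLC may proceed with
    clamping/lifting, based on the code read from the container actually
    in front of the handler versus the code assigned by the plan."""
    if recognized_code == assigned_code:
        return {"lift_enable": True, "reason": "container code matches plan"}
    return {"lift_enable": False,
            "reason": f"expected {assigned_code}, read {recognized_code}"}
```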
  • comparing the positions of the shipping containers with the transit shipment plan can be performed by the terminal Central Processing Device and communicated to the Edge Processing Devices.
  • the method may also comprise assessing a physical condition of the shipping containers by processing the images from the yard security cameras and/or the container-handler-mounted cameras through convolutional neural network algorithms, to detect defects on the containers, such as rust, deformations, and the state of the container’s handles and security seals.
  • physical condition of containers is stored in a database, for access by the TOS and/or by the Central Processing Device.
  • a system for implementing the method described above comprises a Terminal Operation System (TOS) interface for receiving the transit shipment plan for shipping containers arriving soon or already at the terminal yard.
  • the system also comprises a plurality of Edge Processing Devices, each comprising one or more processors, memory, a communication module and a position module.
  • the Edge Processing Devices are configured to receive images from cameras mounted onto container handlers.
  • the system also comprises a Central Processing Device with one or more processors, memory/storage means and a communication module.
  • the Edge Processing Devices are configured to detect and recognize, using machine-learning algorithms, container identification codes from the images received from the yard security cameras, the traffic circulation cameras and the container handler cameras, and to associate the detected container codes with the yard-camera-position and/or container-handler-position coordinates, so as to determine, in real time, the positions of the shipping containers imaged by the cameras.
  • the Central Processing Device and/or the Edge Processing Devices are further configured to send transit shipment instructions to operator-interfaces of the container handlers according to the transit shipment plan; and to compare, during execution of the transit shipment plan by the container handlers, the positions of the shipping containers with the transit shipment plan and identify discrepancies between a planned position of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as determined.
  • the Central Processing Device sends, directly or via the Edge Processing Devices and/or the TOS, an alert or a warning message to one of the operator-interfaces when a discrepancy has been identified.
  • the Central Processing Device and/or the Edge Processing Devices comprise a Container Yard Video Management System interface, to access the images from the yard security cameras.
  • the system comprises a cloud-based shipping container ID and Inspection module.
  • the Central Processing Device and/or the Edge Processing Devices have access to the cloud-based module, such that the images received can be processed remotely with trained machine-learning algorithms and models.
  • the Edge Processing Devices may include one or more video cameras within their enclosures, or alternatively, the Edge Processing Devices can communicate, via wired or wireless communication links, with the cameras mounted onto the container handlers.
  • the communication modules of the Edge Processing Devices may include one or more of the following interfaces: a Wi-Fi interface, a Bluetooth interface, a 4G or 5G interface.
  • the system is adapted to interface with video cameras having an embedded Graphical Processing Unit (GPU) and/or Tensor Processing Unit (TPU).
  • An object detection algorithm (such as container detection) can be executed by the embedded GPU and/or TPU in the camera, which converts the video stream to frames. In such implementations, only the images containing containers are then sent for container identification code recognition and/or defect detection.
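The on-camera filtering step can be sketched as follows, with a stub predicate standing in for the detector model running on the embedded GPU/TPU:

```python
def filter_frames(frames, detect_container):
    """Keep only the frames in which the on-camera detector found a
    container; only these are forwarded for code recognition and/or
    defect detection, reducing upstream bandwidth and compute.

    `detect_container` stands in for a real on-device object detection
    model (an assumption for illustration)."""
    return [frame for frame in frames if detect_container(frame)]
```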
  • the Edge Processing Devices include a PLC interlock control module, to communicate instructions and statuses to PLCs of the container handlers.
  • the Central Processing Device and/or the Edge Processing Device preferably both comprise a shipping container status module to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers.
  • the system may include its own databases to store records associating the railcar identification codes, track identification codes, position of railcars, dimensions of railcars; and shipping container codes.
  • the system can access a third party database to access, update and/or retrieve railcar and shipping container related records.
  • a non-transitory storage medium for storing executable instructions for causing one or more processing devices to perform the steps of the method described above.
  • FIG. 1 is a block diagram of possible components of an image-based system for shipping container management, and of their interactions, according to a possible implementation.
  • FIG. 2 is a schematic diagram of possible components of an image-based system for shipping container management, according to a possible implementation.
  • FIG. 3 is a schematic diagram showing components of the system of FIG. 2, for tracking railcars and shipping containers mounted thereon.
  • FIG. 4 is a schematic diagram, showing cameras positioned to capture images of railcars transporting shipping containers, as they are being shunted from the main rail track to the spur rail track.
  • FIG. 5 is a schematic diagram, showing cameras positioned on RTGs or other handling vehicles to capture images of railcars moving on terminal rail tracks, from which containers are moved to and from hauling trucks.
  • FIG. 6 is a table summarizing the different steps for mapping rail lines, rail cars and shipping containers in a terminal.
  • FIG. 7 is a schematic diagram of components of the system of FIG.2, for a Rubber Tyred Gantry Crane, according to an exemplary implementation.
  • FIG. 8 is a schematic diagram of possible steps of the image-based shipping container management process, for the detection of shipping container codes, including an audio or visual operator interface within the container handling equipment.
  • FIG. 9A is a flow chart of an exemplary process for detecting a shipping container code displayed vertically on the side of the container.
  • FIG. 9B is a functional diagram of modules for shipping container code recognition and seal detection and condition assessment, according to an exemplary implementation.
  • FIG. 10 is a schematic diagram of components of the system of FIG. 1, for stacking equipment, according to an exemplary implementation.
  • FIG. 11 is a schematic diagram showing container code validation by a container handling vehicle operator, following the AI-based container code recognition by the shipping container management system.
  • the proposed system and its associated method relate to an image-based, automated method for tracking and managing shipping containers in a terminal.
  • the system and method use machine-learning algorithms to recognize and detect shipping container and railcar identification codes from images captured by yard security cameras, traffic circulation cameras and/or cameras mounted onto container handlers, and associate the detected codes with position coordinates, to track in real time, the position of the shipping containers imaged by the cameras. Tracking of the shipping containers can be compared with transit shipment plans, to make sure shipping containers transit within the terminal as planned.
  • the proposed system and method have been developed to provide an organized large-scale management of container movement and storage on a container terminal site.
  • the system comprises several compact mobile machine vision and artificial intelligence devices, also referred to as edge processing devices.
  • the edge processing devices are designed for applications such as container handling vehicles. They detect and recognize, using AI algorithms, the alphanumeric characters which may be painted, stencilled or posted (as on warning signs) on the surfaces of stationary or moving shipping containers and railcars.
  • the proposed system also comprises a central processing device that may be integrated with other supervisory platforms, such as a Terminal Operating System (TOS) and Terminal Security Management (TSM) applications, to exchange data that will improve efficiency of movement and will assemble progress updates to ensure loading plans are being followed or corrected, as necessary.
  • the proposed system is designed to use a minimum level of electrical power which permits its application in installations that are remote and must operate using energy developed and stored locally.
  • Possible implementations of the system include attachment of edge processing devices to heavy mobile equipment working in shipping container depots and to fixed installations monitoring rail traffic entering and parking on yard loading spur tracks.
  • the detection and inspection capacity of the edge processing device can be selected to meet the specific needs of terminals and may vary from simple code recognition and container tracking to complex damage detection and classification tasks.
  • the system comprises a combination of edge processing/computing devices and web service platforms.
  • the selection of the system’s configuration will vary according to the terminal site infrastructure opportunities for data communication, and interface needs with other installations and third-party platforms.
  • the edge processing devices are provided with communication means, such as Wi-Fi or digital cellular communication, and where data is considered sensitive, encryption techniques can be applied.
  • a typical container depot or rail yard is managed through a central Terminal Operating System (TOS).
  • the proposed system can be integrated with the TOS to provide a two-way exchange of data which permits the TOS to verify, confirm and report actual events versus original plans generated for rail car and container deliveries.
  • the proposed system can also be integrated with a site security management system.
  • Video and image frames can be directed to the edge processing devices, to provide data that can be used to identify the position of containers and equipment in real time.
  • the system 100 comprises a central processing device 200 and a plurality of satellite edge processing devices 300, which may be mobile or fixed.
  • the central processing device 200 can consist of one or more computers or servers, provided with processors, memory and communication interfaces.
  • the central processing device may reside locally, at the terminal site, or remotely, in a cloud-based implementation.
  • a software application runs on the central processing device. The software application communicates with and controls the plurality of edge processing devices 300.
  • the terminal staff receive various types of communication from freight companies, providing instructions related to the delivery of shipping containers scheduled to pass through the terminal. These instructions are combined and converted into an electronic file format such as CSV, which is then entered into the TOS as a portion of the overall transit shipment plan (C).
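A CSV-based transit shipment plan of this kind could be ingested as follows; the column names are hypothetical, since real TOS schemas vary:

```python
import csv
import io

def parse_transit_plan(csv_text):
    """Parse a transit shipment plan CSV (as entered into the TOS) into
    a list of single-container move instructions, one dict per row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

# Hypothetical plan excerpt for illustration.
PLAN_CSV = """container_code,railcar_code,dest_x,dest_y
MSCU1234565,RC4401,120.5,33.0
TGHU9876543,RC4401,122.0,33.0
"""
```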
  • execution of the plan within the boundaries of the terminal yard results in a series of simple single-container move communications between the central processing device, the TOS and one of the field edge processing devices.
  • An example could be an instruction, sent as a digital message, from the TOS, via the central processing device to an edge processing device, to find a specific container and to move it to a defined coordinate near the dock.
  • the individual shipping container moves, or displacements, are part of the overall transit shipping plan which at this point is only concerned with the terminal yard.
  • the shipping instructions relate to terminal yard operation only and include information such as the container identification code, related railcar and delivery coordinates within the yard boundaries.
  • the TOS also typically has access to the Container Yard Inventory 16, which lists all shipping containers on site, and the different container handling equipment and vehicles available. Under the management of the TOS and the central processing device 200, the coordinates of all onsite units (including for example, shipping containers and container handlers) are logged and stored in a database or similar system, for quick reference (B).
  • the central processing device 200 and/or the edge processing devices can also communicate with the Terminal Security Management application 20 (identified in FIG. 1 as a Container Yard Video Management System).
  • the application 20 receives, from various yard security cameras, images from different regions of the terminal yard, which may or may not contain containers.
  • the Central processing device comprises a Container Yard Video Management System interface, to access the images from the yard security cameras and/or additional data, such as the position and/or state (on/off, defective, etc.) of the cameras.
  • the TOS and/or the Container Yard Video Management System interfaces can for example include APIs and/or webservices used to communicate and exchange data with the TOS and Video Management System.
  • the TOS manages the distribution of work tasks to all site container transporting equipment, via the central processing device.
  • the container transporting equipment are equipped with an edge processing device 300, which can communicate with the central processing device and preferably, with one another.
  • the container move instructions are sent to operator interfaces, and advantageously, alerts and messages are also sent to the operator interfaces, via the edge processing devices, to inform operators, in real time, when there are discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container.
  • the edge processing devices 300 are modules or units, provided with one or more processors, memory means and communication modules.
  • the edge processing device 300 communicates with one or more cameras 380, which can include security yard cameras, traffic circulation cameras, cameras mounted onto container handlers, or their own dedicated camera(s), either directly or via the Video Management System.
  • the communication modules may include one or more of the following interfaces: a Wi-Fi interface, a Bluetooth interface, a 4G or 5G interface, to communicate with the central processing device, the cameras located close to the edge device and/or with other edge processing devices.
  • the edge processing devices have access to image data from onsite cameras.
  • the edge processing devices 300 communicate with the TOS, either directly or via hubs or other processing devices which may act as hubs.
  • the edge processing devices 300 can also return data relating to the progress and accuracy of the executed portions of the plan (E).
  • Yard security cameras can also send video images to the video management system 20, and the yard security video images can be used by the central processing device to track shipping containers on site (G, H).
  • the plurality of edge processing devices each comprise one or more processors, memory, a communication module and a position module. Edge processing devices are thus configured to receive images from different types of cameras, including yard security cameras and traffic circulation cameras, and cameras mounted onto container handlers. The images are associated with yard-camera-position and/or the container-handler-position coordinates.
  • the central processing device can also associate the images and positional coordinates in the yard, to derive the positions of the containers detected in the images. At least some of the images show the shipping containers provided with container codes.
  • the images can be analysed by the edge processing devices to first detect shipping containers in the images or video frames (as indicated by module 366 on the left side of FIG. 9B, by recognizing the shape of containers), and then detect container codes (with container identification module 362).
  • the container code is associated with the coordinates (such as GPS coordinates or other position indicator) of the camera that captured the image.
  • the association of the shipping container code with a position in the terminal yard can be stored temporarily by the edge processing device and transmitted to the TOS which can update its own database.
  • the respective positions of all shipping containers imaged by the cameras, for which the container code has been successfully detected, can thus be derived from the captured images.
  • no additional or specialized cameras are needed, as images captured by existing security and/or traffic cameras can be used and their content is leveraged for tracking in real-time, the positions of the shipping containers on terminal sites.
  • the edge processing devices and/or the cloud-based application can interface with video cameras having an embedded Graphical Processing Unit (GPU) and/or Tensor Processing Unit (TPU).
  • TPUs are AI accelerator application-specific integrated circuits (ASICs) specifically adapted and configured for neural network machine learning.
  • the system 100 can be designed and adapted to interface and/or integrate such cameras.
  • An object detection algorithm (such as container detection) is executed by the embedded GPU and/or TPU in the camera, which converts the video stream to frames.
  • only the images having containers therein can be sent to the cloud-based container identification code module 362 and/or defect detection module 266 for further analysis.
  • the central and/or the edge processing devices 200, 300 have access to cloud-based AI algorithms 400, that can process the images captured by the various cameras, proceed with container and/or railcar identification code recognition, and return the detected identification code and the associated position coordinate, based on either metadata tagged within the image or GPS coordinates provided by the edge processing devices (I).
  • the container and railcar code detection can be performed entirely locally, at the terminal, by the edge processing devices, or partly by the edge processing device, partly by the central processing device.
  • the TOS can be configured to send the transit shipment instructions to operator- interfaces (e.g. graphical user interfaces) of the container handlers according to the transit shipment plan, via the central processing device 200.
  • the central processing device 200 can be configured to provide text alarms during execution of the transit shipment plan by the container handlers. For example, should the handler move to the instructed coordinates but be unable to find the container number identified for that coordinate, a text alarm would be sent for assistance.
  • the comparison of the shipping containers positions and codes with the transit shipment plan can be performed by the central processing device and/or by the edge processing devices. Discrepancies between a planned position of one of the shipping containers, as per the transit shipment plan, and the actual position of said shipping container, as previously determined, can be identified by the central processing device 200 or edge processor. Discrepancies can include a difference between a container code in the transit shipment plan and the container code being handled or imaged.
  • Discrepancies can also include a difference between a container’s expected position as per the transit shipment plan and the actual position of the container derived from the yard-camera-position and/or the container- handler-position coordinates.
  • the actual position of the container can correspond to one of the yard-camera-position and/or the container-handler-position coordinates, or to an average of the yard-camera-position and the container-handler-position coordinates.
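This position-derivation rule (use either coordinate source, or average both when both observed the container) can be sketched as:

```python
from typing import Optional, Tuple

Point = Tuple[float, float]

def fuse_position(yard_cam_xy: Optional[Point],
                  handler_xy: Optional[Point]) -> Point:
    """Derive a container's actual position from whichever sources are
    available: the yard-camera-position, the container-handler-position,
    or the unweighted average of both (a real system might instead
    weight by source accuracy)."""
    points = [p for p in (yard_cam_xy, handler_xy) if p is not None]
    if not points:
        raise ValueError("no position observation available")
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))
```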
  • the edge processing devices 300 are designed as secured enclosures, housing the one or more processors 310, memory means (RAM, ROM, flash, etc.) 320, communication interfaces 330 and position module (GPS or DGPS) 352.
  • the edge processing devices preferably connect to the terminal cameras and/or to the terminal’s video management system to process the images for container code recognition and physical condition assessment. In possible embodiments, at least some of the edge processing devices may comprise one or more video cameras within their enclosure.
  • the edge processing devices 300 run a software application 360 that includes or accesses a container identification module 362 (and optionally, a railcar identification module), and preferably, an audio or graphical operator’s interface module 364, to be able to send container ID and/or container positions thereto.
  • the software application 360 comprises a shipping container status module to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers. The different shipping container statuses are logged and stored in a database.
  • the cameras 380 are either fixed cameras, positioned on posts at strategic locations in the terminal yard, such as near the main rail track and spur rail tracks; on crane structures, such as Rubber Tyred Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes 42; or on container handling vehicles, such as lifters and stackers 44.
  • the edge processing devices can communicate, via wired or wireless communication links, with the cameras mounted onto the container handlers.
  • the container handler vehicle and/or cranes are typically provided with Programmable Logic Controllers (PLCs) 46, and, in some implementations, the edge processing device comprises a PLC interlock control module 350 to communicate with the PLCs.
  • Operators of the container handlers can have access to an operator interface, controlled by the TOS, the central processing device and/or the edge processing devices, that can display or provide audio indications regarding shipping container statuses (such as their actual positions and code) and discrepancies between the transit shipment plans and the actual positions of the shipping containers.
  • the central processing device 200 and/or the edge processing devices 300 may therefore comprise, in possible implementations, a software module with processor- executable instructions to provide shipping container statuses, warnings and/or transit shipment instructions to operator-interfaces of the container handlers.
  • the central processing device 200 can be provided in a computer room, with other servers of the terminal, hosting the TOS 10 and Security Management System 20, but as explained previously, the central processing device may also be part of a cloud-based server farm, remote from the terminal.
  • the Central processing device comprises one or more processors 210, data storage and memory means 220, and communication modules 230, as well as the TOS and TMS interfaces 250, 256.
  • the Central processing device runs a shipping container management software application, and may include a loading and unloading planning module, a container and railcar position tracking and management module 264, a container health assessment module 266 and a graphical user interface 268.
  • the proposed system can be used at intermodal container terminals to track and control the selection and transport of shipping containers between railcars sitting on terminal rail spur tracks, the depot yard storage piles and the city/yard trucks.
  • the loading and unloading of railcars becomes more organized, accurate, efficient and safe, and achieves a higher rate of transfer.
  • the edge processing device 300 communicates with first and second video cameras 380i, 380N, where one is oriented such that its field of view captures moving rail cars 70 and the rail car identification code 72, while the second camera is oriented to detect shipping containers 60 with the shipping containers identification code 62.
  • other camera configurations are possible, including for example a single camera capturing both the rail cars and shipping containers transported thereon, as in FIG. 4.
  • the images are sent from the cameras to the edge processing device, via a wired and/or wireless connection, such as Wi-Fi connection, for example.
  • the edge processing device 300 can access, via its communication module, remote machine-learning algorithms 400 that have been previously trained to detect rail cars, shipping containers on rail cars, shipping containers, and shipping container codes and rail car codes. Alternatively, where cloud-based access is not allowed from the terminal site for security reasons, the trained AI algorithms can be stored and executed by the edge processing device 300, or by the central processing device (not shown in FIG. 3).
  • a confirmation request can be sent to an operator’s interface, such as to a rail operations checker’s interface (on a tablet, smart phone, laptop or the likes), to confirm the rail car and/or the shipping container code.
  • This feedback can be used to further train the AI algorithms, so as to improve the accuracy of the AI code-detection algorithms.
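As an illustrative aside, not part of the disclosure: standard shipping container codes carry an ISO 6346 check digit, so a recognized code can be sanity-checked before a confirmation request or retraining feedback is generated. The function names below are assumptions for the sketch; only the check-digit algorithm itself comes from the ISO 6346 standard.

```python
def iso6346_check_digit(code: str) -> int:
    """Compute the ISO 6346 check digit for the first 10 characters
    of a container code (4 letters + 6 digits)."""
    # Letter values start at A=10 and skip multiples of 11 (11, 22, 33).
    values, v = {}, 10
    for ch in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
        if v % 11 == 0:
            v += 1
        values[ch] = v
        v += 1
    total = 0
    for i, ch in enumerate(code[:10].upper()):
        val = values[ch] if ch.isalpha() else int(ch)
        total += val * (2 ** i)  # each position is weighted 2^i
    return total % 11 % 10

def is_valid_container_code(code: str) -> bool:
    """True when the 11th character matches the computed check digit."""
    code = code.replace(" ", "").upper()
    return len(code) == 11 and int(code[10]) == iso6346_check_digit(code)
```

A single-character misread such as "CSQU3054388" fails the check (the correct code is "CSQU3054383"), so the device could re-image or escalate to the operator only when the check digit disagrees.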
  • the final code determination is sent to the TOS’ database, such as in the terminal yard inventory database, via the central processing device.
  • an alert can be sent to the TOS system, and/or to operator’s interfaces, via the central processing device 200 or the edge processing devices 300.
  • the images received at the edge processing devices 300 show railcars provided with railcar codes.
  • the railcars transport shipping containers that need to be loaded onto transport trucks by the container handlers.
  • the edge processing devices 300 can recognize, using the machine-learning algorithms, the railcar codes.
  • the railcar codes, along with the shipping container codes, can be stored locally; in cloud-based servers; or in a central database accessible to the central processing device 200. Positions of the railcars can also be determined by the edge processing devices 300.
  • the edge processing devices and/or the central processing device are configured to identify discrepancies between a planned position of a given container on a railcar, as per the transit shipment plan, and the actual position of said shipping container relative to a railcar, as previously determined.
  • a software application preferably executed by the central processing device, compares the matching of containers and railcars, as per the transit shipment plan, with the actual position of containers on railcars, and detects any difference, either in the container identification or the railcar identification.
  • An alert or message can be sent to a graphical user interface, accessible via the central processing device, the edge processing devices or other devices, such as smart phones or tablets carried by terminal checkers or container handler operators.
  • the comparison of the positions of the shipping containers with the transit shipment plan is thus preferably performed by the central processing device 200 and communicated to the edge processing devices 300, which can send the information to operators and/or terminal checkers user interface.
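A minimal sketch of this comparison, assuming the transit shipment plan and the observations are both reduced to container-code → railcar-code mappings (the function name, data shapes and the "NOT_SEEN" marker are illustrative, not taken from the disclosure):

```python
def find_discrepancies(plan: dict, observed: dict) -> list:
    """Compare the transit shipment plan (container code -> planned railcar)
    with the observed positions (container code -> detected railcar).
    Returns one record per mismatch, suitable for an operator alert."""
    alerts = []
    for container, planned_railcar in plan.items():
        actual_railcar = observed.get(container)
        if actual_railcar is None:
            alerts.append((container, planned_railcar, "NOT_SEEN"))
        elif actual_railcar != planned_railcar:
            alerts.append((container, planned_railcar, actual_railcar))
    return alerts

plan = {"MSKU1234565": "TTX95001", "CSQU3054383": "TTX95002"}
observed = {"MSKU1234565": "TTX95001", "CSQU3054383": "TTX95007"}
# CSQU3054383 was loaded on the wrong railcar, producing one alert record.
```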
  • the first step of container loading management begins with fixed cameras 380i and 380N, positioned to capture rail cars 70 and shipping containers 60 moving along the main rail track 74 (or main line) as they are shunted into positions on the terminal rail spur tracks 76.
  • the edge processing device (not shown in FIG. 4) is designed and configured to contribute to a rail cars management database, which can be accessed by the central processing device 200 and/or by the TOS.
  • the edge processing device detects the presence of moving rail cars 70, and recognizes the identification code 72 of the rail cars.
  • a log is created of each rail car 70 and its position on one of the multiple spur tracks.
  • a single track is typically employed which then, by way of a switch-track, directs rail cars onto any of the several spur tracks 76.
  • the edge processing device 300 dedicated to rail car identification is positioned with one camera 380i imaging in profile all railcars and containers passing, and the second camera 380N facing the multi-track layout and imaging the cars as they pass onto one of the spur tracks 76. Where yard and track layouts have obstacles to viewing, additional cameras may be required to circumvent data loss.
  • the car numbers 72, car sequence and track selection are held in the database so that all cars can be identified by their rail car code and position on a track.
  • the images showing railcars provided with railcar codes are associated with a main rail track or with one of several spur rail tracks.
  • the central processing device and/or the edge processing devices can be configured to compare, during execution of the transit shipment plan, the position of the railcars relative to the main and spur rail tracks of the terminal yard, with the transit shipment plan, and identify discrepancies between a planned position of one of the railcars, as per the transit shipment plan, and the actual position of said railcar.
  • the position of the railcars relative to other railcars on the main and/or spur rail tracks can also be detected, and discrepancies can be identified, including for example the order of the railcars on the main and/or spur tracks.
  • the edge processing devices can detect, using machine learning algorithms (executed locally or via cloud-based servers), track identification codes and dimensions of the railcars imaged by the cameras.
  • the central processing device 200 logs, in real time, records associating railcar identification codes with track identification codes; the position of a given railcar on the main or spur rail tracks; the dimensions of a given railcar; and shipping container codes with railcar identification codes and/or track identification codes.
  • the Terminal Operation System application can access the records to verify, confirm and report shipping container movements, in real time, versus the transit shipment plan.
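The record just described can be pictured as a simple schema; the class and field names below are a sketch under assumed names (a deployed system would write to the TOS database rather than an in-memory list):

```python
from dataclasses import dataclass, field
import time

@dataclass
class YardRecord:
    """One real-time log entry associating a railcar with a track,
    a position and the container codes detected on it."""
    railcar_code: str
    track_code: str
    position_m: float          # position along the track, in metres
    container_codes: list
    timestamp: float = field(default_factory=time.time)

log = []  # in-memory stand-in for the terminal yard inventory database

def log_record(railcar, track, position_m, containers):
    """Append a record so the TOS can verify movements against the plan."""
    rec = YardRecord(railcar, track, position_m, list(containers))
    log.append(rec)
    return rec
```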
  • the process can also be performed when shipping containers are being unloaded from hauling trucks 80 to terminal rail tracks (or rail lines).
  • Video camera 380iii captures images of the truck loading station and of arriving hauling trucks, while video camera 380iv captures images of the different terminal lines 78.
  • rail car codes and shipping container codes are continuously compared to the transit shipment plans, and discrepancies can be identified and notified in real time, via graphical or audio interfaces intended for terminal operators.
  • FIG. 6 shows a table summarizing the different steps of the rail car, shipping container and hauling truck mapping process, where export shipping containers and rail car numbers are identified, and where the rail cars are associated to a rail line. Using positional coordinates from the edge processing devices and/or from the cameras, the position of the rail cars within the terminal can be determined.
  • Container Loading and Unloading Management System for Shipping Containers Handling Vehicles and Cranes
  • Shipborne containers arriving at a maritime port may contain valuable import goods with an origin in a distant foreign country.
  • the shipping containers are offloaded and parked in the terminal yard until arrangements can be made for their delivery to the final customer.
  • the arrangements require that the rail transporter prepare a transit shipment plan that schedules rail car deliveries, and the containers onboard, to a depot/terminal conveniently located for the final miles of the delivery. It is therefore important that the selection of containers by the terminals’ machine operators be executed exactly as planned; otherwise the container will be delivered to the wrong destination.
  • a camera 380i is mounted to the crane structure in a position where a container, delivered by terminal trailer from its parked position in the yard, sits beneath the crane waiting to be lifted and loaded onto a rail car. From that position, the container identification code can be imaged, and the loading positioning details relating to the container are retrieved by (or pushed to) the edge processing device 300, via a communication module 330, such as a compact industrial cellular router, for example.
  • a camera 380N is also placed on the RTG structure in a position to image the area across all of the multiple rail tracks, which provides the edge processor 300 with real-time data relating to the efforts of the RTG operator to load containers onto a railcar sitting on one of the multiple tracks.
  • the edge processing device 300 is housed within a suitable enclosure on the crane, the enclosure including a power module, one or more processors, a GPS module, a 4G, 5G or next-generation cellular module and a Wi-Fi communication module.
  • the edge processing device may also include a human interface screen.
  • This edge processing device 300 can direct the movement of containers, using data stored in the database, to maximize container throughput volume, by sending corresponding instructions to operators via the operator-interfaces of the container handlers.
  • the container handlers may thus encompass not only trucks and railcars, but also one or more intermodal container cranes, such as Rubber Tyre Gantry (RTG) cranes and Rail Mounted Gantry (RMG) cranes, and mobile intermodal container handlers, such as stackers and lifter vehicles.
  • some of the edge processing devices may have a fixed location, but they can also be “mobile” edge processing devices, when provided on moving vehicles.
  • Control of the loading operation relies on two sets of data, the first being the loading instructions, electronically delivered to the operating RTG’s edge processing device 300 by the TOS system, directly or via the central processing system 200, and which may appear on the operator-interface of the crane control panel (as shown in FIG. 8), in the elevated control booth.
  • This edge processing device 300 supervises the loading of the rail cars, referring to the conveyed loading instructions and observing, via the captured images, the actual efforts of the RTG operator.
  • the second data set required is generated by the edge processing device located at the rail verification station, which has recorded the positions of all rail cars on the terminal tracks.
  • the crane edge processing device compares loading instructions for the transit shipping plans with real-time events and warns the operators, via the control booth interface, when attempts are being made to load containers onto an unassigned rail car.
  • control instructions can be sent from the edge processing device 300 at the crane, to PLCs of the cranes, to block operations thereof.
  • the edge processing devices send instructions to a given Programmable Logic Controller (PLC) of the container handlers to control its operation.
  • the instructions sent by an edge processing device to a given PLC can comprise instructions that prevent clamping, lifting, or moving a given shipping container determined as incorrectly selected, following the comparison of the shipping container’s position with the transit shipment plan.
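The interlock decision just described can be sketched as a small function; the command strings, the confidence input and the threshold value are illustrative assumptions, not values from the disclosure:

```python
def interlock_decision(detected_code: str, assigned_code: str,
                       confidence: float, threshold: float = 0.90) -> str:
    """Decide what command the edge device sends to the handler's PLC.
    - "ALLOW":   detected container matches the assignment with high confidence
    - "BLOCK":   a confident read that contradicts the transit shipment plan,
                 preventing clamping, lifting or moving the container
    - "CONFIRM": low-confidence read, escalate to the operator interface"""
    if confidence < threshold:
        return "CONFIRM"
    return "ALLOW" if detected_code == assigned_code else "BLOCK"
```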
  • the edge processing device 300 can detect in real time that a railcar has entered the camera field of view, record the images of rail cars and associated containers, execute edge computing applications for object detection and identification, transmit data to an on- or off-site central processing device 200 and, if so designed, transmit the data to a secure web service 400, store data related to containers, railcars and locations, and interface and integrate with the site Terminal Operating System, either directly or via the central processing device 200.
  • This is an end-to-end solution that begins with the system receiving the details related to a series of container handling movements planned through a resident computer management system and based on planned daily transportation arrangements. The electronic messages are distributed by the system to available container handling equipment and supervisory performance monitoring stations.
  • the execution of the plan is monitored by the system in real time, allowing the system to intercede at planned moments to prevent errors or to initiate changes to the plan. Confirmation of the completed tasks, including accepted changes to the plan, is returned to the resident computer management system for archival updates.
  • the edge processing device and/or the remote cloud-based platform can comprise a shipping container code recognition module 362 and a defect detection module 266.
  • smart mobile devices 300 can be used to capture and process images. Image analysis produces character identification of container and equipment serial numbers visibly scribed on surfaces of the equipment. The size, location and detail of these numbers follow codes particular to the equipment, which includes, but is not limited to, shipping containers, rail cars, rubber tyre gantry cranes, top handlers and other container handling mobile equipment.
  • the image-based automated system comprises and/or has access to a cloud-based shipping container ID 362 and Inspection module 266 (as illustrated in Fig. 9B)
  • the Central Processing Device and/or the Mobile Edge Processing Devices can access the cloud-based modules to process the images received.
  • the Central Processing Device 200 and the satellite Mobile Edge Processing Devices 300 use graphics processing units (GPUs) to perform high-speed character analysis.
  • the deep neural network (DNN)-based architecture is run using TensorFlow or TensorFlow Lite.
  • the processor identifies image frames of interest using object detection models such as Faster R-CNN. Further processing identifies areas of interest within the frames, localizing text and physical deviations such as damage.
  • Machine learning models such as Faster R-CNN and selected non-machine-learning techniques are used to select the areas.
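The area-selection step can be illustrated downstream of any detector. Assuming each detection is a `(label, score, box)` triple, roughly the shape an object-detection model such as Faster R-CNN emits, a sketch of the filtering (names and thresholds are hypothetical):

```python
def select_regions_of_interest(detections,
                               min_score=0.8,
                               wanted=("container_code", "damage")):
    """Keep only high-confidence detections of the classes of interest
    (text regions and physical deviations such as damage).
    Each detection is (label, score, (x1, y1, x2, y2))."""
    return [d for d in detections
            if d[0] in wanted and d[1] >= min_score]

detections = [
    ("container_code", 0.95, (10, 10, 200, 40)),   # kept
    ("person",         0.99, (0, 0, 50, 120)),     # wrong class, dropped
    ("damage",         0.40, (60, 60, 90, 90)),    # low confidence, dropped
]
```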
  • text may be scribed on equipment surfaces in both horizontal and vertical orientation, each with a particular characteristic such as shape and character spacing.
  • using an algorithm such as arbitrary-oriented text recognition (AON), the text orientation can be determined in an unsupervised manner.
  • Data is formulated using methods and modules such as a directional encoded mask (DEM), a selective attention network (SAN.F) or a baseline model, thereby reducing the learning time and improving accuracy.
  • Various text detection models, text recognition models and modifiers are available and are selected based on the application environment. These include Rosetta, a two-step software model for the detection and recognition of text, and StarNet, a trainable classifier that reduces the amount of training data required for new visual domains. Also available is the Efficient and Accurate Scene Text detection pipeline (EAST), a trainable algorithm that offers the advantage, for this application, of direct text detection without the typical requirement of additional sub-algorithms to aggregate and partition text. Also available is VGG, a deep convolutional neural network (CNN) that incorporates small convolutional filters allowing multiple filter layers, resulting in improved performance.
  • a rating for the container’s condition can also be determined at different stages of the container handling process, by the shipping container inspection module 266.
  • a reference database may be included as part of the back-end system 300 and/or cloud-based platform 400, to store baseline container codes, types of damages, condition ratings, and other information on standard undamaged containers for comparison purposes, etc.
  • the printed information provided on the shipping containers, e.g., codes, labels, seals and signs, is detected and recognized automatically with intelligent customized software modules and algorithms, executed locally or in the cloud. Different types of shipping container defects can be detected, including cracks, deformations and corrosion, as well as the integrity of handles and security seals.
  • the inspection module can identify the damage type and the extent of damage to corner fittings, door headers, top end rails and forklift pockets.
  • the processing steps include detecting the damaged area, then using an adaptive image-thresholding method to isolate the damaged portion at the pixel level through segmentation and area outlining within the bounding box.
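A minimal sketch of that thresholding step, assuming a local-mean adaptive threshold over a greyscale crop (a production system would more likely use a vision library such as OpenCV; this pure-Python version only illustrates the principle):

```python
def adaptive_threshold(gray, block=5, c=2.0):
    """Local-mean adaptive threshold on a 2-D list of grey levels:
    a pixel is flagged as 'damage' when it is darker than the mean of
    its block x block neighbourhood minus a constant offset c."""
    h, w = len(gray), len(gray[0])
    pad = block // 2
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, n = 0, 0
            for dy in range(-pad, pad + 1):
                for dx in range(-pad, pad + 1):
                    yy = min(max(y + dy, 0), h - 1)  # clamp at image edges
                    xx = min(max(x + dx, 0), w - 1)
                    total += gray[yy][xx]
                    n += 1
            mask[y][x] = gray[y][x] < (total / n - c)
    return mask

def bounding_box(mask):
    """Bounding box (x1, y1, x2, y2) of the segmented area, or None."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    if not pts:
        return None
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    return min(xs), min(ys), max(xs), max(ys)
```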
  • the tracking and management process of shipping containers in the terminal can not only identify discrepancies in the execution of the transit shipment plan: it can also assess the physical condition of the shipping containers by processing the images from the security yard cameras and/or the container-handler-mounted cameras using machine learning algorithms, such as convolutional neural networks.
  • Provision of the overall damage type and level estimate according to the detected area. The terminal operators can be offered access to the information from deployed web or mobile applications and interface 364, via the edge processing devices 300, or the information can be uploaded to the TOS’ database.
  • mobile shipping container vehicles 44 are used to lift containers from a position in the yard or on a truck and transport them to a new location, which may be a container pile as much as six containers high.
  • Each of these types of mobile vehicles is designed to perform duties similar to those previously described for the RTG/RMG crane operation.
  • Cameras 380 are placed in strategic positions where the container identification details can be best imaged, and the on-board electronics of the edge processing devices 300 perform object identification, recognition and communication services.
  • the container code detection module 362 analyzes the images and determines a shipping container code. As per FIG. 11, container codes detected by the container code detection module can be confirmed by operators when the level of confidence is too low, and feedback from the operators can be sent back to the machine learning algorithms for further training. Also, logic may be used, and with an edge computer interface to the onboard programmable logic controller (PLC), equipment functions may be controlled via a PLC-control module 350. This may include system refusal to clamp onto, lift or move a container which the edge computing station has determined has been incorrectly selected.
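The confidence-gated confirmation loop can be sketched as follows; the class, the threshold value and the shape of the operator callback are assumptions made for illustration, not part of the disclosure:

```python
class FeedbackQueue:
    """Gates low-confidence code reads behind operator confirmation and
    collects the confirmed corrections as labelled samples, so the
    recognition model can later be further trained on them."""
    def __init__(self, confidence_threshold=0.85):
        self.threshold = confidence_threshold
        self.samples = []  # (image_id, predicted_code, confirmed_code)

    def handle_read(self, image_id, predicted_code, confidence, ask_operator):
        """ask_operator is a callback to the operator's interface; it is
        only invoked when confidence falls below the threshold."""
        if confidence >= self.threshold:
            return predicted_code
        confirmed = ask_operator(image_id, predicted_code)
        self.samples.append((image_id, predicted_code, confirmed))
        return confirmed
```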
  • the edge processing device, via the operator’s interface 364, can direct the mobile vehicle operator to specific coordinates for container retrieval or storage.
  • Positional instructions originate from the GPS module 352 and are verified through relative markers such as the previously recorded container identification and last-position inventory data.
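That verification amounts to comparing a fresh GPS fix against the last recorded inventory position for the same container. A sketch using the standard haversine great-circle distance (the tolerance value and function names are illustrative assumptions):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def position_consistent(gps_fix, last_inventory_fix, tolerance_m=15.0):
    """Verify the GPS-derived position against the last recorded
    inventory position; tolerance roughly matches a yard slot width."""
    return haversine_m(*gps_fix, *last_inventory_fix) <= tolerance_m
```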
  • Vehicle traffic patterns can be established and modified in real time with integration of yard security cameras through recognition applications and interface with the terminal operating system.
  • the proposed system 100 may advantageously be integrated with other operating platforms and can serve to exchange data for the purpose of directing terminal container movements and ensuring that the terminal records are kept accurate and complete.
  • the system 100 comprises a central processing device 200 with multiple smaller satellite processing devices 300 located on fixed or moving container handling equipment 42, 44. Multiple cameras 380 on the equipment provide vision data to the satellite processors 300. A variety of camera types are available and are selected based on the application. Communication between processor devices is primarily wireless, using any of Wi-Fi, 4G or 5G cellular. Communication between systems is essential to the good functioning of the entire operation and must therefore be specifically designed to meet the individual needs of each terminal and the data management system.
  • terminals that are not operating high level management systems can use the data management system 100 to operate loading and unloading operations primarily in real time with instructions provided moment to moment by supervisory input. Historical records may be kept but bulk loading and unloading instructions would not be necessary.
  • the proposed system interfaces to both the terminal operating system (TOS) 10 and terminal security management (TSM) 20 system.
  • the TOS can bulk-download the loading and unloading plans to the system 100, based on the operator’s chosen criteria, such as by 4-hour operating periods or by individual delivery manifest.
  • the system 100 can, from that point, coordinate the operation by delivering work instructions to equipment 42, 44 in the field and retrieve field feedback to update records.
  • the edge processing devices 300 receive work instructions from the central processing device 200 and manage the operation of the attached equipment by display (or audio instructions) to the equipment operator and by interlock control with the equipment’s programmable logic controller 46 (PLC).
  • the cameras 380 provide vision of the details of the operator activity and image the identification details on the sides and tops of containers being manipulated by the equipment lifting devices 44.
  • the edge processing devices 300 run algorithms to recognize these details and determine the container identification numbers 62, which in turn are compared with the shipping plan. Should the container being manipulated not match the plan, or should other electronic instructions not be followed correctly by the machine operator, the edge processing interlocking instructions can prevent completion of the activity while notifying the machine operator and the central processing device 200 of the error. Upon completion of the activity, the edge processing device 300 notifies the central processing device 200 and performs the next task. The central processing device 200 in turn updates the activity progress logs.
  • the GPS modules 352 provide the edge processing devices with coordinates on a continuous basis and at appropriate moments the coordinates are captured and logged. Such is the case when the edge device equipped mobile vehicles are placing containers in the terminal yard for storage.
  • the machine PLC controls are monitored by the edge processing device and upon completion of the setting of the container for storage, the GPS coordinates are captured and uploaded to the central processing device 200 along with all related data.
  • a plurality of security cameras 380 are normally installed at terminals, given the value of goods and national border security issues.
  • the video images of these cameras are typically brought to a single vendor management system normally situated in an IT room.
  • the shipping container data management system 100 can be interfaced with the video management system 20, allowing the movement and positioning of containers to be followed through the processing of the video images and identification of container identification details.
  • the data management system 100 can, in possible implementations, be used to provide and assemble the field data required to prepare a terminal site location plan of containers transiting at the terminal site. Residing in the TOS, the location plan can be updated in real time, from the edge processing devices, allowing other activities such as container servicing to be directed to the container location.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Warehouses Or Storage Devices (AREA)

Abstract

An image-based automated system is provided for tracking and managing shipping containers in a container storage yard. The system comprises a plurality of edge processing devices and a central processing device. The edge processing devices receive images from yard security cameras, traffic cameras and container handlers, the images being associated with positional coordinates. The edge processing devices and the central processing device detect, using machine learning algorithms, container codes from the images, associate the detected container codes with the positional coordinates, and determine in real time the positions of the shipping containers imaged by the cameras at the yard. During execution of the transit shipment plan, the positions of the shipping containers are compared with the transit shipment plan, and discrepancies are identified between a planned position of one of the shipping containers and the actual position of said container, as previously determined.
PCT/CA2021/050849 2020-06-22 2021-06-22 Système et procédé basés sur l'image permettant de gérer des conteneurs d'expédition par informatique en périphérie de réseau WO2021258195A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063042151P 2020-06-22 2020-06-22
US63/042,151 2020-06-22

Publications (1)

Publication Number Publication Date
WO2021258195A1 true WO2021258195A1 (fr) 2021-12-30

Family

ID=79282430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2021/050849 WO2021258195A1 (fr) 2020-06-22 2021-06-22 Système et procédé basés sur l'image permettant de gérer des conteneurs d'expédition par informatique en périphérie de réseau

Country Status (1)

Country Link
WO (1) WO2021258195A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361032A (zh) * 2022-08-17 2022-11-18 佛山市朗盛通讯设备有限公司 一种用于5g通信的天线单元
CN115409472A (zh) * 2022-08-19 2022-11-29 广东省泰维思信息科技有限公司 一种智慧办案流程管理方法、系统及电子设备
WO2024011926A1 (fr) * 2022-07-11 2024-01-18 卡奥斯工业智能研究院(青岛)有限公司 Système et procédé de surveillance de sécurité basé sur la 5g, dispositif électronique et support de stockage

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030191555A1 (en) * 2002-04-09 2003-10-09 Paceco Corp. Method and apparatus for quay container crane-based automated optical container code recognition with positional identification
US20090108065A1 (en) * 2007-10-30 2009-04-30 Henry King Methods and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars
CN106203239A (zh) * 2015-05-04 2016-12-07 杭州海康威视数字技术股份有限公司 用于集装箱理货的信息处理方法、装置和系统
IN201621028008A (fr) * 2016-08-17 2017-03-24
CN109492449A (zh) * 2019-01-04 2019-03-19 清华大学 箱体识别系统、识别方法、检查设备及港口设施
CN110348451A (zh) * 2019-07-18 2019-10-18 西南交通大学 铁路集装箱装卸过程中的箱号自动采集及识别方法
CN110969054A (zh) * 2018-09-29 2020-04-07 杭州海康威视数字技术股份有限公司 一种集装箱箱号识别方法及装置
WO2020124247A1 (fr) * 2018-12-21 2020-06-25 Canscan Softwares And Technologies Inc. Système d'inspection automatisée et procédé associé pour évaluer l'état de conteneurs d'expédition
CN212302516U (zh) * 2020-08-27 2021-01-05 上海西井信息科技有限公司 基于空间扫描的集装箱识别系统
KR102206662B1 (ko) * 2020-08-14 2021-01-22 아이티플래닛 주식회사 항만 컨테이너 터미널에서 차량 출입 관리와 객체를 인식하는 비전 카메라 시스템 및 방법

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030191555A1 (en) * 2002-04-09 2003-10-09 Paceco Corp. Method and apparatus for quay container crane-based automated optical container code recognition with positional identification
US20090108065A1 (en) * 2007-10-30 2009-04-30 Henry King Methods and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars
CN106203239A (zh) * 2015-05-04 2016-12-07 杭州海康威视数字技术股份有限公司 用于集装箱理货的信息处理方法、装置和系统
IN201621028008A (fr) * 2016-08-17 2017-03-24
CN110969054A (zh) * 2018-09-29 2020-04-07 杭州海康威视数字技术股份有限公司 一种集装箱箱号识别方法及装置
WO2020124247A1 (fr) * 2018-12-21 2020-06-25 Canscan Softwares And Technologies Inc. Système d'inspection automatisée et procédé associé pour évaluer l'état de conteneurs d'expédition
CN109492449A (zh) * 2019-01-04 2019-03-19 清华大学 箱体识别系统、识别方法、检查设备及港口设施
CN110348451A (zh) * 2019-07-18 2019-10-18 西南交通大学 铁路集装箱装卸过程中的箱号自动采集及识别方法
KR102206662B1 (ko) * 2020-08-14 2021-01-22 아이티플래닛 주식회사 항만 컨테이너 터미널에서 차량 출입 관리와 객체를 인식하는 비전 카메라 시스템 및 방법
CN212302516U (zh) * 2020-08-27 2021-01-05 上海西井信息科技有限公司 基于空间扫描的集装箱识别系统

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Artificial intelligence to improve terminal efficiencies", CAMCO TIMES, 1 June 2018 (2018-06-01), pages 3 - 23, XP055892974, Retrieved from the Internet <URL:https://www.camco.be/wp-content/uploads/2018/06/Times6_LowRes.pdf> [retrieved on 20220217] *
ANONYMOUS: "OCR in Ports and Terminals", PEMA INFORMATION PAPER, 1 January 2013 (2013-01-01), pages 1 - 22, XP055892971, Retrieved from the Internet <URL:https://www.pema.org/wp-content/uploads/downloads/2013/01/PEMA-IP4-OCR-in-Ports-and-Terminals.pdf> [retrieved on 20220217] *
ZHANG RAN; BAHRAMI ZHILA; WANG TENG; LIU ZHENG: "An Adaptive Deep Learning Framework for Shipping Container Code Localization and Recognition", IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, IEEE, USA, vol. 70, 13 August 2020 (2020-08-13), USA, pages 1 - 13, XP011820301, ISSN: 0018-9456, DOI: 10.1109/TIM.2020.3016108 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024011926A1 (fr) * 2022-07-11 2024-01-18 卡奥斯工业智能研究院(青岛)有限公司 5G-based security monitoring system and method, electronic device, and storage medium
CN115361032A (zh) * 2022-08-17 2022-11-18 佛山市朗盛通讯设备有限公司 Antenna unit for 5G communication
CN115409472A (zh) * 2022-08-19 2022-11-29 广东省泰维思信息科技有限公司 Intelligent case-handling workflow management method, system and electronic device
CN115409472B (zh) * 2022-08-19 2023-08-01 广东省泰维思信息科技有限公司 Intelligent case-handling workflow management method, system and electronic device

Similar Documents

Publication Publication Date Title
WO2021258195A1 (fr) Image-based system and method for managing shipping containers through edge computing
US11507913B2 (en) Smart terminal facility and method suitable for the handling of cargo containers
US11427229B2 (en) Demand-based distribution of items using intermodal carriers and unmanned aerial vehicles
US10043154B2 (en) Processing container images and identifiers using optical character recognition and geolocation
US10625859B2 (en) Mobile fulfillment centers with intermodal carriers and unmanned aerial vehicles
CN109683577B (zh) Warehouse control system and computer device
US9718564B1 (en) Ground-based mobile maintenance facilities for unmanned aerial vehicles
US7508956B2 (en) Systems and methods for monitoring and tracking movement and location of shipping containers and vehicles using a vision based system
WO2018033933A1 (fr) Automatic container position detection system
CN110262355A (zh) Smart mine management system and method based on a smart mine control platform
CN107767091A (zh) Control method for a logistics transportation information management system
EP3596677A1 (fr) Mobile fulfillment centers with intermodal carriers and unmanned aerial vehicles
CN109573839B (zh) Method and device for monitoring warehouse goods
CN109685436A (zh) Cross-border transportation system for autonomous vehicles and related equipment
WO2022257397A1 (fr) Intelligent storage yard management system and method
CN109685433A (zh) Cross-border transportation system for autonomous vehicles and related equipment
CN114358697A (zh) Production business data chain system for automated container terminals
US20140255130A1 (en) Dock-to-rail and rail-to-dock container handling system and method
US20240078499A1 (en) System for monitoring transportation, logistics, and distribution facilities
Azzahra et al. Refrigerated container handling process design improvement in container terminal through the implementation of internet of things (IOT) using business process reengineering approach
Valentina New technologies used for automation of container handling at terminals
CN109685434A (zh) Cross-border transportation system for autonomous vehicles and related equipment
KR102061788B1 (ko) Method for providing an integrated stowage plan using a cloud service
US20230005121A1 (en) Method and system for detecting damages in freight container
CN114283512B (zh) Intelligent gate management method based on dual recognition engines

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21830267

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23.03.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21830267

Country of ref document: EP

Kind code of ref document: A1