CN116152243B - Method for detecting container loading and unloading operation state based on image analysis - Google Patents


Info

Publication number
CN116152243B
CN116152243B
Authority
CN
China
Prior art keywords
image
lifting appliance
container
target
loading
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310416761.4A
Other languages
Chinese (zh)
Other versions
CN116152243A (en)
Inventor
翁年年
张向辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Pingfang Science And Technology Co ltd
Original Assignee
Shenzhen Pingfang Science And Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Pingfang Science And Technology Co ltd
Priority to CN202310416761.4A
Publication of CN116152243A
Application granted
Publication of CN116152243B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
        • G06T 1/00 General purpose image data processing
            • G06T 1/0014 Image feed-back for automatic industrial control, e.g. robot with camera
        • G06T 7/00 Image analysis
            • G06T 7/0002 Inspection of images, e.g. flaw detection
                • G06T 7/0004 Industrial image inspection
            • G06T 7/20 Analysis of motion
                • G06T 7/207 Analysis of motion for motion estimation over a hierarchy of resolutions
                • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
        • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 Image acquisition modality
                • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
        • G06V 10/00 Arrangements for image or video recognition or understanding
            • G06V 10/40 Extraction of image or video features
                • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
                • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
            • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
                • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
                    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Robotics (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

The invention provides a method for detecting the loading and unloading operation state of a container based on image analysis, which comprises the following steps: acquiring an initial state image of the lifting appliance based on a preset acquisition device, locking and tracking the lifting appliance based on the initial state image, and determining a real-time tracking image corresponding to the lifting appliance based on a locking and tracking result; preprocessing the real-time tracking image, determining the target size of the lifting appliance, and locking a target container to be loaded and unloaded below the lifting appliance in the real-time tracking image based on the target size; and acquiring a loading and unloading image of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on the locking result, analyzing the loading and unloading image to determine the moving linear relation between the lifting appliance and the target container to be loaded and unloaded, and determining the loading and unloading state of the lifting appliance for loading and unloading the target container based on the moving linear relation. The method and the device ensure the rapid and accurate determination of the target container to be loaded and unloaded, thereby realizing the accurate and effective detection of the loading and unloading state of the target container to be loaded and unloaded.

Description

Method for detecting container loading and unloading operation state based on image analysis
Technical Field
The invention relates to the technical field of image generation and image processing analysis, in particular to a method for detecting a container loading and unloading operation state based on image analysis.
Background
The container is an essential carrier for long-distance and maritime transportation; goods can be shipped packaged or unpackaged, mechanical equipment is conveniently used for loading and unloading, and effectively detecting the loading and unloading state of a container can improve its handling efficiency;
at present, when the loading and unloading state of a container is detected, a container detection algorithm is generally adopted to analyze the loading and unloading state, which improves handling efficiency but leaves several problems:
1. when a plurality of containers exist in the background, the detection algorithm cannot filter out the background containers, and the detection locks onto a background container, so that the container actually being handled is missed and an erroneous detection result is output;
2. when the lifting appliance lifts a container, the existing container detection algorithm can lose the detection target because the container swings in the air and presents different angles, causing missed detections;
3. because a container detection algorithm is adopted, a container can be detected against a background with complex colors only when a fixed container surface is exposed, and the container cannot be detected at the moment during hoisting when the lifting action fastens onto it;
accordingly, in order to overcome the above-mentioned drawbacks, the present invention provides a method for detecting a container handling operation state based on image analysis.
Disclosure of Invention
The invention provides a method for detecting the loading and unloading operation state of a container based on image analysis, which is used for analyzing the acquired loading and unloading image of a target container to be loaded and unloaded by a lifting appliance, so that the target container to be loaded and unloaded is ensured to be rapidly and accurately determined, the loading and unloading state of the target container to be loaded and unloaded is accurately and effectively detected, and the reliability and the high efficiency of the loading and unloading state analysis of the target container to be loaded and unloaded are ensured.
The invention provides a method for detecting the loading and unloading operation state of a container based on image analysis, which comprises the following steps:
step 1: acquiring an initial state image of the lifting appliance based on a preset acquisition device, locking and tracking the lifting appliance based on the initial state image, and determining a real-time tracking image corresponding to the lifting appliance based on a locking and tracking result;
step 2: preprocessing the real-time tracking image, determining the target size of the lifting appliance, and locking a target container to be loaded and unloaded below the lifting appliance in the real-time tracking image based on the target size;
step 3: and acquiring a loading and unloading image of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on the locking result, analyzing the loading and unloading image to determine the moving linear relation between the lifting appliance and the target container to be loaded and unloaded, and determining the loading and unloading state of the lifting appliance for loading and unloading the target container based on the moving linear relation.
Preferably, in step 1, an initial state image of a lifting appliance is collected based on a preset collection device, which includes:
acquiring historical loading and unloading data of the lifting appliance on the container based on a preset server, analyzing the historical loading and unloading data, and determining the lifting height of the lifting appliance when the lifting appliance loads and unloads the container;
acquiring target requirements for locking and tracking the lifting appliance, and determining the optimal height for image acquisition of the lifting appliance based on the target requirements and the lifting height;
determining an upward shooting angle of the preset acquisition device for image acquisition of the lifting appliance based on the relative position of the preset acquisition device and the optimal height, adapting the preset acquisition device based on the upward shooting angle, and associating the adapted preset acquisition device with a preset height sensor;
the current lifting height of the lifting appliance is monitored in real time based on a preset height sensor, when the lifting height is consistent with the optimal height, an image acquisition instruction is issued to a preset acquisition device, and the lifting appliance is subjected to image acquisition based on the preset acquisition device, so that an initial state image of the lifting appliance is obtained.
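The acquisition trigger described above can be sketched in a few lines. This is an illustrative Python sketch, not the patented implementation: `optimal_height`, `should_capture`, the averaging rule, the 0.5 m margin, and the metre units are all assumptions for demonstration.

```python
# Illustrative sketch: derive an acquisition height from historical lifting
# heights, then fire the image-acquisition instruction when a height-sensor
# reading comes within a tolerance of that height. Names are hypothetical.

def optimal_height(historical_lift_heights, margin=0.5):
    """Pick a capture height just below the typical peak lifting height."""
    peak = sum(historical_lift_heights) / len(historical_lift_heights)
    return peak - margin  # leave room so the spreader is fully in frame

def should_capture(sensor_height, target_height, tolerance=0.05):
    """True when the monitored height coincides with the optimal height."""
    return abs(sensor_height - target_height) <= tolerance

h = optimal_height([12.0, 11.8, 12.2])  # metres, assumed units -> 11.5
print(should_capture(11.52, h))         # True: within 5 cm of the target
```

A real system would debounce the sensor and capture once per descent; the sketch only shows the comparison that gates the capture instruction.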
Preferably, in step 1, locking tracking is performed on a lifting appliance based on an initial state image, including:
acquiring the obtained initial state image and the shape characteristics of the lifting appliance, performing pixel matching on the initial state image based on the shape characteristics, and determining an image area range of the lifting appliance in the initial state image;
acquiring a target color of the lifting appliance, extracting a color histogram of an image area range, determining a target proportion of the target color in the image area range based on the color histogram, and correcting the image area range based on the color histogram when the target proportion is greater than a preset proportion threshold;
and carrying out detection frame marking on the corrected image area range based on a preset detection frame, locking the lifting appliance based on a detection frame marking result, configuring the movement direction and the movement speed of the preset detection frame based on the movement direction and the movement speed of the lifting appliance, predicting the next position of the lifting appliance through the preset detection frame based on a configuration result, updating the real-time position of the preset detection frame based on a prediction result, and completing locking tracking of the lifting appliance.
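Two pieces of this locking-tracking step lend themselves to a compact sketch: the target-colour proportion test on the region's colour histogram, and a constant-velocity prediction of the detection frame's next position. Function names, the toy histogram bins, and the box layout `(x, y, w, h)` are illustrative assumptions, not from the patent.

```python
# Hedged sketch of step 1's locking and tracking logic.

def target_color_proportion(histogram, target_bins):
    """Fraction of region pixels whose colour falls in the target bins."""
    total = sum(histogram.values())
    hit = sum(histogram.get(b, 0) for b in target_bins)
    return hit / total if total else 0.0

def predict_next_box(box, velocity):
    """Shift the detection frame (x, y, w, h) by a per-frame (dx, dy)."""
    x, y, w, h = box
    dx, dy = velocity
    return (x + dx, y + dy, w, h)

hist = {"yellow": 700, "grey": 200, "other": 100}   # toy colour histogram
print(target_color_proportion(hist, {"yellow"}))     # 0.7 -> passes a 0.5 threshold
print(predict_next_box((100, 50, 80, 40), (5, -3)))  # (105, 47, 80, 40)
```

In practice the velocity would be estimated from the spreader's positions in consecutive frames, as the step's configuration of movement direction and speed suggests.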
Preferably, in step 1, a real-time tracking image corresponding to a lifting appliance is determined based on a locking tracking result, which includes:
acquiring a locking tracking result of the lifting appliance, determining an initial position of the lifting appliance in an initial state image based on the locking tracking result, and determining a target distance between the initial position and an initial state image boundary;
monitoring the change condition of the target distance in real time based on locking tracking of the lifting appliance, and carrying out image acquisition on the lifting appliance again based on a preset acquisition device according to a locking tracking result when the monitored target distance is smaller than a preset distance threshold value to obtain a tracking image to be sequenced;
and sequencing the acquired tracking images to be sequenced based on the time development sequence, and obtaining real-time tracking images corresponding to the lifting appliance based on the sequencing result.
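The re-acquisition trigger and the time ordering above reduce to a boundary-distance check and a sort by timestamp. This is a minimal sketch; the point-to-edge distance metric and the frame record layout are assumptions.

```python
# Minimal sketch: re-trigger acquisition when the locked spreader drifts too
# close to the image boundary, and order captured frames by timestamp.

def distance_to_boundary(position, image_size):
    """Smallest distance from a point (x, y) to any edge of a w-by-h image."""
    x, y = position
    w, h = image_size
    return min(x, y, w - x, h - y)

def order_frames(frames):
    """Sort (timestamp, frame_id) pairs into a real-time tracking sequence."""
    return sorted(frames, key=lambda f: f[0])

print(distance_to_boundary((30, 400), (1920, 1080)))  # 30 px -> re-capture
print(order_frames([(2.0, "f2"), (0.5, "f0"), (1.2, "f1")]))
```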
Preferably, the method for detecting a container handling operation state based on image analysis, after obtaining the real-time tracking images corresponding to the lifting appliance based on the sequencing result, further includes:
acquiring an obtained real-time tracking image, and determining a corresponding target static image when the lifting appliance reaches the loading and unloading position based on the real-time tracking image;
extracting an object feature set recorded in a target static image, and simultaneously, acquiring the object feature of the container and matching the object feature of the container with the object feature set;
when the object characteristics in the object characteristic set are matched with the object characteristics of the container, judging that the container exists below the lifting appliance;
otherwise, judging that the container does not exist below the lifting appliance, sending a prompt to the management terminal, and correcting the hanging direction of the lifting appliance until the object characteristics in the object characteristic set are matched with the object characteristics of the container.
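The presence check above compares an extracted feature set against known container features. The sketch below abstracts this as a set-overlap (Jaccard) test; the overlap measure, the 0.6 threshold, and the string feature labels are illustrative choices, not the patent's matching rule.

```python
# Hedged sketch: decide whether a container sits below the spreader by
# matching extracted object features against known container features.

def container_present(extracted_features, container_features, threshold=0.6):
    """True when the feature-set overlap is large enough to call a match."""
    overlap = len(extracted_features & container_features)
    union = len(extracted_features | container_features)
    return union > 0 and overlap / union >= threshold

scene = {"rect_contour", "corner_casting", "rib_texture", "shadow"}
known = {"rect_contour", "corner_casting", "rib_texture"}
print(container_present(scene, known))  # True -> container below spreader
```

When the check fails, the method's corrective loop (prompting the terminal and adjusting the hanging direction) would repeat until this predicate holds.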
Preferably, in step 2, preprocessing is performed on a real-time tracking image to determine a target size of a lifting appliance, including:
the method comprises the steps of obtaining a real-time tracking image, identifying the real-time tracking image, determining a target object contained in the real-time tracking image, and carrying out image division on the real-time tracking image based on the target object to obtain M image blocks;
respectively extracting image parameters of M image blocks, evaluating the image parameters of the M image blocks based on a preset effect evaluation model, determining a definition value of each image block, and judging the image block with the definition value smaller than a preset definition threshold as an image block to be optimized;
training a preset convolution network based on reference image parameters of a preset sample image to obtain an image optimization model, analyzing image parameters of image blocks to be optimized based on the image optimization model, determining a target difference value of the image parameters of each image block and the reference image parameters, and determining image optimization parameters of the image blocks to be optimized based on the target difference value;
performing image optimization on the image blocks to be optimized based on the image optimization parameters, and splicing the optimized image blocks with the definition meeting a preset definition threshold to obtain a preprocessed image;
selecting a target reference point from the preprocessed image, determining the target distance between the target reference point and a preset acquisition device, and determining depth information of the preprocessed image based on the target distance and imaging configuration of the preset acquisition device;
performing point cloud scanning on the preprocessing image based on the depth information of the preprocessing image to obtain point cloud data of a target object in the preprocessing image, determining edge texture characteristics of the target object based on the preprocessing image, correcting the point cloud data of the target object based on the edge texture characteristics, and obtaining contour characteristics of the target object based on a correction result;
and determining the target size of the lifting appliance in the preprocessing image based on the outline characteristics of the target object.
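The block division and definition scoring at the start of this preprocessing step can be sketched as follows. The patent's "preset effect evaluation model" is abstracted here as plain intensity variance per block, a crude sharpness proxy; the grid split, the variance measure, and the threshold are all assumptions for illustration.

```python
# Illustrative sketch of step 2's preprocessing: split a greyscale image
# (a 2-D list of intensities) into M = rows*cols blocks, score each block's
# definition, and flag low-definition blocks for optimization.

def blocks(image, rows, cols):
    """Split a 2-D list image into rows*cols rectangular blocks."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    return [[row[c*bw:(c+1)*bw] for row in image[r*bh:(r+1)*bh]]
            for r in range(rows) for c in range(cols)]

def sharpness(block):
    """Intensity variance as a stand-in for the definition value."""
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def blocks_to_optimize(image, rows, cols, threshold):
    """Indices of blocks whose definition value falls below the threshold."""
    return [i for i, b in enumerate(blocks(image, rows, cols))
            if sharpness(b) < threshold]

img = [[0, 255, 0, 0], [255, 0, 0, 0], [10, 10, 90, 90], [10, 10, 90, 90]]
print(blocks_to_optimize(img, 2, 2, 100.0))  # [1, 2, 3]: flat blocks flagged
```

The flagged blocks would then go through the trained optimization model before being spliced back into the preprocessed image.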
Preferably, in step 2, locking a target container to be loaded and unloaded below a lifting appliance in a real-time tracking image based on a target size, the method comprises the following steps:
acquiring the target size of the lifting appliance, taking the target size of the lifting appliance as a butt joint condition, acquiring a position corresponding relation between a container loading and unloading surface and the lifting appliance, and determining the reference size information of the container to be loaded and unloaded, which can be loaded and unloaded by the current lifting appliance, in a real-time tracking image based on the position corresponding relation and the butt joint condition;
Determining the size information of each container below the lifting appliance based on the real-time tracking image, matching the reference size information with the size information of each container below the lifting appliance, and determining a target container to be loaded and unloaded based on a matching result;
determining a butt joint position point of the lifting appliance and the target container to be loaded and unloaded, and monitoring the butt joint state of the butt joint position point of the lifting appliance and the target container to be loaded and unloaded based on the real-time tracking image;
when the lifting appliance is correspondingly overlapped with the butt joint position point of the target container to be loaded and unloaded, judging that the lifting appliance meets the loading and unloading condition of the target container to be loaded and unloaded, and sending a first reminding notice to the management terminal;
otherwise, judging that the lifting appliance does not meet the loading and unloading conditions of the target container to be loaded and unloaded, and sending a second reminding notice to the management terminal.
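The size-matching that locks the target container can be sketched as picking, among the containers visible below the spreader, the one whose apparent image size best matches the reference size derived from the spreader. The relative-error metric, the 10% tolerance, and the container names are illustrative assumptions.

```python
# Hedged sketch of step 2's target locking via size matching.

def lock_target(reference_size, containers, tolerance=0.1):
    """containers: dict name -> (width, height) in image pixels.
    Returns the best size match within tolerance, else None."""
    rw, rh = reference_size
    best, best_err = None, tolerance
    for name, (w, h) in containers.items():
        err = max(abs(w - rw) / rw, abs(h - rh) / rh)
        if err <= best_err:
            best, best_err = name, err
    return best

below = {"bg_box": (150, 60), "target_box": (302, 118), "far_box": (80, 30)}
print(lock_target((300, 120), below))  # 'target_box'
```

Because background containers sit further from the camera, they appear smaller in the image, which is what lets a size criterion filter them out, addressing the first drawback listed in the Background section.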
Preferably, in step 3, a loading and unloading image of a container to be loaded and unloaded by a lifting appliance is obtained based on a locking result, a moving linear relationship between the lifting appliance and the container to be loaded and unloaded is determined by analyzing the loading and unloading image, and the loading and unloading state of the container to be loaded and unloaded by the lifting appliance is determined based on the moving linear relationship, and the method comprises the following steps:
acquiring a locking result of the target container to be loaded and unloaded, loading and unloading the target container to be loaded and unloaded through the lifting appliance based on the locking result, and acquiring an image of the loading and unloading process of the target container to be loaded and unloaded through the lifting appliance based on the preset acquisition device to obtain a loading and unloading image set;
carrying out time sequence arrangement on each loading and unloading image in the loading and unloading image set based on the time acquisition sequence, and determining the relative position relationship between the target container to be loaded and unloaded and the lifting appliance at different moments based on the time sequence arrangement result;
determining a movement linear relation between the lifting appliance and the target container to be loaded and unloaded based on the relative position relation;
when the linear relation of movement is that the relative position relation between the target container to be loaded and unloaded and the lifting appliance is unchanged at different moments, and the proportion of the target container to be loaded and unloaded in the loading and unloading image is increased along with the time development, judging that the lifting appliance successfully lifts the target container to be loaded and unloaded, otherwise, judging that the lifting appliance does not successfully lift the target container to be loaded and unloaded;
and determining the loading and unloading state of the lifting appliance to the target container to be loaded and unloaded based on the judging result.
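The judgement above, a constant spreader-to-container offset combined with a growing area proportion over time, can be sketched directly. The per-frame record layout `(offset_x, offset_y, area_proportion)` and the offset tolerance are assumptions for illustration.

```python
# Sketch of step 3's movement-linear-relation judgement: the lift is judged
# successful when the spreader/container offset stays constant across the
# time-ordered handling images while the container's proportion of the
# frame grows.

def lift_successful(frames, offset_tol=2.0):
    """frames: time-ordered list of (offset_x, offset_y, area_proportion)."""
    offsets = [(f[0], f[1]) for f in frames]
    ox0, oy0 = offsets[0]
    stable = all(abs(ox - ox0) <= offset_tol and abs(oy - oy0) <= offset_tol
                 for ox, oy in offsets)
    growing = all(a < b for (_, _, a), (_, _, b) in zip(frames, frames[1:]))
    return stable and growing

good = [(0.0, 5.0, 0.10), (0.5, 5.2, 0.14), (0.2, 4.9, 0.19)]
bad = [(0.0, 5.0, 0.10), (8.0, 5.0, 0.10), (16.0, 5.0, 0.10)]
print(lift_successful(good), lift_successful(bad))  # True False
```

The growing proportion reflects the container rising toward the upward-facing camera with the spreader; a drifting offset means the spreader moved without the container.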
Preferably, in the method for detecting a container handling operation state based on image analysis, determining the handling state of the target container to be loaded and unloaded by the lifting appliance based on the determination result includes:
acquiring the loading and unloading state of the target container to be loaded and unloaded by the lifting appliance, sending an alarm prompt to the management terminal when the lifting appliance does not successfully lift the target container to be loaded and unloaded, and butting the lifting appliance with the target container to be loaded and unloaded again until the target container to be loaded and unloaded is successfully lifted;
meanwhile, constructing a loading and unloading record table, and determining a target identifier and target loading and unloading time information of the target container to be loaded and unloaded when the target container to be loaded and unloaded is successfully lifted;
and recording the target identification and the target loading and unloading time information of each target container to be loaded and unloaded in a loading and unloading record table sequentially based on the loading and unloading sequence of the target containers to be loaded and unloaded until the loading and unloading operation of all the target containers to be loaded and unloaded is completed.
Preferably, in step 3, the method for detecting the container loading and unloading operation state based on image analysis, after determining the loading and unloading state of the target container to be loaded and unloaded by the lifting appliance based on the movement linear relation, further includes:
acquiring a loading and unloading state of a lifting appliance to load and unload a target container, and determining the time length for completing the detection of the loading and unloading operation state of the container when the loading and unloading state is that the lifting appliance successfully lifts the target container to be loaded and unloaded;
calculating total duration for the lifting appliance to finish all container loading and unloading operations based on the length of time for finishing the container loading and unloading operation state detection, and calculating the loading and unloading efficiency of the lifting appliance to the container based on the total duration;
comparing the calculated loading and unloading efficiency with the preset loading and unloading efficiency;
if the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency, judging that the loading and unloading efficiency of the lifting appliance to the container is qualified, and continuing to detect the loading and unloading operation of the containers based on the current container loading and unloading operation state detection strategy until the loading and unloading operation of all containers is completed;
otherwise, judging that the loading and unloading efficiency of the lifting appliance to the container is unqualified, and increasing the speed at which the current container loading and unloading operation state detection strategy detects the container loading and unloading operation state until the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency.
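The efficiency comparison above can be sketched as follows. Containers-per-hour is an assumed unit; the patent does not specify how the efficiency is expressed, so the formula and the preset value are illustrative.

```python
# Hedged sketch of the efficiency check in step 3: total handling duration
# yields an efficiency figure that is compared against a preset efficiency
# to decide whether the current detection strategy needs speeding up.

def handling_efficiency(per_container_seconds):
    """Containers handled per hour, from per-container detection durations."""
    total = sum(per_container_seconds)
    return len(per_container_seconds) / (total / 3600.0)

def efficiency_qualified(per_container_seconds, preset_per_hour):
    """True when the calculated efficiency meets the preset efficiency."""
    return handling_efficiency(per_container_seconds) >= preset_per_hour

times = [90.0, 110.0, 100.0]                 # seconds per container, assumed
print(round(handling_efficiency(times), 1))  # 36.0 containers/hour
print(efficiency_qualified(times, 30.0))     # True -> keep current strategy
```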
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of a method for detecting a container handling operation status based on image analysis in an embodiment of the invention;
FIG. 2 is a flowchart of step 1 in a method for detecting a container handling operation status based on image analysis according to an embodiment of the present invention;
fig. 3 is a flowchart of step 3 in a method for detecting a container handling operation status based on image analysis according to an embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
Example 1:
the embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, as shown in fig. 1, including:
step 1: acquiring an initial state image of the lifting appliance based on a preset acquisition device, locking and tracking the lifting appliance based on the initial state image, and determining a real-time tracking image corresponding to the lifting appliance based on a locking and tracking result;
step 2: preprocessing the real-time tracking image, determining the target size of the lifting appliance, and locking a target container to be loaded and unloaded below the lifting appliance in the real-time tracking image based on the target size;
step 3: and acquiring a loading and unloading image of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on the locking result, analyzing the loading and unloading image to determine the moving linear relation between the lifting appliance and the target container to be loaded and unloaded, and determining the loading and unloading state of the lifting appliance for loading and unloading the target container based on the moving linear relation.
In this embodiment, the preset collecting device is set in advance, and is used for collecting images of lowering of the lifting appliance when the container is loaded and unloaded, images when the container is loaded and unloaded, and the like.
In this embodiment, the initial state image refers to image acquisition of the lifting appliance in the descending process by a preset acquisition device, so that the lifting appliance in the descending process is conveniently locked.
In this embodiment, locking and tracking the lifting appliance based on the initial state image means that the current position of the lifting appliance is locked according to the initial state image, so that the position of the lifting appliance at the next moment is conveniently determined according to the locking result.
In this embodiment, the real-time tracking image refers to an image acquired at the real-time position of the spreader after the spreader's current position changes according to the locking tracking result, so that the container is conveniently locked according to the spreader when the spreader approaches the container.
In this embodiment, preprocessing refers to performing sharpness adjustment on the obtained real-time tracking image, so as to facilitate determination of the target size of the spreader in the real-time tracking image.
In this embodiment, the target size refers to the image size that the spreader presents in the real-time tracking image, thereby facilitating the locking of containers to be handled according to the spreader's image size.
In this embodiment, the target container to be loaded and unloaded refers to a container in which a spreader in the current position can load and unload the container below, and is one of a plurality of containers below.
In this embodiment, the loading and unloading image refers to an image obtained after image acquisition of a process of loading and unloading a target container to be loaded and unloaded by a lifting appliance, and is used for analyzing whether the lifting appliance successfully lifts the target container to be loaded and unloaded.
In this embodiment, acquiring the loading and unloading image of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on the locking result refers to image acquisition of the loading and unloading process of the lifting appliance for loading and unloading the target container to be loaded and unloaded by the preset acquisition device.
In this embodiment, the movement linear relationship refers to whether the movement direction of the target container to be loaded and unloaded is the same as that of the spreader, so that the loading and unloading state of the target container to be loaded and unloaded can be accurately and effectively determined.
In this embodiment, determining the loading and unloading state of the lifting appliance to the target container to be loaded and unloaded based on the movement linear relationship means that when the movement linear relationship represents that the lifting appliance and the target container to be loaded and unloaded move upwards synchronously, the target container to be loaded and unloaded has been successfully lifted, wherein the loading and unloading state includes the target container to be loaded and unloaded being successfully lifted and the target container to be loaded and unloaded not being successfully lifted.
The beneficial effects of the technical scheme are as follows: by analyzing the acquired lifting appliance loading and unloading image of the target container to be loaded and unloaded, the target container to be loaded and unloaded is ensured to be rapidly and accurately determined, so that the loading and unloading state of the target container to be loaded and unloaded is accurately and effectively detected, and the reliability and the high efficiency of the loading and unloading state analysis of the target container to be loaded and unloaded are ensured.
Example 2:
on the basis of embodiment 1, the present embodiment provides a method for detecting a container handling operation state based on image analysis, as shown in fig. 2, in step 1, an initial state image of a lifting appliance is acquired based on a preset acquisition device, including:
step 101: acquiring historical loading and unloading data of the lifting appliance on the container based on a preset server, analyzing the historical loading and unloading data, and determining the lifting height of the lifting appliance when the lifting appliance loads and unloads the container;
Step 102: acquiring target requirements for locking and tracking the lifting appliance, and determining the optimal height for image acquisition of the lifting appliance based on the target requirements and the lifting height;
step 103: determining an upward shooting angle at which the preset acquisition device performs image acquisition of the lifting appliance based on the relative position of the preset acquisition device and the optimal height, adapting the preset acquisition device based on the upward shooting angle, and associating the adapted preset acquisition device with a preset height sensor;
step 104: the current lifting height of the lifting appliance is monitored in real time based on a preset height sensor, when the lifting height is consistent with the optimal height, an image acquisition instruction is issued to a preset acquisition device, and the lifting appliance is subjected to image acquisition based on the preset acquisition device, so that an initial state image of the lifting appliance is obtained.
In this embodiment, the preset server is set in advance, and is used for storing historical loading and unloading data of the lifting appliance on the container.
In this embodiment, the historical handling data may be operational data generated when the spreader is handling different containers.
In this embodiment, the lifting height refers to the range between the highest point the spreader can reach and the lowest point to which it descends when a container is handled.
In this embodiment, the target requirements are for characterizing the tracking timeliness of the spreader, positioning accuracy, etc.
In this embodiment, the optimal height is used to characterize the position at which the spreader is locked and tracked; for example, an image of the spreader may be captured when the spreader is lowered to one third of the lifting height, and the spreader is then locked according to the captured image.
In this embodiment, adapting the preset acquisition device based on the upward shooting angle means adjusting the shooting angle of the preset acquisition device so that the lifting appliance can be reliably captured in the final shot.
In this embodiment, the preset height sensor is set in advance, and is used for detecting the current lowering height of the lifting appliance in real time.
In this embodiment, the purpose of associating the adapted preset collecting device with the preset height sensor is to timely send an image collecting instruction to the preset collecting device when the preset height sensor detects that the lifting appliance descends to the optimal height, so that an initial state image is collected by the preset collecting device.
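The sensor-camera association of steps 103-104 can be sketched as an illustrative pure-Python mock; the class and function names, the height tolerance, and the one-shot trigger behaviour are assumptions for illustration, not details from the patent:

```python
def should_capture(current_height, optimal_height, tolerance=0.05):
    """Trigger image acquisition when the monitored lowering height of the
    spreader matches the optimal acquisition height within a tolerance
    (heights in metres; the tolerance value is assumed)."""
    return abs(current_height - optimal_height) <= tolerance


class HeightSensorLink:
    """Minimal association between a height sensor and a camera:
    poll() issues a capture instruction exactly once when the monitored
    height reaches the optimal height."""

    def __init__(self, optimal_height, capture_fn, tolerance=0.05):
        self.optimal_height = optimal_height
        self.capture_fn = capture_fn
        self.tolerance = tolerance
        self.captured = False

    def poll(self, current_height):
        if not self.captured and should_capture(
            current_height, self.optimal_height, self.tolerance
        ):
            self.captured = True
            return self.capture_fn()  # returns the initial state image
        return None
```

In a real system `capture_fn` would call the camera driver; here it is any callable returning a frame.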
The beneficial effects of the technical scheme are as follows: the historical loading and unloading data of the container are analyzed by the lifting appliance, so that the optimal height for locking and tracking the lifting appliance is accurately and effectively determined, and the image acquisition is carried out on the lifting appliance at the optimal height by the preset acquisition device, so that the accurate and effective acquisition of the initial state image of the lifting appliance is realized, convenience and guarantee are provided for realizing the tracking and locking of the lifting appliance, and the detection accuracy of the loading and unloading operation state of the container is also facilitated to be improved.
Example 3:
on the basis of embodiment 1, the embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, in step 1, locking tracking is performed on a lifting appliance based on an initial state image, and the method comprises the following steps:
acquiring an obtained initial state image and shape characteristics of the lifting appliance, performing pixel matching on the initial state image based on the shape characteristics, and determining an image area range of the lifting appliance in the initial state image;
acquiring a target color of the lifting appliance, extracting a color histogram of an image area range, determining a target proportion of the target color in the image area range based on the color histogram, and correcting the image area range based on the color histogram when the target proportion is greater than a preset proportion threshold;
and carrying out detection frame marking on the corrected image area range based on a preset detection frame, locking the lifting appliance based on a detection frame marking result, configuring the movement direction and the movement speed of the preset detection frame based on the movement direction and the movement speed of the lifting appliance, predicting the next position of the lifting appliance through the preset detection frame based on a configuration result, updating the real-time position of the preset detection frame based on a prediction result, and completing locking tracking of the lifting appliance.
In this embodiment, the shape features refer to the appearance of the spreader as well as the shape of the appearance.
In this embodiment, pixel matching refers to similarity analysis of the initial state image through shape features of the lifting appliance, so as to determine the range of the lifting appliance in the initial state image.
In this embodiment, the target color refers to the color of the spreader itself, and may be, for example, yellow or red, etc.
In this embodiment, the color histogram records the distribution of colors within the image area range, from which the target proportion of the target color corresponding to the spreader in the current area can be determined, so as to facilitate correction of the image area range where the spreader is located in the initial state image.
In this embodiment, the target proportion represents the share of the image area range occupied by the color corresponding to the spreader.
In this embodiment, the preset ratio threshold is set in advance, and is used to measure whether the ratio of the target color in the image area meets the criterion of determining the current image area as the position of the lifting appliance.
In this embodiment, the preset detection frame is set in advance, and is used for performing frame selection marking on the corrected image area range, that is, performing frame selection marking on the lifting appliance.
In this embodiment, the detection frame mark refers to frame selection of the image area range where the lifting appliance is located in the initial state image through a preset detection frame.
In this embodiment, configuring the movement direction and the movement speed of the preset detection frame based on the movement direction and the movement speed of the lifting appliance means synchronously adjusting the position of the preset detection frame according to the movement direction and the movement speed of the lifting appliance, so as to lock the lifting appliance.
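The color-proportion confirmation step can be illustrated with a small pure-Python sketch; representing the region as a flat list of RGB tuples, the per-channel tolerance, and the default proportion threshold are my assumptions (the patent only specifies "a preset proportion threshold"):

```python
def target_color_proportion(region_pixels, target_color, tol=30):
    """Fraction of pixels in the candidate region whose RGB value lies
    within `tol` per channel of the spreader's target color."""
    if not region_pixels:
        return 0.0
    hits = sum(
        1 for p in region_pixels
        if all(abs(p[c] - target_color[c]) <= tol for c in range(3))
    )
    return hits / len(region_pixels)


def confirm_spreader_region(region_pixels, target_color, threshold=0.4):
    """Accept the pixel-matched region as the spreader only when the
    target-color proportion exceeds the preset proportion threshold."""
    return target_color_proportion(region_pixels, target_color) > threshold
```

A production version would build the histogram with an image library (e.g. OpenCV's `calcHist`) instead of iterating pixels in Python.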
The beneficial effects of the technical scheme are as follows: the acquired initial state image is analyzed, so that the image area of the lifting appliance in the initial state image is accurately and effectively determined, and the lifting appliance in the initial state image is marked through the preset detection frame, thereby realizing locking tracking of the lifting appliance, providing convenience and guarantee for timely and accurately determining the to-be-loaded and unloaded target container below the lifting appliance, and being convenient for improving the reliability and the high efficiency of the loading and unloading state analysis of the to-be-loaded and unloaded target container.
Example 4:
on the basis of embodiment 1, the present embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, in step 1, determining a real-time tracking image corresponding to a lifting appliance based on a locking tracking result, including:
Acquiring a locking tracking result of the lifting appliance, determining an initial position of the lifting appliance in an initial state image based on the locking tracking result, and determining a target distance between the initial position and an initial state image boundary;
monitoring the change condition of the target distance in real time based on locking tracking of the lifting appliance, and carrying out image acquisition on the lifting appliance again based on a preset acquisition device according to a locking tracking result when the monitored target distance is smaller than a preset distance threshold value to obtain a tracking image to be sequenced;
and sequencing the acquired tracking images to be sequenced based on the time development sequence, and obtaining real-time tracking images corresponding to the lifting appliance based on the sequencing result.
In this embodiment, the initial position refers to the position of the spreader in the initial state image of the spreader acquired for the first time.
In this embodiment, the target distance refers to the distance between the image area where the spreader is located in the initial state image and the boundary of the initial state image.
In this embodiment, the preset distance threshold is set in advance, and is used to characterize the minimum standard for re-image acquisition of the spreader.
In this embodiment, the tracking image to be sequenced refers to an image obtained by acquiring an image of the lifting appliance again after the position of the lifting appliance changes.
In this embodiment, the time development sequence of sorting the collected tracking images to be sorted refers to sorting the collected tracking images to be sorted according to the sequence of image collection (the sequence of images collected when the lifting appliance descends).
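The boundary-distance trigger for re-acquisition can be sketched as follows; the bounding-box convention `(x, y, w, h)`, pixel units, and the default distance threshold are illustrative assumptions:

```python
def distance_to_boundary(bbox, image_size):
    """Smallest distance (in pixels) from a tracked bounding box to the
    image boundary. bbox = (x, y, w, h); image_size = (width, height)."""
    x, y, w, h = bbox
    img_w, img_h = image_size
    return min(x, y, img_w - (x + w), img_h - (y + h))


def needs_reacquisition(bbox, image_size, min_distance=20):
    """Re-trigger image acquisition when the spreader drifts closer to
    the frame edge than the preset distance threshold."""
    return distance_to_boundary(bbox, image_size) < min_distance
```

Each re-capture yields one tracking image to be sequenced; sorting those by capture timestamp gives the real-time tracking image sequence.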
The beneficial effects of the technical scheme are as follows: the dynamic position of the lifting appliance is detected in real time according to the locking tracking result of the lifting appliance, and the real-time position image of the lifting appliance is acquired according to the dynamic position of the lifting appliance, so that the real-time tracking image of the lifting appliance is accurately and effectively acquired, the rapid and accurate determination of the target container to be loaded and unloaded is ensured, and the loading and unloading state of the target container to be loaded and unloaded is also ensured to be accurately and effectively detected.
Example 5:
on the basis of embodiment 1, the embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, and a real-time tracking image corresponding to a lifting appliance is obtained based on a sequencing result, which comprises the following steps:
acquiring an obtained real-time tracking image, and determining a corresponding target static image when the lifting appliance reaches the loading and unloading position based on the real-time tracking image;
extracting an object feature set recorded in a target static image, and simultaneously, acquiring the object feature of the container and matching the object feature of the container with the object feature set;
When the object characteristics in the object characteristic set are matched with the object characteristics of the container, judging that the container exists below the lifting appliance;
otherwise, judging that the container does not exist below the lifting appliance, sending a prompt to the management terminal, and correcting the hanging direction of the lifting appliance until the object characteristics in the object characteristic set are matched with the object characteristics of the container.
In this embodiment, the loading and unloading position refers to the position where the spreader is docked with the container.
In this embodiment, the target still image refers to an image of the positional relationship between the spreader and the target container to be loaded and unloaded, which is currently recorded, when the position of the spreader does not change.
In this embodiment, the object feature set refers to shape features of all objects recorded in the target still image, and the like, and may be, for example, shape features of a spreader, a container, and a disturbing object.
In this embodiment, the object features refer to shape features of the container and corresponding size information.
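The feature-set matching that decides whether a container is present below the spreader can be sketched in pure Python; representing each object feature as a `(shape_label, size)` pair and the relative size tolerance are assumptions made for the example:

```python
def container_below_spreader(scene_features, container_feature, size_tol=0.1):
    """Scan the feature set extracted from the target still image for an
    object whose shape label matches the container's and whose size lies
    within a relative tolerance of the container's size."""
    shape, size = container_feature
    for obj_shape, obj_size in scene_features:
        if obj_shape == shape and abs(obj_size - size) <= size_tol * size:
            return True  # a matching container is present below the spreader
    return False
```

When this returns `False`, the embodiment prompts the management terminal and corrects the hanging direction before matching again.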
The beneficial effects of the technical scheme are as follows: by analyzing the obtained real-time tracking image, image acquisition of all objects below the lifting appliance is realized, object characteristics of the container are matched with the object characteristics in the acquired image, accurate and effective judgment of whether the container exists below the lifting appliance is realized, accurate and effective detection of the loading and unloading state of the target container to be loaded and unloaded is facilitated, and reliability and high efficiency of loading and unloading state analysis of the target container to be loaded and unloaded are ensured.
Example 6:
on the basis of embodiment 1, the present embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, in step 2, preprocessing a real-time tracking image, determining a target size of a lifting appliance, including:
the method comprises the steps of obtaining a real-time tracking image, identifying the real-time tracking image, determining a target object contained in the real-time tracking image, and carrying out image division on the real-time tracking image based on the target object to obtain M image blocks;
respectively extracting image parameters of M image blocks, evaluating the image parameters of the M image blocks based on a preset effect evaluation model, determining a definition value of each image block, and judging the image block with the definition value smaller than a preset definition threshold as an image block to be optimized;
training a preset convolution network based on reference image parameters of a preset sample image to obtain an image optimization model, analyzing image parameters of image blocks to be optimized based on the image optimization model, determining a target difference value of the image parameters of each image block and the reference image parameters, and determining image optimization parameters of the image blocks to be optimized based on the target difference value;
performing image optimization on the image blocks to be optimized based on the image optimization parameters, and splicing the optimized image blocks with the definition meeting a preset definition threshold to obtain a preprocessed image;
Selecting a target reference point from the preprocessed image, determining the target distance between the target reference point and a preset acquisition device, and determining depth information of the preprocessed image based on the target distance and imaging configuration of the preset acquisition device;
performing point cloud scanning on the preprocessing image based on the depth information of the preprocessing image to obtain point cloud data of a target object in the preprocessing image, determining edge texture characteristics of the target object based on the preprocessing image, correcting the point cloud data of the target object based on the edge texture characteristics, and obtaining contour characteristics of the target object based on a correction result;
and determining the target size of the lifting appliance in the preprocessing image based on the outline characteristics of the target object.
In this embodiment, the target object refers to the different objects contained in the real-time tracking image, and may specifically be a spreader, a container, or other objects.
In this embodiment, image division of the real-time tracking image based on the target object refers to splitting out the area corresponding to each target object, that is, each image block corresponds to one target object.
In this embodiment, the image block refers to a partial image area obtained by splitting the real-time tracking image according to the target object.
In this embodiment, the image parameters refer to a color threshold value, a pixel value, a resolution value, and the like of each image block.
In this embodiment, the preset effect evaluation model is set in advance, and is used for analyzing the image parameters of each image block, so as to determine the sharpness value of each image block.
In this embodiment, evaluating the image parameters of the M image blocks based on the preset effect evaluation model means analyzing the image parameters of each image block against the image evaluation indexes contained in the preset effect evaluation model, where the image evaluation indexes include a reference image resolution, a reference image color threshold, a reference pixel value, and the like; by determining a weight for each image evaluation index, the image parameters are analyzed according to these weights and the specific image evaluation index values.
In this embodiment, the preset sharpness threshold is set in advance, and is used to measure whether the sharpness of the image block meets the minimum requirement, so that the sharpness can be adjusted.
In this embodiment, the image block to be optimized refers to an image block for which sharpness of the image block needs to be adjusted, and is a part of M image blocks.
In this embodiment, the preset sample image is set in advance, i.e., a standard image that satisfies the analysis requirements.
In this embodiment, the reference image parameters refer to parameters such as standard resolution and standard color threshold corresponding to the preset sample image.
In this embodiment, the preset convolution network is set in advance, and is a model framework, and the preset convolution network is trained through the reference image parameters, so as to implement optimization processing on the image block to be optimized.
In this embodiment, the image optimization model refers to a model that is obtained by training a preset convolution network through reference image parameters and is capable of performing optimization processing on an image block.
In this embodiment, the target difference refers to the degree of difference between the image parameter of each image block and the reference image parameter, so as to facilitate the optimization processing operation on the image blocks.
In this embodiment, the image optimization parameter refers to the specific data used to optimize the image block to be optimized, and may be, for example, the adjustment degree of the pixel value, or the like.
In this embodiment, the preprocessed image refers to an image obtained by performing optimization processing on an image block to be optimized and splicing the image block without optimization, and the preprocessed image can be used for determining the size of the lifting appliance.
In this embodiment, the target reference point refers to a reference point selected from the preprocessed image, for determining depth information of the preprocessed image.
In this embodiment, the target distance is an actual physical distance used to characterize the target reference point and the preset acquisition device.
In this embodiment, the depth information is used to characterize the distance between the target object recorded in the preprocessed image and the preset acquisition device during imaging, so as to facilitate determination of stereoscopic parameters of the target object in the image.
In this embodiment, the point cloud scanning refers to scanning the preprocessed image according to the depth information, so as to determine stereo data corresponding to different targets in the preprocessed image.
In this embodiment, the point cloud data refers to stereoscopic parameters of a target object in a preprocessed image obtained after the point cloud scanning is performed on the preprocessed image.
In this embodiment, the edge texture features refer to information such as the corresponding image size of the target object in the preprocessed image and the trend of each boundary.
In this embodiment, the profile features refer to shape features of the target object in order to effectively confirm the size of the spreader.
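The patent's effect-evaluation model scores blocks via weighted image evaluation indexes; as a common stand-in for such a sharpness score, the sketch below uses the variance of a 3x3 Laplacian response over a grayscale block (a widely used focus measure, substituted here for illustration, not the patent's model):

```python
def sharpness_value(block):
    """Variance of the 3x3 Laplacian response over a grayscale block
    (a list of rows of intensities). Higher values mean sharper edges."""
    h, w = len(block), len(block[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (block[y - 1][x] + block[y + 1][x] + block[y][x - 1]
                   + block[y][x + 1] - 4 * block[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)


def blocks_to_optimize(blocks, threshold):
    """Indices of image blocks whose sharpness falls below the preset
    sharpness threshold and therefore need optimization."""
    return [i for i, b in enumerate(blocks) if sharpness_value(b) < threshold]
```

A flat block scores zero (no edges), while a block with a hard edge scores high, so only the blurry blocks are routed to the optimization model.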
The beneficial effects of the technical scheme are as follows: the obtained real-time tracking image is preprocessed to ensure the definition and reliability of the finally obtained image, and the preprocessed image is subjected to point cloud scanning to finally realize the accurate and effective determination of the target size of the lifting appliance, so that convenience and guarantee are provided for determining the target container to be loaded and unloaded, and the detection accuracy of the container loading and unloading operation state is also facilitated to be improved.
Example 7:
on the basis of embodiment 1, the present embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, in step 2, locking a target container to be loaded and unloaded below a lifting appliance in a real-time tracking image based on a target size, including:
acquiring the target size of the lifting appliance, taking the target size of the lifting appliance as a butt joint condition, acquiring a position corresponding relation between a container loading and unloading surface and the lifting appliance, and determining the reference size information of the container to be loaded and unloaded, which can be loaded and unloaded by the current lifting appliance, in a real-time tracking image based on the position corresponding relation and the butt joint condition;
determining the size information of each container below the lifting appliance based on the real-time tracking image, matching the reference size information with the size information of each container below the lifting appliance, and determining a target container to be loaded and unloaded based on a matching result;
determining a butt joint position point of the lifting appliance and the target container to be loaded and unloaded, and monitoring the butt joint state of the butt joint position point of the lifting appliance and the target container to be loaded and unloaded based on the real-time tracking image;
when the lifting appliance is correspondingly overlapped with the butt joint position point of the target container to be loaded and unloaded, judging that the lifting appliance meets the loading and unloading condition of the target container to be loaded and unloaded, and sending a first reminding notice to the management terminal;
Otherwise, judging that the lifting appliance does not meet the loading and unloading conditions of the target container to be loaded and unloaded, and sending a second reminding notice to the management terminal.
In this embodiment, the docking condition refers to taking the target size of the spreader as a requirement for determining the target container to be loaded and unloaded, i.e., the loading and unloading condition can be satisfied when the size of the container is consistent with the target size of the spreader.
In this embodiment, the position correspondence characterizes how different position points on the spreader align with corresponding position points on the container when the spreader loads and unloads the container.
In this embodiment, the reference size information refers to size information that a container capable of matching a target size of a spreader should have.
In this embodiment, the docking point refers to a coincidence point between the spreader and the target container to be loaded and unloaded, so that loading and unloading of the target container to be loaded and unloaded by the spreader are realized.
In this embodiment, the first alert notification may be a notification that the current spreader may load and unload the target container to be loaded and unloaded to the management terminal.
In this embodiment, the second alert notification may be a notification that the current spreader cannot load and unload the target container to be loaded and unloaded to the management terminal.
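The size-based locking of the target container can be sketched as a best-match search; representing sizes as `(length, width)` pairs and the relative tolerance are my assumptions (the patent only requires the container size to match the spreader's target size):

```python
def lock_target_container(spreader_size, containers, rel_tol=0.05):
    """Pick, from the containers detected below the spreader, the one
    whose (length, width) best matches the spreader's target size.
    Returns the index of the match, or None when no candidate is within
    the relative tolerance (the docking condition is not met)."""
    best, best_err = None, rel_tol
    for i, (length, width) in enumerate(containers):
        err = max(abs(length - spreader_size[0]) / spreader_size[0],
                  abs(width - spreader_size[1]) / spreader_size[1])
        if err <= best_err:
            best, best_err = i, err
    return best
```

A `None` result corresponds to the second reminding notice (the spreader does not satisfy the loading and unloading condition).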
The beneficial effects of the technical scheme are as follows: by analyzing the target size of the lifting appliance, the container which needs to be loaded and unloaded below the lifting appliance is accurately and effectively locked according to the target size of the lifting appliance, and then the butt joint condition of the butt joint position point between the lifting appliance and the target container to be loaded and unloaded is monitored, so that the reliability of loading and unloading the target container to be loaded and unloaded by the lifting appliance is ensured, and the loading and unloading state of the container is effectively monitored.
Example 8:
on the basis of embodiment 1, the present embodiment provides a method for detecting a container loading and unloading operation state based on image analysis, in step 3, a loading and unloading image of a lifting appliance for loading and unloading a target container is obtained based on a locking result, a movement linear relationship between the lifting appliance and the target container to be loaded and unloaded is determined by analyzing the loading and unloading image, and the loading and unloading state of the lifting appliance for loading and unloading the target container is determined based on the movement linear relationship, including:
step 301: acquiring a locking result of a target container to be loaded and unloaded, loading and unloading the target container to be loaded and unloaded through a lifting appliance based on the locking result, and acquiring an image of the loading and unloading process of the target container to be loaded and unloaded through the lifting appliance based on a preset acquisition device to obtain a loading and unloading image set;
Step 302: carrying out time sequence arrangement on each loading and unloading image in the loading and unloading image set based on the time acquisition sequence, and determining the relative position relationship between the target container to be loaded and unloaded and the lifting appliance at different moments based on the time sequence arrangement result;
step 303: determining a movement linear relation between the lifting appliance and the target container to be loaded and unloaded based on the relative position relation;
step 304: when the linear relation of movement is that the relative position relation between the target container to be loaded and unloaded and the lifting appliance is unchanged at different moments, and the proportion of the target container to be loaded and unloaded in the loading and unloading image is increased along with the time development, judging that the lifting appliance successfully lifts the target container to be loaded and unloaded, otherwise, judging that the lifting appliance does not successfully lift the target container to be loaded and unloaded;
step 305: and determining the loading and unloading state of the lifting appliance to the target container to be loaded and unloaded based on the judging result.
In this embodiment, the loading and unloading image set refers to a plurality of images obtained after image acquisition is performed for a plurality of times in a loading and unloading process of a target container to be loaded and unloaded by a preset acquisition device.
In this embodiment, the time acquisition sequence refers to a sequence of image acquisition of the loading and unloading.
In this embodiment, the time sequence arrangement means that the loading and unloading images are sequentially arranged according to the acquisition order.
In this embodiment, the increasing proportion of the target container to be unloaded in the loading and unloading image over time is characterized in that when the image acquisition angle is unchanged, the increasing area of the container in the image indicates that the lifting appliance successfully lifts the container.
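The two conditions of step 304 (constant relative position, growing image area) can be sketched directly; the pixel offset tolerance and the input representation are illustrative assumptions:

```python
def lift_successful(relative_offsets, container_areas, offset_tol=5):
    """Judge a successful lift from a time-ordered handling image set:
    the spreader-to-container offset must stay (nearly) constant while
    the container's area in the image grows over time (the acquisition
    angle is assumed unchanged)."""
    if len(relative_offsets) < 2:
        return False
    base = relative_offsets[0]
    offsets_stable = all(abs(o - base) <= offset_tol for o in relative_offsets)
    area_growing = all(a2 > a1
                       for a1, a2 in zip(container_areas, container_areas[1:]))
    return offsets_stable and area_growing
```

A drifting offset means the spreader rose without the container; a non-growing area means the pair is not approaching the camera, so the lift is judged unsuccessful in either case.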
The beneficial effects of the technical scheme are as follows: the method has the advantages that the images corresponding to the loading and unloading processes of the lifting appliance to the target container to be loaded and unloaded are acquired, the acquired images are analyzed, and the accurate and effective judgment on the movement linear relation of the lifting appliance and the target container to be loaded and unloaded is realized, so that the loading and unloading states of the target container to be loaded and unloaded can be accurately and effectively analyzed according to the movement linear relation, and the accuracy and reliability of the loading and unloading state analysis of the target container to be loaded and unloaded are ensured.
Example 9:
on the basis of embodiment 8, the present embodiment provides a method for detecting a loading and unloading operation state of a container based on image analysis, determining a loading and unloading state of a lifting appliance to load and unload a target container based on a determination result, including:
acquiring the loading and unloading state of the lifting appliance to-be-loaded and unloaded target container, sending an alarm prompt to the management terminal when the lifting appliance does not successfully lift the to-be-loaded and unloaded target container, and butting the lifting appliance with the to-be-loaded and unloaded target container again until the to-be-loaded and unloaded target container is successfully lifted;
Meanwhile, constructing a loading and unloading record table, and determining a target identifier and target loading and unloading time information of the target container to be loaded and unloaded when the target container to be loaded and unloaded is successfully lifted;
and recording the target identification and the target loading and unloading time information of each target container to be loaded and unloaded in a loading and unloading record table sequentially based on the loading and unloading sequence of the target containers to be loaded and unloaded until the loading and unloading operation of all the target containers to be loaded and unloaded is completed.
In this embodiment, the loading and unloading record table records the lifted container after the lifting appliance successfully lifts the target container to be loaded and unloaded.
In this embodiment, the target identifier is a tag label for marking different target containers to be loaded and unloaded.
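The loading and unloading record table can be mocked as a simple append-only structure; the class name, field names, and the sample container identifiers are hypothetical, chosen only to illustrate recording in handling order:

```python
from datetime import datetime


class HandlingRecordTable:
    """Minimal loading/unloading record table: rows are appended in
    handling order, each with a container identifier (target identifier)
    and the target loading and unloading time."""

    def __init__(self):
        self.rows = []

    def record(self, container_id, handled_at=None):
        self.rows.append({
            "id": container_id,
            "handled_at": handled_at
            or datetime.now().isoformat(timespec="seconds"),
        })

    def identifiers(self):
        return [row["id"] for row in self.rows]
```

Rows are only appended after a lift is judged successful, so the table's order mirrors the loading and unloading sequence.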
The beneficial effects of the technical scheme are as follows: by recording the loading and unloading states of the target containers to be loaded and unloaded, the loading and unloading results of different containers can be managed reliably and effectively, and the management terminal can be managed reasonably.
Example 10:
on the basis of embodiment 1, the present embodiment provides a method for detecting a loading and unloading operation state of a container based on image analysis, in step 3, determining a loading and unloading state of a lifting appliance to load and unload a target container based on a movement linear relationship, including:
Acquiring a loading and unloading state of a lifting appliance to load and unload a target container, and determining the time length for completing the detection of the loading and unloading operation state of the container when the loading and unloading state is that the lifting appliance successfully lifts the target container to be loaded and unloaded;
calculating total duration for the lifting appliance to finish all container loading and unloading operations based on the length of time for finishing the container loading and unloading operation state detection, and calculating the loading and unloading efficiency of the lifting appliance to the container based on the total duration;
the total time length for the lifting appliance to complete all container loading and unloading operations is calculated according to the following formula:

$$T = T_0 + T_m + \sum_{i=1}^{n}\left(\frac{h}{v_0} + \frac{h}{v_i} + t_i + \frac{d_i}{u_i} + s_i\right)$$

wherein, $T$ represents the total time length for the lifting appliance to complete the loading and unloading operations of all containers; $T_0$ represents the total duration in which the lifting appliance is not effectively utilized during the working process; $T_c$ represents the total operating duration the lifting appliance can provide, whose value is greater than $T$; $i$ is the serial number of a container that needs to be loaded and unloaded; $n$ represents the total number of containers that need to be loaded and unloaded; $h$ represents the height value by which the lifting appliance needs to be lowered and raised; $v_0$ represents the lowering speed of the lifting appliance under no load; $v_i$ represents the lifting speed of the lifting appliance for the $i$-th container; $t_i$ represents the time taken for the lifting appliance to dock with the $i$-th container; $d_i$ represents the distance over which the container needs to be carried; $u_i$ represents the transport speed of the lifting appliance when transporting the $i$-th container; $T_m$ represents the maintenance duration required by the lifting appliance during the working process; $s_i$ represents the time length for completing the detection of the loading and unloading operation state of the $i$-th container;
calculating the loading and unloading efficiency of the lifting appliance on the container according to the following formula:

$$\eta = (1-\varepsilon)\cdot\frac{T_e}{T - T_b}$$

wherein, $\eta$ represents the loading and unloading efficiency of the lifting appliance for the container, with a value range of $(0, 1)$; $\varepsilon$ represents the error coefficient, with a value range of $(0.02, 0.04)$; $T_e$ represents the expected total time length taken to complete all container loading and unloading operations; $T$ represents the total time length for the lifting appliance to complete the loading and unloading operations of all containers; $T_b$ represents the compensating (offset) time length allowed for the lifting appliance's container loading and unloading;
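A minimal numeric sketch of the two calculations above; the variable names, the exact formula structure, and all numbers are illustrative assumptions, since the original formulas are images that are not reproduced in this text:

```python
def total_duration(t_idle, t_maint, h, v_down, lifts):
    """Sum the idle time, maintenance time, and per-container time.

    Each entry of `lifts` is a tuple
    (v_lift_i, t_dock_i, d_i, v_move_i, t_detect_i) for one container.
    """
    per_container = sum(
        h / v_down + h / v_lift + t_dock + d / v_move + t_detect
        for (v_lift, t_dock, d, v_move, t_detect) in lifts
    )
    return t_idle + t_maint + per_container

def handling_efficiency(t_expected, t_total, t_offset, eps=0.03):
    """Efficiency in (0, 1): expected time over actual time less the
    compensating (offset) duration, damped by the error coefficient."""
    return (1 - eps) * t_expected / (t_total - t_offset)

# One container: lower 10 m at 0.5 m/s, lift at 0.4 m/s, dock 30 s,
# carry 50 m at 2 m/s, detection takes 15 s.
t = total_duration(t_idle=120.0, t_maint=60.0, h=10.0, v_down=0.5,
                   lifts=[(0.4, 30.0, 50.0, 2.0, 15.0)])
eta = handling_efficiency(t_expected=250.0, t_total=t, t_offset=40.0)
print(t, round(eta, 3))  # 295.0 0.951
```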
comparing the calculated loading and unloading efficiency with preset loading and unloading efficiency;
if the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency, judging that the loading and unloading efficiency of the lifting appliance to the container is qualified, and continuously detecting the loading and unloading operation of the container based on the current container loading and unloading operation state detection strategy until the loading and unloading operation of all containers is completed;
otherwise, judging that the loading and unloading efficiency of the lifting appliance to the container is unqualified, and optimizing the detection speed of the current container loading and unloading operation state detection strategy until the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency.
In this embodiment, the length of time taken to complete the detection of the state of the container handling operation means the sum of the length of time for locking and tracking the spreader and the length of time for docking the spreader with the target container to be handled.
In this embodiment, the offset duration refers to the extra time the lifting appliance is allowed for repeating an operation, i.e. the duration that can be deducted from the total duration the lifting appliance takes to complete all container loading and unloading operations.
In this embodiment, the preset loading and unloading efficiency is set in advance as the minimum standard that the loading and unloading efficiency of the lifting appliance for the container must meet; it serves as a yardstick and can be adjusted.
The beneficial effects of the technical scheme are as follows: by calculating the loading and unloading efficiency of the lifting appliance for the container, the loading and unloading state of the target container to be loaded and unloaded is analyzed accurately and effectively, and when the calculated efficiency does not meet the preset loading and unloading efficiency, the detection mode of the container loading and unloading operation state is adjusted in time, which facilitates timely optimization of that detection mode and also improves the container loading and unloading efficiency.
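The compare-and-adjust loop described in this embodiment can be sketched as below; the concrete adjustment (halving a detection interval, with a floor) is one illustrative choice, since the disclosure only states that the detection speed is optimized:

```python
def adjust_detection_strategy(efficiency, preset_efficiency,
                              detect_interval_s, min_interval_s=0.1):
    """If efficiency is below the preset value, speed up state detection
    by shortening the detection interval; otherwise keep the strategy."""
    if efficiency >= preset_efficiency:
        return detect_interval_s, "qualified"   # keep current strategy
    # Unqualified: tighten the interval (faster detection), floored at min.
    return max(detect_interval_s * 0.5, min_interval_s), "unqualified"

print(adjust_detection_strategy(0.92, 0.85, 1.0))  # (1.0, 'qualified')
print(adjust_detection_strategy(0.70, 0.85, 1.0))  # (0.5, 'unqualified')
```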
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. A method for detecting a container handling operation status based on image analysis, comprising:
step 1: acquiring an initial state image of the lifting appliance based on a preset acquisition device, locking and tracking the lifting appliance based on the initial state image, and determining a real-time tracking image corresponding to the lifting appliance based on a locking and tracking result;
step 2: preprocessing the real-time tracking image, determining the target size of the lifting appliance, and locking a target container to be loaded and unloaded below the lifting appliance in the real-time tracking image based on the target size;
step 3: acquiring a loading and unloading image of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on a locking result, analyzing the loading and unloading image to determine the moving linear relation between the lifting appliance and the target container to be loaded and unloaded, and determining the loading and unloading state of the lifting appliance for loading and unloading the target container to be loaded and unloaded based on the moving linear relation;
in step 1, lock tracking is performed on a lifting appliance based on an initial state image, and the method comprises the following steps:
acquiring an obtained initial state image and shape characteristics of the lifting appliance, performing pixel matching on the initial state image based on the shape characteristics, and determining an image area range of the lifting appliance in the initial state image;
acquiring a target color of the lifting appliance, extracting a color histogram of an image area range, determining a target proportion of the target color in the image area range based on the color histogram, and correcting the image area range based on the color histogram when the target proportion is greater than a preset proportion threshold;
And carrying out detection frame marking on the corrected image area range based on a preset detection frame, locking the lifting appliance based on a detection frame marking result, configuring the movement direction and the movement speed of the preset detection frame based on the movement direction and the movement speed of the lifting appliance, predicting the next position of the lifting appliance through the preset detection frame based on a configuration result, updating the real-time position of the preset detection frame based on a prediction result, and completing locking tracking of the lifting appliance.
2. The method for detecting the state of a container handling operation based on image analysis according to claim 1, wherein in step 1, an initial state image of a lifting appliance is acquired based on a preset acquisition device, comprising:
acquiring historical loading and unloading data of the lifting appliance on the container based on a preset server, analyzing the historical loading and unloading data, and determining the lifting height of the lifting appliance when the lifting appliance loads and unloads the container;
acquiring target requirements for locking and tracking the lifting appliance, and determining the optimal height for image acquisition of the lifting appliance based on the target requirements and the lifting height;
determining a upward shooting angle of the preset acquisition device for image acquisition of the lifting appliance based on the relative position of the preset acquisition device and the optimal height, adapting the preset acquisition device based on the upward shooting angle, and associating the adapted preset acquisition device with a preset height sensor;
The current lifting height of the lifting appliance is monitored in real time based on a preset height sensor, when the lifting height is consistent with the optimal height, an image acquisition instruction is issued to a preset acquisition device, and the lifting appliance is subjected to image acquisition based on the preset acquisition device, so that an initial state image of the lifting appliance is obtained.
3. The method for detecting a container handling operation state based on image analysis according to claim 1, wherein in step 1, determining a real-time tracking image corresponding to a spreader based on a lock tracking result comprises:
acquiring a locking tracking result of the lifting appliance, determining an initial position of the lifting appliance in an initial state image based on the locking tracking result, and determining a target distance between the initial position and an initial state image boundary;
monitoring the change condition of the target distance in real time based on locking tracking of the lifting appliance, and carrying out image acquisition on the lifting appliance again based on a preset acquisition device according to a locking tracking result when the monitored target distance is smaller than a preset distance threshold value to obtain a tracking image to be sequenced;
and sequencing the acquired tracking images to be sequenced based on the time development sequence, and obtaining real-time tracking images corresponding to the lifting appliance based on the sequencing result.
4. A method for detecting a container handling operation status based on image analysis according to claim 3, wherein obtaining real-time tracking images corresponding to a spreader based on the sorting result comprises:
acquiring an obtained real-time tracking image, and determining a corresponding target static image when the lifting appliance reaches the loading and unloading position based on the real-time tracking image;
extracting an object feature set recorded in a target static image, and simultaneously, acquiring the object feature of the container and matching the object feature of the container with the object feature set;
when the object characteristics in the object characteristic set are matched with the object characteristics of the container, judging that the container exists below the lifting appliance;
otherwise, judging that the container does not exist below the lifting appliance, sending a prompt to the management terminal, and correcting the hanging direction of the lifting appliance until the object characteristics in the object characteristic set are matched with the object characteristics of the container.
5. The method for detecting the loading and unloading operation state of the container based on the image analysis according to claim 1, wherein in the step 2, the real-time tracking image is preprocessed, and the target size of the lifting appliance is determined, which comprises the following steps:
the method comprises the steps of obtaining a real-time tracking image, identifying the real-time tracking image, determining a target object contained in the real-time tracking image, and carrying out image division on the real-time tracking image based on the target object to obtain M image blocks;
Respectively extracting image parameters of M image blocks, evaluating the image parameters of the M image blocks based on a preset effect evaluation model, determining a definition value of each image block, and judging the image block with the definition value smaller than a preset definition threshold as an image block to be optimized;
training a preset convolution network based on reference image parameters of a preset sample image to obtain an image optimization model, analyzing image parameters of image blocks to be optimized based on the image optimization model, determining a target difference value of the image parameters of each image block and the reference image parameters, and determining image optimization parameters of the image blocks to be optimized based on the target difference value;
performing image optimization on the image blocks to be optimized based on the image optimization parameters, and splicing the optimized image blocks with the definition meeting a preset definition threshold to obtain a preprocessed image;
selecting a target reference point from the preprocessed image, determining the target distance between the target reference point and a preset acquisition device, and determining depth information of the preprocessed image based on the target distance and imaging configuration of the preset acquisition device;
performing point cloud scanning on the preprocessing image based on the depth information of the preprocessing image to obtain point cloud data of a target object in the preprocessing image, determining edge texture characteristics of the target object based on the preprocessing image, correcting the point cloud data of the target object based on the edge texture characteristics, and obtaining contour characteristics of the target object based on a correction result;
And determining the target size of the lifting appliance in the preprocessing image based on the outline characteristics of the target object.
6. The method for detecting a container handling operation state based on image analysis according to claim 1, wherein in step 2, locking a target container to be handled below a spreader in a real-time tracking image based on a target size comprises:
acquiring the target size of the lifting appliance, taking the target size of the lifting appliance as a butt joint condition, acquiring a position corresponding relation between a container loading and unloading surface and the lifting appliance, and determining the reference size information of the container to be loaded and unloaded, which can be loaded and unloaded by the current lifting appliance, in a real-time tracking image based on the position corresponding relation and the butt joint condition;
determining the size information of each container below the lifting appliance based on the real-time tracking image, matching the reference size information with the size information of each container below the lifting appliance, and determining a target container to be loaded and unloaded based on a matching result;
determining a butt joint position point of the lifting appliance and the target container to be loaded and unloaded, and monitoring the butt joint state of the butt joint position point of the lifting appliance and the target container to be loaded and unloaded based on real-time tracking images;
when the lifting appliance is correspondingly overlapped with the butt joint position point of the target container to be loaded and unloaded, judging that the lifting appliance meets the loading and unloading condition of the target container to be loaded and unloaded, and sending a first reminding notice to the management terminal;
Otherwise, judging that the lifting appliance does not meet the loading and unloading conditions of the target container to be loaded and unloaded, and sending a second reminding notice to the management terminal.
7. The method for detecting a container handling operation state based on image analysis according to claim 1, wherein in step 3, a handling image of a container to be handled by a spreader is acquired based on a locking result, and the handling image is analyzed to determine a linear relationship of movement of the spreader and the container to be handled, and the handling state of the container to be handled by the spreader is determined based on the linear relationship of movement, comprising:
acquiring a locking result of a target container to be loaded and unloaded, loading and unloading the target container to be loaded and unloaded through a lifting appliance based on the locking result, and acquiring an image of the loading and unloading process of the target container to be loaded and unloaded through the lifting appliance based on a preset acquisition device to obtain a loading and unloading image set;
carrying out time sequence arrangement on each loading and unloading image in the loading and unloading image set based on the time acquisition sequence, and determining the relative position relationship between the target container to be loaded and unloaded and the lifting appliance at different moments based on the time sequence arrangement result;
determining a movement linear relation between the lifting appliance and the target container to be loaded and unloaded based on the relative position relation;
When the linear relation of movement is that the relative position relation between the target container to be loaded and unloaded and the lifting appliance is unchanged at different moments, and the proportion of the target container to be loaded and unloaded in the loading and unloading image is increased along with the time development, judging that the lifting appliance successfully lifts the target container to be loaded and unloaded, otherwise, judging that the lifting appliance does not successfully lift the target container to be loaded and unloaded;
and determining the loading and unloading state of the lifting appliance to the target container to be loaded and unloaded based on the judging result.
8. The method for detecting the state of a container handling operation based on image analysis according to claim 7, wherein determining the state of handling of the object container to be handled by the spreader based on the determination result comprises:
acquiring the loading and unloading state of the lifting appliance to-be-loaded and unloaded target container, sending an alarm prompt to the management terminal when the lifting appliance does not successfully lift the to-be-loaded and unloaded target container, and butting the lifting appliance with the to-be-loaded and unloaded target container again until the to-be-loaded and unloaded target container is successfully lifted;
meanwhile, constructing a loading and unloading record table, and determining a target identifier and target loading and unloading time information of the target container to be loaded and unloaded when the target container to be loaded and unloaded is successfully lifted;
and recording the target identification and the target loading and unloading time information of each target container to be loaded and unloaded in a loading and unloading record table sequentially based on the loading and unloading sequence of the target containers to be loaded and unloaded until the loading and unloading operation of all the target containers to be loaded and unloaded is completed.
9. The method for detecting the loading and unloading operation state of the container based on the image analysis according to claim 1, wherein in the step 3, the loading and unloading state of the lifting appliance to be loaded and unloaded to the target container is determined based on the movement linear relation, comprising:
acquiring a loading and unloading state of a lifting appliance to load and unload a target container, and determining the time length for completing the detection of the loading and unloading operation state of the container when the loading and unloading state is that the lifting appliance successfully lifts the target container to be loaded and unloaded;
calculating total duration for the lifting appliance to finish all container loading and unloading operations based on the length of time for finishing the container loading and unloading operation state detection, and calculating the loading and unloading efficiency of the lifting appliance to the container based on the total duration;
comparing the calculated loading and unloading efficiency with preset loading and unloading efficiency;
if the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency, judging that the loading and unloading efficiency of the lifting appliance to the container is qualified, and continuously detecting the loading and unloading operation of the container based on the current container loading and unloading operation state detection strategy until the loading and unloading operation of all containers is completed;
otherwise, judging that the loading and unloading efficiency of the lifting appliance to the container is unqualified, and optimizing the detection speed of the loading and unloading operation state of the container based on the current loading and unloading operation state detection strategy of the container until the calculated loading and unloading efficiency is greater than or equal to the preset loading and unloading efficiency.
CN202310416761.4A 2023-04-19 2023-04-19 Method for detecting container loading and unloading operation state based on image analysis Active CN116152243B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310416761.4A CN116152243B (en) 2023-04-19 2023-04-19 Method for detecting container loading and unloading operation state based on image analysis


Publications (2)

Publication Number Publication Date
CN116152243A CN116152243A (en) 2023-05-23
CN116152243B (en) 2023-07-25

Family

ID=86339253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310416761.4A Active CN116152243B (en) 2023-04-19 2023-04-19 Method for detecting container loading and unloading operation state based on image analysis

Country Status (1)

Country Link
CN (1) CN116152243B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202012012116U1 (en) * 2012-12-17 2014-03-19 Liebherr-Components Biberach Gmbh Tower Crane
CN106599885B (en) * 2016-08-30 2020-08-11 中远海运科技股份有限公司 Bay level monitoring system and method for container
CN110996068B (en) * 2019-12-20 2021-03-16 上海振华重工(集团)股份有限公司 Automatic tracking system, equipment and method for lifting appliance
CN114368690A (en) * 2021-12-15 2022-04-19 北京国基科技股份有限公司 Gantry crane control method and system based on video real-time adjustment
CN218539045U (en) * 2022-09-21 2023-02-28 成都星云智联科技有限公司 Unmanned crane target position scanning system

Non-Patent Citations (1)

Title
Design of a tower-crane monitoring system based on the CamShift tracking algorithm; Wang Xu et al.; Electronic Devices (电子器件); 36(6); pp. 859-863 *

Also Published As

Publication number Publication date
CN116152243A (en) 2023-05-23

Similar Documents

Publication Publication Date Title
US10733723B2 (en) Methods and system for improved quality inspection
US10706525B2 (en) Methods and systems for improved quality inspection
CN110245663B (en) Method for identifying steel coil information
CN110610141A (en) Logistics storage regular shape goods recognition system
JP5168215B2 (en) Appearance inspection device
CN108711148A (en) A kind of wheel tyre defect intelligent detecting method based on deep learning
CN111126393A (en) Vehicle appearance refitting judgment method and device, computer equipment and storage medium
CN114998824A (en) Vehicle loading and unloading task monitoring method, device and system
CN114332622A (en) Label detection method based on machine vision
CN108320799B (en) Image analysis and recognition method for lateral flow paper strip disease diagnosis
CN115512134A (en) Express item stacking abnormity early warning method, device, equipment and storage medium
CN113688965B (en) Automatic storage code scanning detection method and cargo management system
CN110415221B (en) Automatic detection method for preventing container truck from being lifted based on image feature point matching
CN116465315A (en) Automatic screen quality detection method and system
CN116152243B (en) Method for detecting container loading and unloading operation state based on image analysis
CN113269234B (en) Connecting piece assembly detection method and system based on target detection
CN114494845A (en) Artificial intelligence hidden danger troubleshooting system and method for construction project site
CN116551263A (en) Visual control method and system for welding position selection
CN113145473A (en) Intelligent fruit sorting system and method
CN116486212A (en) Water gauge identification method, system and storage medium based on computer vision
CN116448764A (en) Automatic crack detection method for fatigue test of aircraft structure
CN116563889A (en) Device and method for estimating weight of laying hen based on machine vision
CN109685002A (en) A kind of dataset acquisition method, system and electronic device
CN114219400B (en) Material supervision system and method of intelligent factory
CN115984759A (en) Substation switch state identification method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant