CN112465874A - Crane lifting appliance guiding and positioning method and system based on image sensing - Google Patents

Crane lifting appliance guiding and positioning method and system based on image sensing

Info

Publication number
CN112465874A
CN112465874A (application CN202110115422.3A)
Authority
CN
China
Prior art keywords
subsequence
candidate data
obtaining
lifting appliance
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110115422.3A
Other languages
Chinese (zh)
Other versions
CN112465874B (en)
Inventor
王建玲
郑崴
杨其锋
张佳威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Institute of Technology
Original Assignee
Henan Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Institute of Technology filed Critical Henan Institute of Technology
Priority to CN202110115422.3A priority Critical patent/CN112465874B/en
Publication of CN112465874A publication Critical patent/CN112465874A/en
Application granted granted Critical
Publication of CN112465874B publication Critical patent/CN112465874B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66CCRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00Other constructional features or details
    • B66C13/18Control systems or devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66CCRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00Other constructional features or details
    • B66C13/18Control systems or devices
    • B66C13/46Position indicators for suspended loads or for crane elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0014Image feed-back for automatic industrial control, e.g. robot with camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/262Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Robotics (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of computer vision, in particular to a crane lifting appliance guiding and positioning method and system based on image perception. By analyzing the motion characteristics of the swinging lifting appliance within local time, the invention accurately locates the position of the moving lifting appliance and can guide the next descent of the lifting appliance. Compared with existing anti-swing equipment, the equipment provided by the invention is simple, reduces maintenance cost, automates the positioning of the lifting appliance, improves the operating efficiency of the crane, and reduces operating energy consumption.

Description

Crane lifting appliance guiding and positioning method and system based on image sensing
Technical Field
The invention relates to the technical field of computer vision, in particular to a crane lifting appliance guiding and positioning method and system based on image perception.
Background
At present, in order to monitor goods in real time, a lifting appliance of a crane needs to be positioned and guided, and the existing lifting appliance positioning method generally obtains the position of the lifting appliance by sensing the length of a lifting wire and the position of a crown block.
However, visual blind areas may exist during crane operation, and interference from the environment and other factors makes the lifting appliance prone to swinging during hoisting, so that the positioning accuracy and working efficiency of the crane are low and potential safety hazards also exist. Although anti-swing devices have been added in the prior art, they are mostly electric actuators with complex structures and high manufacturing costs.
Disclosure of Invention
The invention provides a crane lifting appliance guiding and positioning method and system based on image sensing, which solve the technical problems that existing lifting appliance detection methods cannot handle occlusion of the lifting appliance, ignore the influence of the hoisting operation environment, suffer from low positioning accuracy, and cannot realize efficient automatic detection.
In order to solve the technical problems, the invention provides a crane lifting appliance guiding and positioning method and system based on image perception, which comprises the following steps:
S1, based on the lifting appliance swing angle, performing sliding window fitting with a set window size by using a simple pendulum model and a random sample consensus (RANSAC) algorithm to obtain front-back and left-right motion characteristic sequences, and counting the local inliers participating in the fitting of each element to obtain the accuracy of each element;
S2, randomly sampling the front-back motion characteristic sequence a plurality of times to obtain a plurality of subsequences; for any one of the subsequences, obtaining a Laplacian matrix according to the local similarity of any two elements of the subsequence, filtering the subsequence by using the eigenvectors of the Laplacian matrix and the energy state distribution to obtain a low-frequency vector, and obtaining the fluctuation degree according to the subsequence and the low-frequency vector;
S3, obtaining a pre-and-post prediction angle sequence of each subsequence by using a least square method, and obtaining a candidate data set of each subsequence by using mean shift clustering based on the pre-and-post prediction angle sequence and the accuracy corresponding to each element of the pre-and-post prediction angle sequence;
S4, obtaining the confidence feature of each candidate data set according to the aging characteristic and the fluctuation degree of the subsequence, obtaining the candidate similarity of any two candidate data sets according to the coincidence degree and the confidence features of the two candidate data sets, constructing an adjacency matrix according to the candidate similarities, performing graph cut and mean shift clustering on the adjacency matrix to obtain an optimal front-back prediction angle set, and similarly, obtaining an optimal left-right prediction angle set according to the left-right motion characteristic sequence;
and S5, tracking the lifting appliance according to the optimal front-back and left-right prediction angle set.
Further, the forward and backward motion characteristics include amplitude, angular velocity and initial phase of the forward and backward motion;
the left-right motion characteristics include amplitude, angular velocity, and initial phase of the left-right motion.
Further, selecting any one of the subsequences, and obtaining a laplacian matrix according to the local similarity of any two elements of the subsequence, specifically:
selecting any one subsequence, obtaining local similarity according to the negative exponential power of the Euclidean distance of any two elements of the subsequence, and constructing a sub-adjacency matrix according to all the local similarities;
and acquiring a degree matrix according to the sub-adjacency matrix, and acquiring a Laplace matrix according to the sub-adjacency matrix and the degree matrix.
Further, the filtering the subsequence by using the feature vector and the energy state distribution thereof to obtain a low-frequency vector specifically comprises:
taking the eigenvectors of the Laplacian matrix as a basis, and performing a graph Fourier transform on the amplitude component of each element in the subsequence to obtain a frequency domain vector;
acquiring energy state distribution of each node according to the frequency domain vector, and filtering the frequency domain vector according to the energy state distribution and the feature vector to obtain a frequency domain filtering vector;
and performing an inverse graph Fourier transform by using the eigenvectors and the frequency domain filtering vector to obtain a low-frequency vector.
Further, the obtaining of the pre-and-post prediction angle sequence of each subsequence by using a least square method specifically includes:
fitting the amplitude in each subsequence by using a least square method to obtain an amplitude predicted value of each subsequence at a future moment, and obtaining a front-rear prediction angle sequence by using a simple pendulum prediction model based on each amplitude predicted value and each element in the subsequence corresponding to the amplitude predicted value.
Further, the confidence feature is:
[formula image]
in the formula, B represents a confidence coefficient characteristic, C represents an aging characteristic, V represents the fluctuation degree of a subsequence, D represents the number of elements of the subsequence corresponding to the candidate data set, and n is the base number of a logarithm;
and the aging characteristic is the ratio of the number mean value in any subsequence to the sum of the number mean values of all the subsequences.
The specific acquisition process of the coincidence degree of any two candidate data sets comprises the following steps:
and randomly selecting two candidate data sets, respectively selecting an element from the two candidate data sets to obtain a feature point pair of a preset point pair number, obtaining a difference set and a difference coincidence subset according to the Euclidean distance of the two elements in each feature point pair, and calculating the coincidence degree according to the intersection ratio of the two candidate data sets, the difference set and the difference coincidence subset.
Further, the performing of graph cut and mean shift clustering on the adjacency matrix to obtain the optimal front-back prediction angle set specifically comprises:
performing graph cutting processing on the adjacency matrix to cut all the candidate data sets into a plurality of categories;
calculating the confidence characteristic mean value of the candidate data set contained in each category to obtain the evaluation value of each category;
performing mean shift clustering on all elements in the category with the maximum evaluation value to obtain an optimal front and back set;
and extracting the forward and backward prediction angle components of all elements in the optimal forward and backward set to obtain an optimal forward and backward prediction angle set.
Furthermore, the accuracy is the ratio of the number of local inliers participating in the fitting to the number of elements in the window when the lifting appliance angle data is subjected to sliding window fitting sequentially according to the set window size.
The crane sling guiding and positioning system based on image perception implements the above crane sling guiding and positioning method based on image perception.
According to the crane lifting appliance guiding and positioning method and system based on image sensing, tracking and positioning are achieved by predicting the front-back and left-right angles of the lifting appliance, which solves the problems that existing lifting appliance detection methods cannot handle occlusion of the lifting appliance, ignore the influence of the hoisting operation environment, have low positioning accuracy, and cannot realize efficient automatic detection. The invention extracts the motion characteristics of the lifting appliance by utilizing its simple pendulum motion within local time, realizes prediction of the optimal angle through analysis of the motion characteristics, and thereby accurately locates the moving position of the lifting appliance.
Drawings
Fig. 1 is a schematic flow chart of a crane spreader guiding and positioning method based on image sensing according to an embodiment of the present invention;
FIG. 2 is a side view of a crane model provided by an embodiment of the present invention;
fig. 3 is a top view of a crane model provided by an embodiment of the present invention.
And (3) graphic labeling:
a crane jib 1; a high-definition camera 2; a spreader 3; and a crown block 4.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings, which are given solely for the purpose of illustration and are not to be construed as limitations of the invention; many variations are possible without departing from the spirit and scope of the invention.
Aiming at the problems that the existing sling detection method cannot handle occlusion of the sling, neglects the influence of the hoisting operation environment, has low positioning precision and cannot realize efficient automatic detection, the embodiment of the invention provides a crane sling guiding and positioning method and system based on image sensing; as shown in figure 1, the crane sling guiding and positioning method based on image sensing comprises the following steps:
S1, extracting motion characteristics: based on the swing angle of the lifting appliance, performing sliding window fitting with a set window size by using a simple pendulum model and a random sample consensus (RANSAC) algorithm to obtain front-back and left-right motion characteristic sequences, and counting the local inliers participating in the fitting of each element to obtain the accuracy of each element;
fig. 2 and 3 are simplified models of a crane, in the embodiment of the present invention, a high definition camera 2 with variable viewing angle and focal length is installed on a crane boom 1, the high definition camera 2 is used for sequentially collecting hanger data in unit time, and simultaneously detecting a hanger 3 by rotating left and right and up and down angles, when the hanger data is collected, the hanger 3 needs to be located at the center of the view of the high definition camera 2; in addition, in this embodiment, a sensor is installed on both the crane boom 1 and the overhead traveling crane 4, and is used to obtain the length of the suspension wire and the position of the overhead traveling crane 4, so as to obtain the distance between the spreader 3 and the overhead traveling crane 4.
In this embodiment, the spreader data include spreader images; the high-definition camera 2 does not need a high sampling rate and can acquire spreader data once every 0.5 seconds or 1 second.
Because the lifting appliance can swing when disturbed or during operation, and a large error would exist if the lifting appliance swing angle were obtained directly from the lifting appliance image acquired by the high-definition camera, this embodiment uses a geometric relationship to solve the spatial swing angle of the lifting appliance from the deflection angle of the high-definition camera, the suspension wire length and the crown block position, the deflection angle of the high-definition camera being acquired by its own sensor;
as shown in fig. 2 and 3, the swing of the spreader includes front-back swing and left-right swing of the spreader, and the swing angle of the spreader includes front-back swing angle of the spreader and left-right swing angle of the spreader; it should be noted that, the high definition camera must be in the center of the field of vision of high definition camera after controlling or upper and lower turned angle to utilize the geometric relation to solve the hoist pivot angle, in addition, this embodiment can be according to the distance of hoist to high definition camera, adjust the focus of high definition camera, in order to guarantee to gather clear hoist image.
This embodiment needs to detect the spreader in real time at unit time intervals and keep it at the center of the field of view of the high-definition camera, namely: a spreader image collected by the high-definition camera is input into a deep neural network to obtain the position coordinates of the spreader in the image; if the position coordinates of the spreader do not coincide with the center point of the image, the displacement vector between the spreader and the image center is calculated, and the high-definition camera is rotated up-down and left-right according to the displacement vector until the spreader is located at the center of the camera's field of view; the swing angle of the spreader is then obtained according to the deflection angle of the high-definition camera.
Therefore, the sequences of the front-back and left-right swing angles of the lifting appliance over time can be obtained; starting from the current moment, data of a preset sequence length are captured, and the obtained front-back and left-right swing angles are arranged in time order to obtain the front-back swing angle sequence and the left-right swing angle sequence of the lifting appliance.
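As an illustration of how the centering and angle-readout loop described above could be realized, the following Python sketch is provided; the camera interface (grab, rotate, deflection_angles), the detector function detect_spreader and the pixel-to-angle gain are hypothetical stand-ins and are not part of the original disclosure.

PIX2DEG = 0.05   # assumed pixel-to-angle gain of the pan/tilt unit (hypothetical)
TOL = 5          # assumed centering tolerance in pixels

def center_and_read_swing_angle(camera, detect_spreader, max_iters=10):
    """Rotate the camera until the spreader sits at the image center, then
    return the camera deflection angles, from which the swing angles follow
    geometrically (using rope length and crown block position)."""
    for _ in range(max_iters):
        frame = camera.grab()                      # current spreader image
        cx, cy = detect_spreader(frame)            # spreader center given by the deep network
        h, w = frame.shape[:2]
        dx, dy = cx - w / 2.0, cy - h / 2.0        # displacement vector to the image center
        if abs(dx) < TOL and abs(dy) < TOL:
            break                                  # spreader is centered in the field of view
        camera.rotate(pan=-dx * PIX2DEG, tilt=-dy * PIX2DEG)
    return camera.deflection_angles()              # (left-right, up-down) deflection angles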
Since the analysis process of the front-back swing angle sequence and the left-right swing angle sequence of the spreader is the same in this embodiment, for convenience of description, the analysis process will be described below with the front-back swing angle sequence of the spreader;
because the hoist is under the effect of interference factors such as environment, whole handling process is not ideal simple pendulum motion, but it can be approximate to ideal simple pendulum motion in local time, and the whole motion process of hoist can be regarded as the stack of simple pendulum motion in local time, therefore, this embodiment utilizes the simple pendulum model to carry out the order fitting with the window size of setting for the pendulum angle sequence around the hoist to obtain the fore-and-aft movement characteristic sequence, and numbering its each element with the fitting order, the window size of this embodiment setting is 21, technical personnel in the art can adjust according to actual conditions, wherein, the simple pendulum model is:
θ = A·sin(ωt + φ₀)

where θ denotes the front-back swing angle of the spreader, A denotes the amplitude of the spreader's front-back motion, ω denotes the angular velocity of the front-back motion, φ₀ denotes the initial phase of the front-back motion, and t denotes the spreader motion time.
In this embodiment, during each fitting, the RANSAC algorithm (random sample consensus) is used to fit the front-back swing angles of the spreader contained in the window, yielding the amplitude, angular velocity and initial phase of the spreader's front-back motion, which together form the front-back motion characteristic (A, ω, φ₀) output as the fitting result. Meanwhile, in this embodiment, the number of inliers in each RANSAC fit, that is, the number of elements participating in fitting the front-back motion characteristic, needs to be counted, and the ratio of the number of inliers to the number of elements contained in the corresponding window is calculated to obtain the accuracy corresponding to that front-back motion characteristic.
In the present embodiment, the forward-backward motion characteristics include the amplitude, angular velocity, and initial phase of the forward-backward motion; the left-right motion characteristics include amplitude, angular velocity, and initial phase of the left-right motion.
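A minimal Python sketch of this sliding-window RANSAC fit of the simple pendulum model is given below; the inlier threshold, the number of iterations and the use of scipy.optimize.curve_fit are assumptions of the illustration rather than values prescribed by the embodiment.

import numpy as np
from scipy.optimize import curve_fit

def pendulum(t, A, w, phi):
    return A * np.sin(w * t + phi)

def ransac_pendulum_fit(t, theta, n_iters=100, min_pts=5, inlier_tol=0.02):
    """Fit theta = A*sin(w*t + phi) to one window of swing angles with RANSAC;
    returns (A, w, phi) and the accuracy = inlier count / window length."""
    best_params, best_inliers = None, 0
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        idx = rng.choice(len(t), size=min_pts, replace=False)
        try:
            params, _ = curve_fit(pendulum, t[idx], theta[idx],
                                  p0=[np.std(theta) * np.sqrt(2), 1.0, 0.0], maxfev=2000)
        except RuntimeError:
            continue
        inliers = np.abs(theta - pendulum(t, *params)) < inlier_tol
        if inliers.sum() > best_inliers:
            best_inliers, best_params = int(inliers.sum()), params
    return best_params, best_inliers / len(t)

def sliding_window_features(t, theta, win=21):
    """Slide the window over the swing-angle sequence and collect, for each
    position, the motion characteristic (A, w, phi) and its accuracy."""
    feats = []
    for s in range(len(t) - win + 1):
        params, acc = ransac_pendulum_fit(t[s:s + win], theta[s:s + win])
        if params is not None:
            feats.append((*params, acc))
    return feats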
S2, calculating the fluctuation degree: randomly sampling the front and back motion characteristic sequences for a plurality of times to obtain a plurality of subsequences, selecting any one of the subsequences, obtaining a Laplace matrix according to the local similarity of any two elements of the subsequence, filtering the subsequences by utilizing the characteristic vectors and energy state distribution of the subsequence to obtain low-frequency vectors, and obtaining the fluctuation degree according to the subsequences and the low-frequency vectors;
in the embodiment of the present invention, the forward and backward movement feature sequence is randomly sampled for a plurality of times to obtain a plurality of sub-sequences, and it should be noted that the number of each element in a sub-sequence maintains the number of the corresponding element in the forward and backward movement feature sequence; the number of elements of the subsequence is three-fourths of the number of elements of the forward and backward movement characteristic sequence, and a person skilled in the art can adjust the number of the elements according to specific conditions.
Selecting any one of the subsequences, the negative exponential power of the Euclidean distance between any two of its elements is calculated in turn to obtain the local similarities, and a sub-adjacency matrix is constructed from all the local similarities; the elements on the main diagonal of the sub-adjacency matrix are 0, and the elements of the sub-adjacency matrix smaller than one quarter of the mean of all elements are also set to 0. In addition, in this embodiment, the amplitude components of the elements of the subsequence form a row vector, giving the amplitude vector.

A degree matrix is constructed from the sub-adjacency matrix, the Laplacian matrix is obtained from the sub-adjacency matrix and the degree matrix, and eigenvalue decomposition is performed on the Laplacian matrix to obtain its eigenvalues and eigenvectors.

Taking the eigenvectors of the Laplacian matrix as the basis, a graph Fourier transform is performed on the amplitude vector to obtain the corresponding frequency domain vector.

The energy state distribution on the jth node is obtained from the frequency domain vector, and its calculation formula is: [formula image], where the quantities denote, respectively, the energy state distribution at the jth node, the jth element value of the eigenvector, and the jth element value of the frequency domain vector.

The frequency domain vector is filtered according to the energy state distribution and the eigenvectors to obtain the frequency domain filtering vector, specifically: [formula image], in which one quantity is the mean of all elements in the frequency domain vector and another is the energy threshold, given by [formula image], where the quantities involved are the mean of all energy state distributions, the mean of all elements after normalizing each element of the eigenvector, and a hyper-parameter.

An inverse graph Fourier transform is performed using the eigenvectors and the frequency domain filtering vector to obtain the low-frequency vector of the amplitude component, specifically: [formula image], where f represents the low-frequency vector and the remaining quantity represents the transpose of the eigenvector matrix.
Calculating the negative exponential power of the Euclidean distance between each element in the low-frequency vector and the corresponding position element in the amplitude vector to obtain the fluctuation degree of the subsequence, and sequentially obtaining the fluctuation degrees of other subsequences by using the method; in this embodiment, the smaller the fluctuation degree is, the smoother the motion characteristic change is, and the more accurate the obtained optimal prediction angle set is.
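The graph-spectral filtering and the fluctuation degree described above can be sketched in Python as follows; the exact energy threshold, the replacement of high-energy coefficients by the mean, and the use of eigenvalue-weighted energy are assumptions of this sketch, not the embodiment's precise formulas (which appear only as images in the original).

import numpy as np

def fluctuation_degree(features, energy_quantile=0.5):
    """Low-pass filter the amplitude component of one subsequence on its
    similarity graph and return the fluctuation degree. `features` is an
    (N, 3) array of (amplitude, angular velocity, initial phase)."""
    # local similarity and sub-adjacency matrix
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    W = np.exp(-d)
    np.fill_diagonal(W, 0.0)
    W[W < W.mean() / 4.0] = 0.0                 # suppress weak similarities
    # Laplacian matrix and its eigen-decomposition (graph Fourier basis)
    L = np.diag(W.sum(axis=1)) - W
    eigvals, U = np.linalg.eigh(L)
    a = features[:, 0]                          # amplitude vector
    F = U.T @ a                                 # graph Fourier transform
    energy = eigvals * F ** 2                   # assumed energy state per node
    thr = np.quantile(energy, energy_quantile)  # assumed energy threshold
    F_filt = np.where(energy <= thr, F, F.mean())
    f = U @ F_filt                              # inverse graph Fourier transform
    return float(np.exp(-np.linalg.norm(f - a)))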
S3, acquiring a candidate data set: obtaining a pre-and-post prediction angle sequence of each subsequence by using a least square method, and obtaining a candidate data set of each subsequence by using mean shift clustering based on the pre-and-post prediction angle sequence and the accuracy corresponding to each element of the pre-and-post prediction angle sequence;
in the actual hoisting process, due to the influence of various interference factors, the motion process of the lifting appliance is complex, the angle at the future moment is predicted by directly utilizing the collected front and back and left and right swing angles, the error is large, in addition, the front and back or left and right amplitudes of the lifting appliance are difficult to obtain directly, and large deviation exists, so that the embodiment of the invention obtains the amplitude by utilizing the swing angle in local time, the amplitude can reflect the swinging energy of the lifting appliance in local time, and the interference of the outside on the lifting appliance can be accurately reflected through the change of the amplitude, so that the accuracy and the reasonability of the optimal predicted angle are greatly improved, and the error is reduced;
selecting any one of the subsequences, fitting the amplitude of each element in the subsequences by using a least square method to obtain an amplitude-time curve of the subsequences, predicting an amplitude prediction value of the subsequences at a certain time in the future according to the amplitude-time curve, selecting any element in the subsequences, inputting the amplitude prediction value and the angular velocity and initial phase in the element into a single pendulum prediction model to obtain a front prediction angle and a rear prediction angle, sequentially inputting the angular velocity and initial phase of other elements in the subsequences and the amplitude prediction value into the single pendulum prediction model to obtain the front prediction angle and the rear prediction angle of the corresponding element, and forming the front prediction angle and the rear prediction angle of all the front prediction angles obtained by the subsequences into a front prediction angle sequence and a rear prediction angle sequence, wherein the single pendulum:
Figure 130474DEST_PATH_IMAGE023
in the formula (I), the compound is shown in the specification,
Figure 811991DEST_PATH_IMAGE024
which represents the angle of the prediction before and after,
Figure 547865DEST_PATH_IMAGE025
the predicted value of the amplitude is represented,
Figure 90973DEST_PATH_IMAGE004
representing the angular velocity of the spreader back and forth movement,
Figure 963114DEST_PATH_IMAGE005
indicating the initial phase of the fore and aft motion of the spreader,
Figure 815533DEST_PATH_IMAGE026
indicating a time in the future.
This embodiment uses the front-back prediction angle and the accuracy to generate two-dimensional feature points (θ̂_m, r_m), where θ̂_m is the front-back prediction angle of the m-th element of the subsequence at the future moment t′ and r_m is the accuracy corresponding to the m-th element of the subsequence. Mean shift clustering is performed on all two-dimensional feature points of the subsequence to obtain a clustering result, which is the candidate data set of the subsequence; the candidate data sets of the other subsequences are obtained in turn.
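By way of illustration, the amplitude extrapolation and the mean shift clustering of the two-dimensional feature points could be sketched as follows in Python; the polynomial degree of the amplitude-time fit, the default mean-shift bandwidth and the rule of keeping the densest cluster as the candidate data set are assumptions of this sketch.

import numpy as np
from sklearn.cluster import MeanShift

def candidate_set(times, features, accuracies, t_future, deg=2):
    """For one subsequence, predict the amplitude at a future moment by least
    squares, form the front-back prediction angles with the simple pendulum
    prediction model, and mean-shift the (angle, accuracy) points."""
    A, w, phi = features[:, 0], features[:, 1], features[:, 2]
    coeffs = np.polyfit(times, A, deg)                 # least-squares amplitude-time curve
    A_pred = np.polyval(coeffs, t_future)              # amplitude predicted value
    theta_pred = A_pred * np.sin(w * t_future + phi)   # front-back prediction angles
    points = np.column_stack([theta_pred, accuracies])
    labels = MeanShift().fit_predict(points)           # cluster the 2-D feature points
    dominant = np.bincount(labels).argmax()            # assumed rule: keep densest cluster
    return points[labels == dominant]                  # candidate data set of this subsequence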
S4, predicting the angle: obtaining the confidence feature of each candidate data set according to the aging characteristic and the fluctuation degree of the subsequence, obtaining the candidate similarity of any two candidate data sets according to the coincidence degree and the confidence features of the two candidate data sets, constructing an adjacency matrix according to the candidate similarities, performing graph cut and mean shift clustering on the adjacency matrix to obtain an optimal front-back prediction angle set, and similarly, obtaining an optimal left-right prediction angle set according to the left-right motion characteristic sequence;
The mean of the element numbers of each subsequence is calculated to obtain its number mean, and the ratio of the number mean of a given subsequence to the sum of the number means of all subsequences is calculated to obtain the aging characteristic of that subsequence, which represents its timeliness and thus improves the accuracy of the subsequent angle prediction; the confidence feature of each subsequence is then obtained according to its aging characteristic and fluctuation degree. In this embodiment, since the accuracies of the subsequences correspond one to one to those of the candidate data sets, the confidence feature of a candidate data set is the confidence feature of its corresponding subsequence, and the confidence feature of any subsequence is specifically:

[formula image]

in the formula, B represents the confidence feature, C represents the aging characteristic, V represents the fluctuation degree of the subsequence, D represents the number of elements of the subsequence, and n is the base of the logarithm.
Thus far, the present embodiment has obtained a confidence feature for each of the candidate data sets.
According to the coincidence degree and the confidence features of any two candidate data sets, the candidate similarity of the two candidate data sets is acquired, specifically:

Firstly, two candidate data sets x and y are arbitrarily selected, and the subsequences corresponding to the two candidate data sets are obtained at the same time; one element is selected from each of the two candidate data sets to obtain two elements p and q, and the element data corresponding to these two elements are extracted from their respective subsequences to form a feature point pair; feature point pairs of the two candidate data sets are acquired in turn until the number of feature point pairs reaches the preset point pair number.

Then, the Euclidean distance between the two feature points of each feature point pair is calculated and taken as the internal distance of that feature point pair. In this embodiment, the preset point pair number is the product of the lengths of the two subsequences; it should be noted that this embodiment may select elements of a candidate data set repeatedly, but the feature point pairs formed must be unique, a preliminary screening being performed first.

Then, all internal distances of the two candidate data sets form a difference set, and the members of the difference set for which the front-back prediction angle of element p in candidate data set x at the future moment equals the front-back prediction angle of element q in candidate data set y at the same future moment are selected to obtain a difference coincidence subset, which represents the difference of the motion characteristics corresponding to equal front-back prediction angles in the two candidate data sets.

The coincidence degree of the two candidate data sets is then calculated, namely: [formula image], where F represents the coincidence degree, g represents the intersection ratio of the two candidate data sets, and the remaining quantity represents the difference degree; in this embodiment, the difference degree is the negative exponential power of the ratio of the sum of all elements in the difference coincidence subset to the sum of all elements in the difference set.
Finally, the ratio of the coincidence degree of the two candidate data sets to the difference of their confidence features is calculated to obtain the candidate similarity.

The candidate similarities between all the candidate data sets are assembled into an adjacency matrix, with the elements on the main diagonal of the adjacency matrix set to 0. Ncut graph-cut processing is performed on the adjacency matrix to divide all the candidate data sets into several categories, the mean confidence feature of the candidate data sets contained in each category is calculated to obtain the evaluation value of each category, and the category with the largest evaluation value is selected as the optimal category. The elements of all the candidate data sets in the optimal category are combined and subjected to mean shift clustering to filter out outlier data, yielding the optimal front-back set; the front-back prediction angle components of all elements in the optimal front-back set are extracted to obtain the optimal front-back prediction angle set. Similarly, the optimal left-right prediction angle set is obtained according to the left-right motion characteristic sequence.
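A compact Python sketch of this selection step is given below; sklearn's SpectralClustering is used here as a stand-in for the Ncut graph cut, and the number of categories and the rule of keeping the densest mean-shift cluster are assumptions of the illustration.

import numpy as np
from sklearn.cluster import MeanShift, SpectralClustering

def optimal_prediction_angles(similarity, confidences, candidate_sets, n_categories=3):
    """Cut the candidate-similarity graph, keep the category with the largest
    mean confidence feature, and mean-shift its pooled elements to filter
    outliers; returns the optimal front-back prediction angle set."""
    S = similarity.copy()
    np.fill_diagonal(S, 0.0)
    labels = SpectralClustering(n_clusters=n_categories,
                                affinity="precomputed").fit_predict(S)
    # evaluation value of each category = mean confidence feature of its candidate sets
    scores = [confidences[labels == k].mean() for k in range(n_categories)]
    best = int(np.argmax(scores))
    pooled = np.vstack([candidate_sets[i] for i in np.where(labels == best)[0]])
    ms_labels = MeanShift().fit_predict(pooled)
    keep = ms_labels == np.bincount(ms_labels).argmax()   # filter outlier data
    return pooled[keep, 0]                                # front-back prediction angle components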
S5, tracking the lifting appliance: tracking the lifting appliance according to the optimal front-back and left-right prediction angle sets;
In this embodiment, a geometric relationship is used to obtain the up-down and left-right deflection angle sets of the high-definition camera from the optimal front-back and left-right prediction angle sets; the elements at the central points of the up-down and left-right deflection angle sets are extracted to obtain the optimal up-down and left-right deflection angles of the high-definition camera; the high-definition camera is rotated according to the optimal up-down and left-right deflection angles, and the bounding box of the lifting appliance is detected at the future moment t′. In the first prediction, this embodiment can first predict the moment following the current moment, i.e. t′ = K + 1; if the bounding box of the lifting appliance is not detected, the lifting appliance is occluded, in which case t′ = K + 2 is taken, and the above steps are repeated until the bounding box of the lifting appliance is detected.
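The occlusion-tolerant tracking loop described above can be illustrated with the short Python sketch below; the interfaces predict_deflection, detect_bbox and camera.rotate_to are hypothetical stand-ins for the geometric conversion, the spreader detector and the pan/tilt control.

def track_spreader(camera, detect_bbox, predict_deflection, k_now, max_ahead=5):
    """Rotate the camera to the deflection predicted for successive future
    moments until the spreader bounding box is detected again."""
    for step in range(1, max_ahead + 1):
        t_future = k_now + step                    # K + 1, then K + 2, ...
        pan, tilt = predict_deflection(t_future)   # optimal left-right / up-down deflection
        camera.rotate_to(pan=pan, tilt=tilt)
        bbox = detect_bbox(camera.grab())
        if bbox is not None:                       # spreader re-acquired
            return t_future, bbox
    return None, None                              # spreader still occluded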
The crane lifting appliance guiding and positioning method and system based on image sensing provided by the embodiment of the invention comprise extracting motion characteristics, calculating the fluctuation degree, acquiring candidate data sets, predicting the angle and tracking the lifting appliance, and solve the problems that existing lifting appliance detection methods cannot handle occlusion of the lifting appliance, ignore the influence of the hoisting operation environment, suffer from low positioning accuracy and cannot realize efficient automatic detection. The embodiment of the invention predicts the motion trend of the lifting appliance by using its motion characteristics within local time, provides reliable feedback data for the prediction of the lifting appliance angle, and greatly improves the positioning accuracy of the crane lifting appliance; it can guide the next descent of the lifting appliance without needing to consider swinging of the lifting appliance caused by interference factors, eliminating the positioning problems and potential safety hazards caused by swinging. Meanwhile, even if the lifting appliance is occluded, tracking and safety reminders can still be provided, reducing collision accidents involving workers and improving operation safety. In addition, the embodiment has the characteristics of high reliability, low cost, strong anti-interference capability, and fast and accurate positioning.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (10)

1. Image perception-based crane lifting appliance guiding and positioning method is characterized by comprising the following steps:
S1, based on the lifting appliance swing angle, performing sliding window fitting with a set window size by using a simple pendulum model and a random sample consensus (RANSAC) algorithm to obtain front-back and left-right motion characteristic sequences, and counting the local inliers participating in the fitting of each element to obtain the accuracy of each element;
S2, randomly sampling the front-back motion characteristic sequence a plurality of times to obtain a plurality of subsequences; for any one of the subsequences, obtaining a Laplacian matrix according to the local similarity of any two elements of the subsequence, filtering the subsequence by using the eigenvectors of the Laplacian matrix and the energy state distribution to obtain a low-frequency vector, and obtaining the fluctuation degree according to the subsequence and the low-frequency vector;
S3, obtaining a pre-and-post prediction angle sequence of each subsequence by using a least square method, and obtaining a candidate data set of each subsequence by using mean shift clustering based on the pre-and-post prediction angle sequence and the accuracy corresponding to each element of the pre-and-post prediction angle sequence;
S4, obtaining the confidence feature of each candidate data set according to the aging characteristic and the fluctuation degree of the subsequence, obtaining the candidate similarity of any two candidate data sets according to the coincidence degree and the confidence features of the two candidate data sets, constructing an adjacency matrix according to the candidate similarities, performing graph cut and mean shift clustering on the adjacency matrix to obtain an optimal front-back prediction angle set, and similarly, obtaining an optimal left-right prediction angle set according to the left-right motion characteristic sequence;
and S5, tracking the lifting appliance according to the optimal front-back and left-right prediction angle set.
2. The image perception-based crane spreader guiding and positioning method as claimed in claim 1, wherein: the forward and backward motion characteristics comprise the amplitude, the angular velocity and the initial phase of the forward and backward motion;
the left-right motion characteristics include amplitude, angular velocity, and initial phase of the left-right motion.
3. The image perception-based crane spreader guiding and positioning method according to claim 2, wherein any one of the subsequences is selected, and the laplacian matrix is obtained according to local similarity of any two elements of the subsequence, specifically:
selecting any one subsequence, obtaining local similarity according to the negative exponential power of the Euclidean distance of any two elements of the subsequence, and constructing a sub-adjacency matrix according to all the local similarities;
and acquiring a degree matrix according to the sub-adjacency matrix, and acquiring a Laplace matrix according to the sub-adjacency matrix and the degree matrix.
4. The image perception-based crane spreader guiding and positioning method according to claim 3, wherein the sub-sequence is filtered by using its feature vectors and energy state distribution to obtain low-frequency vectors, specifically:
taking the eigenvectors of the Laplacian matrix as a basis, and performing a graph Fourier transform on the amplitude component of each element in the subsequence to obtain a frequency domain vector;
acquiring energy state distribution of each node according to the frequency domain vector, and filtering the frequency domain vector according to the energy state distribution and the feature vector to obtain a frequency domain filtering vector;
and performing an inverse graph Fourier transform by using the eigenvectors and the frequency domain filtering vector to obtain a low-frequency vector.
5. The image perception-based crane sling guiding and positioning method according to claim 2, wherein the pre-and post-prediction angle sequence of each subsequence is obtained by using a least square method, specifically:
fitting the amplitude in each subsequence by using a least square method to obtain an amplitude predicted value of each subsequence at a future moment, and obtaining a front-rear prediction angle sequence by using a simple pendulum prediction model based on each amplitude predicted value and each element in the subsequence corresponding to the amplitude predicted value.
6. The image perception-based crane spreader guide positioning method of claim 4, wherein the confidence features are:
[formula image]
in the formula, B represents a confidence coefficient characteristic, C represents an aging characteristic, V represents the fluctuation degree of a subsequence, D represents the number of elements of the subsequence corresponding to the candidate data set, and n is the base number of a logarithm;
and the aging characteristic is the ratio of the number mean value in any subsequence to the sum of the number mean values of all the subsequences.
7. The image perception-based crane spreader guiding and positioning method according to claim 1, wherein the specific acquisition process of the coincidence degree of any two of the candidate data sets is as follows:
and randomly selecting two candidate data sets, respectively selecting an element from the two candidate data sets to obtain a feature point pair of a preset point pair number, obtaining a difference set and a difference coincidence subset according to the Euclidean distance of the two elements in each feature point pair, and calculating the coincidence degree according to the intersection ratio of the two candidate data sets, the difference set and the difference coincidence subset.
8. The image perception-based crane spreader guiding and positioning method according to claim 1, wherein the adjacency matrix is subjected to graph cut and mean shift clustering to obtain the optimal front-back prediction angle set, specifically:
performing graph cutting processing on the adjacency matrix to cut all the candidate data sets into a plurality of categories;
calculating the confidence characteristic mean value of the candidate data set contained in each category to obtain the evaluation value of each category;
performing mean shift clustering on all elements in the category with the maximum evaluation value to obtain an optimal front and back set;
and extracting the forward and backward prediction angle components of all elements in the optimal forward and backward set to obtain an optimal forward and backward prediction angle set.
9. The image perception-based crane spreader guiding and positioning method as claimed in claim 1, wherein: the accuracy is the ratio of the number of local inliers participating in the fitting to the number of elements in the window when the lifting appliance angle data is subjected to sliding window fitting sequentially according to the set window size.
10. Crane lifting appliance guiding and positioning system based on image perception, characterized in that: the system implements the image perception-based crane lifting appliance guiding and positioning method according to any one of claims 1 to 9.
CN202110115422.3A 2021-01-28 2021-01-28 Crane lifting appliance guiding and positioning method and system based on image sensing Active CN112465874B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110115422.3A CN112465874B (en) 2021-01-28 2021-01-28 Crane lifting appliance guiding and positioning method and system based on image sensing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110115422.3A CN112465874B (en) 2021-01-28 2021-01-28 Crane lifting appliance guiding and positioning method and system based on image sensing

Publications (2)

Publication Number Publication Date
CN112465874A true CN112465874A (en) 2021-03-09
CN112465874B CN112465874B (en) 2021-04-30

Family

ID=74802807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110115422.3A Active CN112465874B (en) 2021-01-28 2021-01-28 Crane lifting appliance guiding and positioning method and system based on image sensing

Country Status (1)

Country Link
CN (1) CN112465874B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011131991A (en) * 2009-12-24 2011-07-07 Kato Works Co Ltd Crane device
KR20150056494A (en) * 2013-11-15 2015-05-26 인포겟시스템 주식회사 method for auto calculating for examination of block lifting safety
CN105800464A (en) * 2016-04-29 2016-07-27 南开大学 Positioning method based on automatic lifting hook system
CN109052180A (en) * 2018-08-28 2018-12-21 北京航天自动控制研究所 A kind of container automatic aligning method and system based on machine vision
CN110790142A (en) * 2019-09-17 2020-02-14 中联重科股份有限公司 Crane amplitude deflection compensation method and system and crane
CN110996068A (en) * 2019-12-20 2020-04-10 上海振华重工(集团)股份有限公司 Automatic tracking system, equipment and method for lifting appliance
CN111517212A (en) * 2020-06-06 2020-08-11 武汉耀华桥梁工程技术有限公司 Lifting appliance capable of adjusting position of lifting point and inclination angle of lifted object and operation method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JINXIN CAO ET AL: "The integrated yard truck and yard crane scheduling problem: Benders’ decomposition-based methods", 《TRANSPORTATION RESEARCH PART E: LOGISTICS AND TRANSPORTATION REVIEW》 *
MOHAMMED ADELABDELMEGID ET AL: "GA optimization model for solving tower crane location problem in construction sites", 《ALEXANDRIA ENGINEERING JOURNAL》 *
万雷: "起重机吊重偏摆与定位系统的控制研究", 《中国优秀硕士学位论文全文数据库(电子期刊)工程科技Ⅱ辑》 *
漆静: "基于机器视觉集装箱吊具智能定位系统研究", 《中国优秀硕士学位论文全文数据库(电子期刊)信息科技辑》 *
漆静等: "机器视觉用于集装箱吊具定位系统设计", 《起重运输机械》 *

Also Published As

Publication number Publication date
CN112465874B (en) 2021-04-30

Similar Documents

Publication Publication Date Title
WO2018028103A1 (en) Unmanned aerial vehicle power line inspection method based on characteristics of human vision
CN113255481B (en) Crowd state detection method based on unmanned patrol car
CN110533722A (en) A kind of the robot fast relocation method and system of view-based access control model dictionary
CN107818326A (en) A kind of ship detection method and system based on scene multidimensional characteristic
CN101339658B (en) Aerial photography traffic video rapid robust registration method
CN109509210A (en) Barrier tracking and device
CN104121902B (en) Implementation method of indoor robot visual odometer based on Xtion camera
EP4137901A1 (en) Deep-learning-based real-time process monitoring system, and method therefor
CN109784204A (en) A kind of main carpopodium identification of stacking string class fruit for parallel robot and extracting method
CN108363953B (en) Pedestrian detection method and binocular monitoring equipment
CN111126116A (en) Unmanned ship river channel garbage identification method and system
CN113791400B (en) Stair parameter autonomous detection method based on laser radar
CN111598172B (en) Dynamic target grabbing gesture rapid detection method based on heterogeneous depth network fusion
CN109816051B (en) Hazardous chemical cargo feature point matching method and system
CN112597877A (en) Factory personnel abnormal behavior detection method based on deep learning
JP2017076289A (en) Parameter decision device, parameter decision method and program
CN112766145B (en) Method and device for identifying dynamic facial expressions of artificial neural network
CN114170188A (en) Target counting method and system for overlook image and storage medium
CN112465874B (en) Crane lifting appliance guiding and positioning method and system based on image sensing
Nenchoo et al. Real-Time 3D UAV pose estimation by visualization
Liu et al. Outdoor camera calibration method for a GPS & camera based surveillance system
Pan et al. Vision-based approach angle and height estimation for UAV landing
Frank et al. Stereo-vision for autonomous industrial inspection robots
CN112733584A (en) Intelligent alarm method and device for communication optical cable
Jie An aircraft image detection and tracking method based on improved optical flow method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant