CN115859212A - Method and system for autonomous deployment and recovery of marine equipment - Google Patents

Method and system for autonomous deployment and recovery of marine equipment

Info

Publication number: CN115859212A
Application number: CN202211461965.1A
Authority: CN (China)
Prior art keywords: information, marine, marine equipment, equipment, tracking
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN115859212B (en)
Inventors: 司徒伟伦, 杨文林, 王蕴婷, 曾维湘, 吴辉源
Current assignee: Guangdong Intelligent Unmanned System Research Institute
Original assignee: Guangdong Intelligent Unmanned System Research Institute
Application filed by Guangdong Intelligent Unmanned System Research Institute
Priority to CN202211461965.1A
Publication of CN115859212A
Application granted; publication of CN115859212B
Legal status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a method for autonomous deployment and recovery of marine equipment, which comprises the following steps: acquiring image feature point information of the marine equipment based on a preset tracking rule; fusing inertial measurement information of the marine equipment with the image feature point information to generate pose information of the marine equipment; fusing wireless carrier information of the marine equipment with the pose information to generate a predicted positioning trajectory of the marine equipment; and controlling an actuating mechanism to deploy and recover the marine equipment based on the predicted positioning trajectory. The method and the system provided by the invention can therefore solve the deployment and recovery problem under complex and severe sea conditions, and deploy and recover marine equipment in an automated, unmanned, reliable and efficient manner.

Description

Method and system for autonomous deployment and recovery of marine equipment
Technical Field
The invention relates to the technical field of marine equipment networks, and in particular to a method and a system for autonomous deployment and recovery of marine equipment.
Background
With the depletion of land resources and rising mining costs, attention has gradually turned to the underwater domain. The exploitation of underwater resources has become a new technical priority for many countries, but the underwater operating environment makes resource exploration and acquisition very difficult, and most operational tasks can only be performed by waterborne equipment. Conditions on the water are complex, however, and in marine operations in particular, equipment must be deployed and recovered regularly. The marine environment is changeable and unstable, making manual deployment and recovery dangerous and unreliable.
To guarantee the safety of marine workers and achieve reliable, stable recovery of marine equipment, autonomous deployment and recovery equipment needs to be installed on marine working vessels. At present, however, such equipment adapts poorly to complex sea conditions and its positioning accuracy is insufficient; optical sensors are mostly used for short-range positioning of marine equipment, and their robustness at sea is not high. In addition, multi-sensor spatial positioning suffers from data latency and a heavy computational load, the trajectories of the individual sensors close poorly in space, and the accumulated pose error is large.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method and a system for autonomous deployment and recovery of marine equipment that can solve the deployment and recovery problem under complex and severe sea conditions and deploy and recover marine equipment in an automated, unmanned, reliable and efficient manner.
To solve this technical problem, in a first aspect the invention discloses a method for autonomous deployment and recovery of marine equipment, comprising the following steps: acquiring image feature point information of the marine equipment based on a preset tracking rule; fusing inertial measurement information of the marine equipment with the image feature point information of the marine equipment to generate pose information of the marine equipment; fusing wireless carrier information of the marine equipment with the pose information of the marine equipment to generate a predicted positioning trajectory of the marine equipment; and controlling an actuating mechanism to deploy and recover the marine equipment based on the predicted positioning trajectory of the marine equipment.
In some embodiments, acquiring image feature point information of the marine equipment based on a preset tracking rule includes: acquiring first-frame position information in real-time image information of the marine equipment; tracking the first-frame position information with a Siamese region proposal network (SiamRPN) tracking algorithm and judging whether tracking is successful; and, if tracking is judged successful, acquiring image feature point information of the first-frame position information, acquiring next-frame position information following the first-frame position information, and looping the step of acquiring image feature point information of the marine equipment based on the preset tracking rule.
In some embodiments, after tracking the first-frame position information with the SiamRPN tracking algorithm, judging whether tracking is successful, and, if tracking is judged successful, saving the image feature point information of the first-frame position information as training sample data, the method further includes: if tracking fails, training the training sample data with a target detection algorithm to generate a detection model; inputting the next-frame position information for which tracking failed into the detection model to generate retrieved image information; and acquiring image feature point information of the retrieved image information.
In some embodiments, fusing the inertial measurement information of the marine equipment with the image feature point information of the marine equipment to generate pose information of the marine equipment includes: analyzing the image feature point information of the marine equipment and the inertial measurement information of the marine equipment to generate a Kalman filtering estimation model; and substituting the image feature point information and the inertial measurement information into the Kalman filtering estimation model, and predicting the state of the marine equipment through Kalman filtering to generate the pose information of the marine equipment.
In some embodiments, fusing the wireless carrier information of the marine equipment with the pose information of the marine equipment to generate the predicted positioning trajectory of the marine equipment includes: constructing an error state model from the wireless carrier information of the marine equipment; and substituting the wireless carrier information and the pose information of the marine equipment into the error state model, and generating the predicted positioning trajectory of the marine equipment through cubature Kalman filtering.
According to a second aspect of the present invention, there is provided a system for autonomous deployment and recovery of marine equipment, the system comprising: a binocular camera for acquiring images of the marine equipment; an image processing module for processing the images of the marine equipment based on a preset tracking rule to generate image feature point information of the marine equipment; an inertial measurement module for acquiring inertial measurement information of the marine equipment; a first fusion module for fusing the inertial measurement information with the image feature point information to generate pose information of the marine equipment; a wireless carrier module for acquiring wireless carrier information of the marine equipment; a second fusion module for fusing the wireless carrier information with the pose information to generate a predicted positioning trajectory of the marine equipment; and an actuating mechanism for deploying and recovering the marine equipment based on the predicted positioning trajectory.
In some embodiments, the image processing module comprises: a frame processing unit for acquiring first-frame position information in real-time image information of the marine equipment; a tracking unit for tracking the first-frame position information with the SiamRPN tracking algorithm and judging whether tracking is successful; and a feature point information acquisition unit for, when tracking is judged successful, acquiring image feature point information of the first-frame position information, acquiring next-frame position information following the first-frame position information, and looping the step of acquiring image feature point information of the marine equipment based on the preset tracking rule.
In some embodiments, the feature point information acquisition unit further saves the image feature point information of the first-frame position information as training sample data, and is further configured to: if tracking is judged to have failed, train the training sample data with a target detection algorithm to generate a detection model; input the next-frame position information for which tracking failed into the detection model to generate retrieved image information; and acquire image feature point information of the retrieved image information.
In some embodiments, the first fusion module comprises: a model generation unit for analyzing the image feature point information of the marine equipment and the inertial measurement information of the marine equipment to generate a Kalman filtering estimation model; and a first prediction unit for substituting the image feature point information and the inertial measurement information into the Kalman filtering estimation model and predicting the state of the marine equipment through Kalman filtering to generate the pose information of the marine equipment.
In some embodiments, the second fusion module comprises: a model construction unit for constructing an error state model from the wireless carrier information of the marine equipment; and a second prediction unit for substituting the wireless carrier information and the pose information of the marine equipment into the error state model and generating the predicted positioning trajectory of the marine equipment through cubature Kalman filtering.
Compared with the prior art, the invention has the beneficial effects that:
Implementation of the invention develops an autonomous deployment and recovery method for marine equipment based on multi-sensor data fusion, addressing the practical engineering requirement of short-range positioning under complex sea conditions. First, given the complexity of underwater communication and the severity of the marine environment, physically modeled data fusion is performed across a binocular camera, an inertial measurement module, a wireless carrier module and the like, realizing visualization of the motion pose of the marine equipment. Second, to address the actuation latency during deployment and recovery in a changeable and complex marine environment, an object positioning algorithm based on multi-sensor data fusion is developed that can predict the path of the marine equipment and achieve its autonomous deployment and recovery, with good positioning performance, tracking accuracy and strong robustness. The invention can therefore solve the deployment and recovery problem under complex and severe sea conditions and realize automated, unmanned, reliable and efficient deployment and recovery of marine equipment.
Drawings
FIG. 1 is a schematic flow chart of a method for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for acquiring image feature point information of marine equipment based on a preset tracking rule according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a system for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention;
FIG. 4 is a schematic view of the binocular camera mounting structure of the system for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the wireless carrier module of the system for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of the actuating mechanism of the system for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention.
Detailed Description
For better understanding and implementation, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules explicitly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
The embodiments of the invention disclose a method and a system for autonomous deployment and recovery of marine equipment, which develop an autonomous deployment and recovery method based on multi-sensor data fusion for the practical engineering requirement of short-range positioning under complex sea conditions. First, taking the complexity of underwater communication and the severity of the marine environment into account, the sensor suite of the marine equipment is analyzed and matched, and physically modeled data fusion is performed across a binocular camera, an inertial measurement module, a wireless carrier module and the like to realize visualization of the motion pose of the marine equipment. Second, an object positioning algorithm based on multi-sensor data fusion is developed for the actuation latency during deployment and recovery and for the changeable, complex marine environment; it can predict the path of the marine equipment and achieve its autonomous deployment and recovery, with good positioning performance, tracking accuracy and strong robustness. The invention can therefore solve the deployment and recovery problem under complex and severe sea conditions and realize automated, unmanned, reliable and efficient deployment and recovery of marine equipment.
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention. The method can be applied to various application systems for the monitoring, management, recovery and the like of marine equipment; the embodiments of the invention do not limit its application. As shown in fig. 1, the method for autonomous deployment and recovery of marine equipment may include the following operations:
101. Acquiring image feature point information of the marine equipment based on a preset tracking rule.
To position the marine equipment accurately, the marine equipment may be implemented as an autonomous underwater vehicle (AUV), a remotely operated vehicle (ROV) or the like. First, an image of the marine equipment is acquired with the binocular camera of the mother ship; the camera may be mounted at the end of the actuating mechanism, from where it can capture a real-time optical image of the marine equipment. In actual processing, the marine equipment to be deployed and recovered may be selected, manually or automatically, on the acquired real-time image and recorded.
Specifically, as shown in fig. 2, the flow of the method for acquiring image feature point information of marine equipment based on a preset tracking rule is as follows. First-frame position information is acquired from the real-time image information of the marine equipment; this first-frame position information provides the two-dimensional position of the marine equipment at the current moment. The first-frame position information is then tracked with a Siamese region proposal network tracking algorithm, i.e. the SiamRPN tracking algorithm, and next-frame position information is acquired, which provides the two-dimensional position of the marine equipment at the next moment. Whether tracking is successful is judged by setting an empirical threshold; it may also be judged, automatically or manually, from whether the signal of the marine equipment has been lost.
If tracking is judged successful, image feature point information is extracted directly from the acquired first-frame position information by image processing, and the next-frame position information then becomes the first-frame position information, so that this step can be looped to achieve long-term tracking and locking of the marine equipment. In addition, the image feature point information of the first-frame position information is saved as training sample data.
If tracking fails, the training sample data are trained with a target detection algorithm to generate a detection model. This embodiment adopts the YOLOv5 detection algorithm; the detection model is designed so that YOLOv5 detection can be performed according to the model characteristics and signal response characteristics of the marine equipment, and an intelligent detection model can be obtained by using the first-frame position information from multiple successful positioning and tracking runs as training samples. The next-frame position information for which tracking failed can then be input into the detection model to generate retrieved image information, and the image feature point information of the retrieved image information is acquired, thereby re-acquiring marine equipment lost during short-term tracking.
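The track-or-redetect loop described above can be sketched as follows. The `tracker` and `detector` callables are hypothetical stand-ins for the SiamRPN tracker and the YOLOv5-style detector of this embodiment (neither interface comes from the patent); each takes a frame and returns a `(box, score)` pair or `None`:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

Box = Tuple[int, int, int, int]

@dataclass
class TrackRedetectLoop:
    tracker: Callable[[object], Optional[Tuple[Box, float]]]
    detector: Callable[[object], Optional[Tuple[Box, float]]]
    score_threshold: float = 0.5                 # illustrative empirical threshold
    samples: List[Box] = field(default_factory=list)  # successful boxes kept as training data

    def step(self, frame) -> Optional[Box]:
        result = self.tracker(frame)
        if result is not None and result[1] >= self.score_threshold:
            box, _ = result
            self.samples.append(box)             # saved as training sample data
            return box                           # feature points would be extracted from this box
        # tracking failed: fall back to the detector (re)trained on the saved samples
        result = self.detector(frame)
        return result[0] if result is not None else None
```

On each frame the loop first trusts the tracker; only when its score drops below the threshold (or it returns nothing) does the retrained detector take over to re-acquire the target.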
As a preferred embodiment, the accuracy of detecting the marine equipment is further ensured by comparing the retrieved image with the training sample data through the normalized cross-correlation (NCC) detection algorithm. The correlation detection formula (1) is as follows:

$$\mathrm{NCC}=\frac{\sum_{(x,y)\in w_p}\bigl(I_1(x,y)-I'_1(p_x,p_y)\bigr)\bigl(I_2(x,y)-I'_2(p_x,p_y)\bigr)}{\sqrt{\sum_{(x,y)\in w_p}\bigl(I_1(x,y)-I'_1(p_x,p_y)\bigr)^2\,\sum_{(x,y)\in w_p}\bigl(I_2(x,y)-I'_2(p_x,p_y)\bigr)^2}}\qquad(1)$$

where p is the mark of the target marine equipment, i.e. the retrieved image; w_p is the window area; I_1(x, y) is the pixel value of the history tracking image (i.e. the training sample data) and I'_1(p_x, p_y) is its window pixel mean; I_2(x, y) is the pixel value of the target image and I'_2(p_x, p_y) is its window pixel mean. NCC = −1 means the two matched windows are completely anti-correlated; conversely, NCC = 1 means the two matched windows are fully correlated. Performing correlation detection on the retrieved image therefore further improves the robustness of marine equipment tracking. This solves the problem of low robustness when the appearance of the marine equipment is unusual or changes abruptly during detection, and enables long-term tracking of the marine equipment.
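Formula (1) can be checked with a few lines of NumPy; `ncc` below is a sketch (not code from the patent) that computes the normalized cross-correlation of two equally sized windows:

```python
import numpy as np

def ncc(window1: np.ndarray, window2: np.ndarray) -> float:
    """Normalized cross-correlation of formula (1): both windows are mean-centred,
    multiplied elementwise, and normalized by the product of their energies,
    so the result lies in [-1, 1]."""
    d1 = window1.astype(float) - window1.mean()
    d2 = window2.astype(float) - window2.mean()
    denom = np.sqrt((d1 * d1).sum() * (d2 * d2).sum())
    return float((d1 * d2).sum() / denom)
```

Identical windows give NCC = 1 and a window matched against its negative gives NCC = −1; a score near the upper end of the range confirms that the retrieved image matches the stored training samples.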
102. Fusing the inertial measurement information of the marine equipment with the image feature point information of the marine equipment to generate pose information of the marine equipment.
Further, inertial measurement information of the marine equipment is acquired; the current inertial measurement information can be read from the inertial measurement sensor of the marine equipment and includes gravity, biases, parameters related to the travel trajectory, and the like. Using the acquired inertial measurement information together with the acquired image feature point information of the marine equipment, an estimated position of the marine equipment can be obtained through Kalman filtering, and the latest pose information of the marine equipment is published, for example as an odometry message. Specifically, the image feature point information and the inertial measurement information of the marine equipment can be analyzed to generate a Kalman filtering estimation model, which is used to predict the state of the marine equipment and generate its pose information; the pose information can be used to feed back positioning data in the captured image of the marine equipment.
In addition, in practical application, the inertial measurement information of the marine equipment can be attributed to the camera frame, because the inertial measurement unit and the camera are rigidly bound together, and acquiring the trajectory of the camera is more convenient and accurate than measuring the layout of the marine equipment directly.
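A minimal sketch of the fusion in step 102, assuming a linear constant-velocity model: the IMU acceleration drives the Kalman prediction and the image feature-point position corrects it. The noise levels `q` and `r` are illustrative assumptions, not values from the patent:

```python
import numpy as np

def kf_fuse(x, P, accel, z_cam, dt, q=1e-3, r=1e-2):
    """One predict/update cycle of a Kalman estimator: state x = [px, py, vx, vy],
    IMU acceleration as control input, camera position fix as measurement."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    B = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    # predict with the IMU acceleration
    x = F @ x + B @ np.asarray(accel, float)
    P = F @ P @ F.T + q * np.eye(4)
    # update with the camera position fix
    S = H @ P @ H.T + r * np.eye(2)
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.asarray(z_cam, float) - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Called once per frame, the returned state carries the fused position and velocity that would be published as the pose (odometry) of the marine equipment.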
103. Fusing the wireless carrier information of the marine equipment with the pose information of the marine equipment to generate the predicted positioning trajectory of the marine equipment.
Under severe marine conditions that are hostile to vision, when vision alone cannot accurately position the target, an error state model can be constructed using the wireless carrier information of the marine equipment and the pose information of the marine equipment; the pose information and the wireless carrier information are then substituted into the error state model, and the predicted positioning trajectory of the marine equipment is generated through cubature Kalman filtering. The specific steps are as follows:
First, a pulse signal transmitted by the tag to be located is received through the UWB wireless carrier base stations of the marine working vessel; the UWB wireless carrier system comprises two base stations arranged at different positions. The pulse signal is analyzed to obtain its angle of arrival through formula (2):

$$\sin\alpha=\frac{c\,\Delta t}{l}\qquad(2)$$

where α is the angle of arrival of the pulse signal, l is the distance between the two wireless carrier antennas, Δt is the difference between the arrival times of the pulse signal at the different antennas, and c is the propagation speed of the signal, i.e. the speed of light.
Then, the position of the tag to be located is obtained through formula (3):

$$\tan\beta_1=\frac{y-y_A}{x-x_A},\qquad \tan\beta_2=\frac{y-y_B}{x-x_B}\qquad(3)$$

where the coordinates of the wireless carrier base stations are (x_A, y_A) and (x_B, y_B), the position of the tag to be located is (x, y), and the pulse-signal arrival angles at the two base stations are β_1 and β_2; solving the two equations simultaneously yields (x, y).
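Formulas (2) and (3) can be sketched as follows; angles here are measured from the x-axis, and the function names are illustrative:

```python
import math

C = 299_792_458.0  # signal propagation speed, i.e. the speed of light (m/s)

def arrival_angle(delta_t: float, l: float) -> float:
    """Angle of arrival alpha from formula (2): sin(alpha) = c * delta_t / l."""
    return math.asin(C * delta_t / l)

def locate_tag(xa: float, ya: float, beta1: float,
               xb: float, yb: float, beta2: float) -> tuple:
    """Intersect the two bearing lines of formula (3):
    tan(beta1) = (y - ya)/(x - xa) and tan(beta2) = (y - yb)/(x - xb)."""
    t1, t2 = math.tan(beta1), math.tan(beta2)
    x = (yb - ya + t1 * xa - t2 * xb) / (t1 - t2)
    y = ya + t1 * (x - xa)
    return x, y
```

Each base station converts its antenna-pair time difference into a bearing with `arrival_angle`; `locate_tag` then triangulates the tag from the two bearings.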
The wireless carrier information of the marine equipment, i.e. its positioning information, is acquired in real time by the UWB wireless carrier base stations. Error-state cubature Kalman filtering is used to align the timestamp of the observation obtained by the UWB base stations at time k with that of the pose information of the marine equipment; the position coordinate at this moment is (x_k, y_k) and the velocity is (v_{x,k}, v_{y,k}). The state vector is then given by formula (4):

$$X_k = X'_k + \delta x_k,\qquad X_k=[x_k,\ y_k,\ v_{x,k},\ v_{y,k}]^{\mathsf T}\qquad(4)$$

where X'_k is the predicted value and δx_k is the error value.
The error state model of the system is constructed as formulas (5), (6), (7) and (8):

$$x_k=x_{k-1}+v_{x,k-1}\,t+\tfrac{1}{2}a_{x,k-1}\,t^2\qquad(5)$$
$$y_k=y_{k-1}+v_{y,k-1}\,t+\tfrac{1}{2}a_{y,k-1}\,t^2\qquad(6)$$
$$v_{x,k}=v_{x,k-1}+a_{x,k-1}\,t\qquad(7)$$
$$v_{y,k}=v_{y,k-1}+a_{y,k-1}\,t\qquad(8)$$

where t is the sampling interval of the system, and a_{x,k-1} and a_{y,k-1} represent the acceleration of the combined system in the x-direction and the y-direction at time k−1.
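Equations (5)–(8) are plain constant-acceleration kinematics and can be sketched directly:

```python
def propagate(x, y, vx, vy, ax, ay, t):
    """One step of the error-state model, equations (5)-(8): position is advanced
    by velocity plus the half-acceleration term, velocity by acceleration."""
    return (x + vx * t + 0.5 * ax * t * t,
            y + vy * t + 0.5 * ay * t * t,
            vx + ax * t,
            vy + ay * t)
```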
The positioning data, i.e. the received position information of the marine equipment, is used as the state-vector predicted value X'_k, and the error state is then updated. The predictions of the error state and covariance at time k are shown in equations (9) and (10):

$$\delta X_{k|k-1}=F\,\delta X_{k-1}+W_{k-1}\qquad(9)$$
$$M_{k|k-1}=F\,M_{k-1}F^{\mathsf T}+Q_{k-1}\qquad(10)$$

where F is the state matrix of the system, W_{k−1} is the noise vector of the system, M_{k−1} is the optimal estimate of the error-state covariance at time k−1, and Q_{k−1} is the noise covariance at time k−1. F can be defined as in formula (11), with W_{k−1} a zero-mean Gaussian noise vector of covariance Q_{k−1}:

$$F=\begin{bmatrix}1&0&t&0\\0&1&0&t\\0&0&1&0\\0&0&0&1\end{bmatrix}\qquad(11)$$
The difference between the positioning information obtained from the UWB wireless carrier base stations and the position information of the marine equipment, i.e. the positioning data, is used as the observation vector Y_k of the system, as in formula (12):

$$Y_k=H\,\delta X_k+V_k\qquad(12)$$

where V_k is the observation noise; Y_k and H are given by formula (13):

$$Y_k=\begin{bmatrix}x_k^{\mathrm{UWB}}-x_k\\ y_k^{\mathrm{UWB}}-y_k\end{bmatrix},\qquad H=\begin{bmatrix}1&0&0&0\\0&1&0&0\end{bmatrix}\qquad(13)$$

where (x_k^{UWB}, y_k^{UWB}) is the position observed by the UWB base stations at time k.
On this basis, the cubature Kalman filtering step is divided into three parts: error-state update, measurement update, and state estimation. The specific steps are as follows. First, the error-state update part comprises calculating the error-state prediction δx'_{k+1|k} and the error-covariance prediction P_{k+1|k}. The specific equations are (14), (15), (16) and (17):

$$P_k=S_kS_k^{\mathsf T},\qquad \chi_{i,k}=S_k\,\xi_i+\delta x'_k\qquad(14)$$
$$\chi^*_{i,k+1|k}=f(\chi_{i,k},\,u_k)\qquad(15)$$
$$\delta x'_{k+1|k}=\frac{1}{2n}\sum_{i=1}^{2n}\chi^*_{i,k+1|k}\qquad(16)$$
$$P_{k+1|k}=\frac{1}{2n}\sum_{i=1}^{2n}\chi^*_{i,k+1|k}\,\chi^{*\mathsf T}_{i,k+1|k}-\delta x'_{k+1|k}\,\delta x'^{\mathsf T}_{k+1|k}+Q_k\qquad(17)$$

In the above, S_k is obtained from the spherical-radial (cubature) decomposition of P_k, ξ_i (i = 1, …, 2n) is the cubature point set defined by the spherical integral rule ξ_i = √n[1]_i, u_k is the system input, and n is the dimension of the state vector.
Then, the measurement update part comprises calculating the measurement prediction y'_{k+1|k}, the measurement-error covariance P_{yy,k+1|k} and the cross covariance P_{xy,k+1|k}. The specific equations are (18), (19), (20), (21) and (22):

$$P_{k+1|k}=S_{k+1|k}S_{k+1|k}^{\mathsf T},\qquad \chi_{i,k+1|k}=S_{k+1|k}\,\xi_i+\delta x'_{k+1|k}\qquad(18)$$
$$Z_{i,k+1|k}=h(\chi_{i,k+1|k})\qquad(19)$$
$$y'_{k+1|k}=\frac{1}{2n}\sum_{i=1}^{2n}Z_{i,k+1|k}\qquad(20)$$
$$P_{yy,k+1|k}=\frac{1}{2n}\sum_{i=1}^{2n}Z_{i,k+1|k}Z_{i,k+1|k}^{\mathsf T}-y'_{k+1|k}\,y'^{\mathsf T}_{k+1|k}+R_{k+1}\qquad(21)$$
$$P_{xy,k+1|k}=\frac{1}{2n}\sum_{i=1}^{2n}\chi_{i,k+1|k}Z_{i,k+1|k}^{\mathsf T}-\delta x'_{k+1|k}\,y'^{\mathsf T}_{k+1|k}\qquad(22)$$

where R_{k+1} is the observation-noise covariance.
Finally, the state estimation part comprises calculating the Kalman gain K_{k+1}, the error-state estimate δx'_{k+1} and the error-covariance estimate P_{k+1}. The specific equations are (23), (24) and (25):

$$K_{k+1}=P_{xy,k+1|k}\,P_{yy,k+1|k}^{-1}\qquad(23)$$
$$\delta x'_{k+1}=\delta x'_{k+1|k}+K_{k+1}\bigl(y_{k+1}-y'_{k+1|k}\bigr)\qquad(24)$$
$$P_{k+1}=P_{k+1|k}-K_{k+1}P_{yy,k+1|k}K_{k+1}^{\mathsf T}\qquad(25)$$
The optimal estimate of the true state at time k+1 can then be obtained from the error-state estimate δx'_{k+1} as X_{k+1} = X'_{k+1} + δx'_{k+1}. When the target marine equipment is occluded or its motion changes abruptly at the current moment, its state can still be estimated from the result inferred at the previous moment, which improves the robustness of the overall positioning. In addition, through the data fusion of the above two steps, the spatial trajectory-closing performance of each sensor of the marine equipment can be improved.
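The three-part cycle above can be sketched compactly. To stay self-contained this sketch restricts the model to the linear case f(x) = Fx, h(x) = Hx (all matrices supplied by the caller); it is an illustration of the cubature rule, not the patent's implementation:

```python
import numpy as np

def ckf_step(dx, P, F, Q, H, R, y):
    """One cubature-Kalman cycle over the error state, following equations
    (14)-(25): cubature points xi_i = ±sqrt(n)*e_i are drawn from the
    spherical-radial rule, propagated, and used for the measurement update."""
    n = dx.size
    S = np.linalg.cholesky(P)                             # P = S S^T   (14)
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # cubature point set
    pts = S @ xi + dx[:, None]
    # error-state (time) update, (15)-(17)
    pts = F @ pts
    dx_pred = pts.mean(axis=1)
    P_pred = pts @ pts.T / (2 * n) - np.outer(dx_pred, dx_pred) + Q
    # redraw points and perform the measurement update, (18)-(22)
    S = np.linalg.cholesky(P_pred)
    pts = S @ xi + dx_pred[:, None]
    zs = H @ pts
    y_pred = zs.mean(axis=1)
    Pyy = zs @ zs.T / (2 * n) - np.outer(y_pred, y_pred) + R
    Pxy = pts @ zs.T / (2 * n) - np.outer(dx_pred, y_pred)
    # state estimation, (23)-(25)
    K = Pxy @ np.linalg.inv(Pyy)
    dx_new = dx_pred + K @ (y - y_pred)
    P_new = P_pred - K @ Pyy @ K.T
    return dx_new, P_new
```

The corrected error state `dx_new` is then added to the predicted state to obtain the optimal estimate, exactly as in X_{k+1} = X'_{k+1} + δx'_{k+1}.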
104. Controlling the actuating mechanism to deploy and recover the marine equipment based on the predicted positioning trajectory of the marine equipment.
After the predicted positioning trajectory of the marine equipment is obtained, the two groups of data can be stored as a control data group readable by the actuating mechanism. The actuating mechanism can be implemented as a robotic arm: the control data group is resolved through the inverse kinematics of the arm's motion control to obtain the control torque and rotation angle of each joint, and control of the actuating mechanism can then be realized through a PID algorithm.
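As an illustration of this last step, the sketch below pairs a closed-form inverse-kinematics solution for a planar two-link arm with a textbook PID loop; the link lengths, gains and names are illustrative assumptions, not details from the patent:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm: returns the
    shoulder and elbow angles that place the end effector at (x, y)."""
    d = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(d)                          # elbow angle
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

class PID:
    """Textbook PID loop that drives each joint toward its IK target angle."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev) / self.dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In use, the predicted trajectory point is converted to joint angles with `two_link_ik`, and one `PID` instance per joint turns the angle error into a command at each control tick.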
Referring to fig. 3, fig. 3 is a schematic view of a system for autonomous deployment and recovery of marine equipment according to an embodiment of the present invention. As shown in fig. 3, the system for autonomous deployment and recovery of marine equipment comprises:
the binocular camera 1, the image processing module 2, the inertia measurement module 3, the first fusion module 4, the wireless carrier module 5, the second fusion module 6 and the actuating mechanism 7.
The binocular camera 1 is used for acquiring images of the marine equipment. In practical application, the binocular camera has the structure shown in fig. 4, comprising a left-eye camera 11, a right-eye camera 12 and an illumination system 13, and is mounted at the end of the actuating mechanism 7. The binocular camera 1 can acquire optical images with suitable brightness, uniform lighting and a prominent target.
The image processing module 2 is used for processing the images of the marine equipment based on the preset tracking rule to generate image feature point information of the marine equipment. The image processing module 2 comprises a frame processing unit 21, a tracking unit 22 and a feature point information acquisition unit 23. The frame processing unit 21 acquires first-frame position information in the real-time image information of the marine equipment. The tracking unit 22 tracks the first-frame position information with the SiamRPN tracking algorithm and judges whether tracking is successful. The feature point information acquisition unit 23, when tracking is judged successful, acquires the image feature point information of the first-frame position information, acquires the next-frame position information following the first-frame position information, and loops the step of acquiring image feature point information of the marine equipment based on the preset tracking rule. Specifically, the frame processing unit 21 acquires first-frame position information in the real-time image information of the marine equipment, which provides the two-dimensional position of the marine equipment at the current moment; the tracking unit 22 then tracks the first-frame position information with the Siamese region proposal network tracking algorithm, i.e. the SiamRPN tracking algorithm, acquires next-frame position information, which provides the two-dimensional position at the next moment, and judges whether tracking is successful by setting an empirical threshold. This judgment can also be made, automatically or manually, from whether the signal of the marine equipment has been lost.
If tracking is judged successful, the next frame position information is obtained directly through image processing and its image feature point information is extracted; the next frame position information then becomes the new first frame position information, so this step can be cycled to achieve long-term tracking and locking of the marine equipment. In addition, the image feature point information of the first frame position information is saved as training sample data.
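The track-then-recycle loop described above can be sketched as follows. This is an illustrative outline only: the tracker interface (`tracker.track`), the threshold value, and the callback names are assumptions, not details from the patent, and `tracker` stands in for a SiamRPN-style tracker.

```python
# Hypothetical sketch of the frame-to-frame tracking loop: each successful
# track yields feature points, the new box becomes the next "first frame",
# and a failure falls back to the re-detection path.

SCORE_THRESHOLD = 0.9   # empirical confidence threshold (assumed value)

def tracking_loop(frames, tracker, first_box, on_features, on_lost):
    """Track the device box frame by frame; successes are also saved as
    training samples for the re-detection model."""
    box = first_box
    samples = []                        # (frame, box) pairs kept for training
    for frame in frames:
        box, score = tracker.track(frame, box)
        if score < SCORE_THRESHOLD:     # tracking failed -> try re-detection
            box = on_lost(frame, samples)
            if box is None:
                continue                # target not retrieved this frame
        samples.append((frame, box))
        on_features(frame, box)         # extract image feature points
    return samples
```

The key design point from the text is that the loop never terminates on a single failure; it hands control to the detector and resumes tracking from the retrieved box.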
Further, the feature point information acquisition unit 23 also saves the image feature point information of the first frame position information as training sample data, and is further configured to: if tracking is judged to have failed, train a detection model on the training sample data using a target detection algorithm, input the next frame position information of the failed track into the detection model to generate retrieved image information, and acquire the image feature point information of the retrieved image information. Specifically: if tracking fails, a detection model is trained on the training sample data using a target detection algorithm; this embodiment adopts the YOLOv5 detection algorithm. The detection model is designed so that YOLOv5 detection can be performed according to the model characteristics and signal response characteristics of the marine equipment, and an intelligent detection model can be obtained by using the successfully positioned and tracked first frame position information as training samples. The next frame position information of the failed track is then input into the detection model to generate the retrieved image information, and the image feature point information of the retrieved image information is acquired, thereby recovering the position of marine equipment briefly lost by the tracker.
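The re-detection fallback can be outlined as below. The detector interface and the confidence threshold are assumptions for illustration; in the described system the `detect` callable would be a YOLOv5 model fine-tuned beforehand on the saved tracking samples.

```python
# Sketch of the re-detection step run on the frame where tracking was lost.
# `detect` is a hypothetical trained detector returning (box, confidence)
# pairs; the best plausible detection becomes the retrieved position.

def retrieve_target(frame, detect, min_confidence=0.5):
    """Return the highest-confidence detection box, or None if nothing
    plausible was found in the frame."""
    detections = detect(frame)                    # [(box, confidence), ...]
    plausible = [(b, c) for b, c in detections if c >= min_confidence]
    if not plausible:
        return None
    return max(plausible, key=lambda bc: bc[1])[0]
```

The retrieved box would then be verified against the training samples with the NCC correlation check described next, before tracking resumes from it.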
As a preferred embodiment, the accuracy of detecting the marine equipment is ensured by comparing the retrieved image with the training sample data through the NCC (normalized cross-correlation) detection algorithm. The correlation detection formula (1) is as follows:

$$\mathrm{NCC}(p, I_1, I_2) = \frac{\sum_{(x,y)\in W_p} \bigl(I_1(x,y) - \bar{I}_1(p_x,p_y)\bigr)\bigl(I_2(x,y) - \bar{I}_2(p_x,p_y)\bigr)}{\sqrt{\sum_{(x,y)\in W_p} \bigl(I_1(x,y) - \bar{I}_1(p_x,p_y)\bigr)^2 \sum_{(x,y)\in W_p} \bigl(I_2(x,y) - \bar{I}_2(p_x,p_y)\bigr)^2}} \tag{1}$$

where p marks the target marine equipment, i.e. the retrieved image; W_p is the window area; I_1(x, y) is a pixel value of the history tracking image (i.e. the training sample data) and \(\bar{I}_1(p_x, p_y)\) is its window pixel mean; I_2(x, y) is a pixel value of the target image and \(\bar{I}_2(p_x, p_y)\) is the matching window pixel mean of the target image. NCC = −1 means the two matched windows are completely anti-correlated, while NCC = 1 means they are perfectly correlated. Performing correlation detection on the retrieved image therefore further improves the robustness of marine equipment tracking. This approach addresses the low robustness otherwise seen when the appearance of the marine equipment is unusual or changes abruptly during detection, and enables long-term tracking of the marine equipment.
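Formula (1) can be implemented directly over two equally sized grayscale windows; a minimal sketch (the threshold used to accept a match would be an empirical choice, not a value given in the patent):

```python
import numpy as np

def ncc(window1: np.ndarray, window2: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized image windows.

    Returns a value in [-1, 1]: 1 for perfectly correlated windows,
    -1 for perfectly anti-correlated ones.
    """
    # subtract each window's mean, as in formula (1)
    a = window1.astype(np.float64) - window1.mean()
    b = window2.astype(np.float64) - window2.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    if denom == 0.0:        # a constant window makes the correlation undefined
        return 0.0
    return float((a * b).sum() / denom)
```

In use, the retrieved window would be compared against saved training-sample windows and accepted only above some threshold (e.g. 0.8); note NCC is invariant to affine brightness changes, which is what makes it robust to lighting variation at sea.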
The inertial measurement module 3 is used for acquiring inertial measurement information of the marine equipment. The module can be mounted on the marine equipment to acquire its basic motion and attitude information, i.e. the inertial measurement information, through a pulse communication signal of the marine equipment.
The first fusion module 4 is used for performing data fusion on the inertial measurement information of the marine equipment and the image feature point information of the marine equipment to generate pose information of the marine equipment. Specifically, the first fusion module comprises: a Kalman filtering estimation model 41, generated by performing Kalman filtering on the image feature point information and the inertial measurement information of the marine equipment; and a first prediction unit 42, which substitutes the image feature point information and the inertial measurement information of the marine equipment into the Kalman filtering estimation model and predicts the state of the marine equipment through Kalman filtering to generate its pose information. Specifically, the current inertial measurement information of the marine equipment, including its gravity, biases and parameters related to its travel track, may be read from an inertial measurement sensor of the marine equipment. Using the acquired inertial measurement information together with the acquired image feature point information, an estimated position of the marine equipment can be obtained through Kalman filtering, and the latest pose information of the marine equipment is published, for example by publishing odometry.
Specifically, the image feature point information of the marine equipment and the inertial measurement information of the marine equipment can be analyzed to generate a Kalman filtering estimation model, which is then used to predict the state of the marine equipment and generate its pose information; the pose information can be used to feed back the positioning data in the captured image of the marine equipment.
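The fusion step can be illustrated with a minimal linear Kalman filter. This is a 1-D constant-velocity sketch under assumed noise parameters, not the patent's full 6-DOF visual/inertial model: the IMU acceleration drives the prediction step and the image-feature position drives the update step.

```python
import numpy as np

# Minimal 1-D Kalman filter sketch of the visual/inertial fusion:
# predict from IMU acceleration, correct with the image-derived position.

class KalmanFilter1D:
    def __init__(self, q: float = 1e-3, r: float = 1e-2):
        self.x = np.zeros(2)                 # state: [position, velocity]
        self.P = np.eye(2)                   # state covariance
        self.Q = q * np.eye(2)               # process noise (IMU uncertainty)
        self.R = np.array([[r]])             # measurement noise (image feature)

    def predict(self, accel: float, dt: float) -> None:
        F = np.array([[1.0, dt], [0.0, 1.0]])
        self.x = F @ self.x + np.array([0.5 * accel * dt ** 2, accel * dt])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, measured_pos: np.ndarray) -> None:
        H = np.array([[1.0, 0.0]])           # we only observe position
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + (K @ (measured_pos - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P
```

The real system would carry a full pose state (position, orientation, velocity, IMU biases) and publish the corrected estimate as odometry, but the predict/update structure is the same.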
The wireless carrier module 5 is used for acquiring wireless carrier information of the marine equipment. As shown in fig. 5, there may be two wireless carrier modules 5, disposed on the two sides of the mother ship (or alternatively on the marine equipment), shown as 51 and 52 in fig. 5.
The second fusion module 6 is used for performing data fusion on the wireless carrier information of the marine equipment and the pose information of the marine equipment to generate a predicted value of the positioning track of the marine equipment. Specifically, the second fusion module comprises: an error state model 61, constructed from the wireless carrier information of the marine equipment, from which an optimal estimated track model is generated through a cubature Kalman filtering estimation model; and a second prediction unit 62, which substitutes the pose information and the wireless carrier information of the marine equipment into the error state model and generates the predicted value of the positioning track of the marine equipment through cubature Kalman filtering. For the specific implementation of the second fusion module 6, please refer to the implementation of step 103, which is not repeated here.
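The cubature Kalman filter ("volume Kalman" in the translation) handles the nonlinear error-state model by propagating 2n equally weighted cubature points rather than linearizing. A sketch of one prediction step, with the state transition `f` standing in for the error-state dynamics (the patent does not give its concrete form):

```python
import numpy as np

# One CKF prediction step: generate 2n cubature points from the Cholesky
# square root of the covariance, push them through the (possibly nonlinear)
# model f, and recover the predicted mean and covariance with equal weights.

def cubature_predict(x: np.ndarray, P: np.ndarray, f):
    """Predict mean x (n,) and covariance P (n, n) through model f."""
    n = x.size
    S = np.linalg.cholesky(P)                      # P = S @ S.T
    unit = np.sqrt(n) * np.concatenate([np.eye(n), -np.eye(n)])  # (2n, n)
    points = x + unit @ S.T                        # 2n cubature points
    propagated = np.array([f(p) for p in points])
    x_pred = propagated.mean(axis=0)               # equal weights 1 / (2n)
    diff = propagated - x_pred
    P_pred = diff.T @ diff / (2 * n)
    return x_pred, P_pred
```

A full CKF would add process noise after prediction and run an analogous cubature transform through the measurement model in the update step; the appeal over an EKF here is that no Jacobians of the error-state model are needed.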
The actuating mechanism 7 is used for autonomously deploying and recovering the marine equipment based on the predicted value of its positioning track. Specifically, fig. 6 shows a block diagram of the execution module. The actuating mechanism 7 is implemented as a robotic arm comprising a pivotable fixed base 71, a column 72, a luffing cylinder 73, a swivel arm 74, a telescopic main arm 75, a telescopic cylinder 76, a telescopic slave arm 77 and an underwater docking device 78. After the predicted value of the positioning track of the marine equipment is obtained, the data can be stored in a control data set readable by the actuating mechanism. The control data set is analyzed through the inverse kinematics of the arm's motion control to obtain the control torque and rotation angle of each joint, so that each component of the robotic arm can be controlled through a PID algorithm.
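The per-joint PID control mentioned above can be sketched as follows. Gains, limits and the joint interface are placeholders for illustration; the patent does not specify them.

```python
# Minimal PID controller sketch for one manipulator joint: the setpoint is
# the joint angle produced by inverse kinematics, the measurement is the
# joint encoder reading, and the output drives the joint actuator.

class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint: float, measured: float, dt: float) -> float:
        """Return the control output driving `measured` toward `setpoint`."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In the described system one such loop would run per joint, with setpoints refreshed each time a new positioning-track prediction is written into the control data set.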
An embodiment of the invention discloses a computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to execute the described method for autonomous deployment and recovery of marine equipment.
An embodiment of the invention discloses a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform the described method for autonomous deployment and recovery of marine equipment.
The above-described embodiments are only illustrative, and the modules described as separate components may or may not be physically separate, and the components displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above detailed description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a necessary general hardware platform, and may also be implemented by hardware. Based on such understanding, the above technical solutions may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, wherein the storage medium includes a Read-Only Memory (ROM), a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc-Read-Only Memory (CD-ROM) or other Memory capable of storing data, a magnetic tape, or any other computer-readable medium capable of storing data.
Finally, it should be noted that the method and system for autonomous deployment and recovery of marine equipment disclosed in the embodiments of the present invention are only preferred embodiments, used solely to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for autonomous deployment and retrieval of marine equipment, the method comprising:
acquiring image characteristic point information of the marine equipment based on a preset tracking rule;
carrying out data fusion on inertial measurement information of the marine equipment and image characteristic point information of the marine equipment to generate pose information of the marine equipment;
carrying out data fusion on wireless carrier information of the marine equipment and pose information of the marine equipment to generate a positioning track predicted value of the marine equipment;
and controlling an actuating mechanism to autonomously deploy and recover the marine equipment based on the predicted value of the positioning track of the marine equipment.
2. The method for autonomous deployment and retrieval of marine devices according to claim 1, wherein said obtaining image feature point information of marine devices based on preset tracking rules comprises:
acquiring first frame position information in real-time image information of marine equipment;
tracking the first frame position information by using a region-proposal siamese network tracking algorithm, and judging whether the tracking is successful;
if the tracking is judged to be successful, acquiring the image feature point information of the first frame position information and the next frame position information of the first frame position information, and cyclically executing the step of acquiring image feature point information of the marine equipment based on the preset tracking rule.
3. The method according to claim 2, wherein after tracking the first frame position information by using the region-proposal siamese network tracking algorithm and judging whether the tracking is successful, and, if the tracking is judged to be successful, saving the image feature point information of the first frame position information as training sample data, the method further comprises:
if the tracking is judged to have failed, training the training sample data based on a target detection algorithm to generate a detection model;
inputting the next frame position information of the failed track into the detection model to generate retrieved image information;
and acquiring the image feature point information of the retrieved image information.
4. The method for autonomous deployment and retrieval of marine devices of claim 1, wherein the data fusion of the inertial measurement information of a marine device and the image feature point information of the marine device to generate pose information of a marine device comprises:
analyzing the image characteristic point information of the marine equipment and the inertial measurement information of the marine equipment to generate a Kalman filtering estimation model;
and substituting the image characteristic point information of the marine equipment and the inertial measurement information of the marine equipment into the Kalman filtering estimation model, and predicting the marine equipment through Kalman filtering to generate the position and pose information of the marine equipment.
5. The method for autonomous deployment and recovery of marine devices according to claim 4, wherein data fusion is performed on wireless carrier information of marine devices and pose information of the marine devices to generate predicted values of positioning tracks of the marine devices, and the method comprises the following steps:
constructing an error state model by using wireless carrier information of the marine equipment;
and substituting the wireless carrier information of the marine equipment and the pose information of the marine equipment into the error state model, and generating a positioning track predicted value of the marine equipment through cubature Kalman filtering.
6. A system for autonomous deployment and retrieval of marine equipment, the system comprising:
the binocular camera is used for acquiring images of the marine equipment;
the image processing module is used for processing the image of the marine equipment based on a preset tracking rule to generate image characteristic point information of the marine equipment;
the inertial measurement module is used for acquiring inertial measurement information of the marine equipment;
the first fusion module is used for carrying out data fusion on inertial measurement information of the marine equipment and image characteristic point information of the marine equipment to generate pose information of the marine equipment;
the wireless carrier module is used for acquiring wireless carrier information of the marine equipment;
the second fusion module is used for carrying out data fusion on the wireless carrier information of the marine equipment and the pose information of the marine equipment to generate a positioning track predicted value of the marine equipment;
and the executing mechanism is used for automatically deploying and retrieving the marine equipment based on the positioning track predicted value of the marine equipment.
7. The system for autonomous deployment and retrieval of marine devices of claim 6, wherein said image processing module comprises:
the frame processing unit is used for acquiring first frame position information in real-time image information of the marine equipment;
the tracking unit is used for tracking the first frame position information by using a region-proposal siamese network tracking algorithm and judging whether the tracking is successful or not;
and the feature point information acquisition unit is used for acquiring the image feature point information of the first frame position information and the next frame position information of the first frame position information when the tracking is judged to be successful, and for cyclically executing the step of acquiring image feature point information of the marine equipment based on the preset tracking rule.
8. The system according to claim 7, wherein the feature point information obtaining unit further stores image feature point information of the first frame of position information as training sample data, and is further configured to:
if the tracking is judged to have failed, training the training sample data based on a target detection algorithm to generate a detection model; inputting the next frame position information of the failed track into the detection model to generate retrieved image information; and acquiring the image feature point information of the retrieved image information.
9. The system for autonomous deployment and retrieval of marine equipment of claim 6, wherein said first fusion module comprises:
the Kalman filtering estimation model is generated by carrying out Kalman filtering on image characteristic point information of the marine equipment and inertial measurement information of the marine equipment;
and the first prediction unit is used for substituting the image characteristic point information of the marine device and the inertial measurement information of the marine device into the Kalman filtering estimation model, and predicting the marine device through Kalman filtering to generate the pose information of the marine device.
10. The system for autonomous deployment and retrieval of marine equipment of claim 9, wherein the second fusion module comprises:
the error state model is generated through wireless carrier information of the marine equipment;
and the second prediction unit is used for substituting the wireless carrier information of the marine equipment and the pose information of the marine equipment into the error state model and generating a positioning track predicted value of the marine equipment through cubature Kalman filtering.
CN202211461965.1A 2022-11-17 2022-11-17 Autonomous deployment and recovery method and system for marine equipment Active CN115859212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211461965.1A CN115859212B (en) 2022-11-17 2022-11-17 Autonomous deployment and recovery method and system for marine equipment


Publications (2)

Publication Number Publication Date
CN115859212A true CN115859212A (en) 2023-03-28
CN115859212B CN115859212B (en) 2023-07-18

Family

ID=85664681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211461965.1A Active CN115859212B (en) 2022-11-17 2022-11-17 Autonomous deployment and recovery method and system for marine equipment

Country Status (1)

Country Link
CN (1) CN115859212B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107117268A (en) * 2017-05-12 2017-09-01 西南科技大学 The ocean rubbish recovering method and system of a kind of heterogeneous system
US20180164124A1 (en) * 2016-09-15 2018-06-14 Syracuse University Robust and stable autonomous vision-inertial navigation system for unmanned vehicles
CN110118554A (en) * 2019-05-16 2019-08-13 深圳前海达闼云端智能科技有限公司 SLAM method, apparatus, storage medium and device based on visual inertia
WO2020087846A1 (en) * 2018-10-31 2020-05-07 东南大学 Navigation method based on iteratively extended kalman filter fusion inertia and monocular vision
CN111257914A (en) * 2020-01-14 2020-06-09 杭州电子科技大学 Marine fishing boat track prediction method and system based on Beidou and AIS data fusion
CN113741191A (en) * 2021-09-01 2021-12-03 集美大学 Method and system for tracking water surface target of offshore oil supporting ship
CN113804184A (en) * 2020-06-15 2021-12-17 上海知步邦智能科技有限公司 Ground robot positioning method based on multiple sensors
CN114488164A (en) * 2022-01-17 2022-05-13 清华大学深圳国际研究生院 Underwater vehicle synchronous positioning and mapping method and underwater vehicle


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
H. Che et al.: "A pose estimation method based on stereo vision and inertial navigation fusion", CSAA/IET International Conference on Aircraft Utility Systems (AUS 2020) *
Ding Hu: "Trajectory estimation algorithm for marine ships", Ship Engineering, no. 04

Also Published As

Publication number Publication date
CN115859212B (en) 2023-07-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 511458 Room 305, building C, 1121 Haibin Road, Nansha District, Guangzhou City, Guangdong Province

Applicant after: Guangdong Intelligent Unmanned System Research Institute (Nansha)

Address before: 511458 Room 305, building C, 1121 Haibin Road, Nansha District, Guangzhou City, Guangdong Province

Applicant before: Guangdong intelligent unmanned System Research Institute

GR01 Patent grant