CN117739994B - Visual robot underwater target identification tracking method and system - Google Patents
- Publication number: CN117739994B (application CN202410187249.1A)
- Authority
- CN
- China
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Abstract
The invention discloses a visual-robot underwater target identification and tracking method and system, in the technical field of underwater target identification and tracking. The method comprises: adjusting the lighting according to the gray-level mean of the underwater image; performing target identification on the real-time underwater image and, after calculation, obtaining the ultrasonic transmission speed C at the robot's current position and the target distance S; calculating the robot's actual tracking speed and, from it, the predicted tracking time Yt and predicted tracking distance Ys; and calculating the electric charge Zl required for tracking, comparing it with the robot's real-time charge, and issuing the corresponding tracking warning. Resources can thus be allocated reasonably and routes planned, improving the accuracy and success rate of the robot's tracking task.
Description
Technical Field
The invention relates to the technical field of underwater target identification and tracking, and in particular to a visual-robot underwater target identification and tracking method and system.
Background
Underwater target identification and tracking is widely used in fields such as underwater archaeology, resource exploration, marine biology research, and underwater safety monitoring. However, the complexity and uncertainty of the underwater environment make accurate identification and tracking of underwater targets challenging. In recent years, with the development of vision-sensor technology, vision-based underwater target identification and tracking methods have become a research hotspot. These methods typically comprise image preprocessing, target detection, and target tracking. In an underwater environment, however, light refraction, water turbidity, target motion, and other factors often degrade image quality, complicating target detection and tracking.
Chinese application publication No. CN117008622A discloses an underwater target identification and tracking method for a visual robot, and the underwater visual robot itself. A target image is first acquired by a vision module and preprocessed to obtain a multi-target feature set. Targets in the set are then identified and screened in order from near to far until a specific target is identified. During screening, each identified non-specific target is checked for interference: if there is none, it is swept past and the next target is identified; if there is interference, the robot's trajectory is adjusted to a non-interfering state so that obstacle targets are avoided and the specific target is found. Once the specific target is found, the underwater robot body travels to it along the planned trajectory.
In that application, the underwater target is illuminated uniformly by a lattice light source so that it can be clearly found and identified by the vision module and then tracked. However, while tracking an underwater target, pressure rises as the water depth increases, and so does the resistance to the robot's motion; the robot therefore consumes more energy to hold its speed or to navigate in deep water. Its energy may run out mid-task, so that the tracking task cannot be completed and the robot may not even be able to return, causing economic loss.
Therefore, the invention provides a visual robot underwater target identification tracking method and a visual robot underwater target identification tracking system.
Disclosure of Invention
(I) Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a visual robot underwater target recognition tracking method and a visual robot underwater target recognition tracking system.
(II) Technical solution
In order to achieve the above purpose, the invention is realized by the following technical solution. A visual-robot underwater target identification and tracking method comprises the following steps:
Periodically acquiring real-time underwater images with an underwater camera, processing them to compute the gray-level mean of each underwater image, and adjusting the lighting according to that mean;
performing target identification on the real-time underwater image and, once a target is detected, starting the CTD profiler and the ultrasonic generator to obtain the water temperature T, water salinity Yd, and water depth Sd, recording the time difference between transmitting the ultrasonic pulse and receiving its echo, and obtaining, after analysis and calculation, the ultrasonic transmission speed C at the robot's current position and the target distance S;
detecting, with stress sensors, the propeller stress under rated conditions and the real-time propeller stress, calculating the robot's actual tracking speed, and from it the predicted tracking time Yt and predicted tracking distance Ys;
obtaining, by the tracking early-warning module, the predicted tracking time Yt and predicted tracking distance Ys, calculating the electric charge Zl required for tracking, comparing Zl with the robot's real-time charge, and issuing the corresponding tracking warning.
Further, periodically acquire real-time underwater images with the underwater camera and, with image-processing software, extract the total number of pixels a of each underwater image and the gray value of each pixel; after linear normalization, compute the gray-level mean of the underwater image as the sum of the per-pixel gray values divided by a.
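The gray-level mean computation described above can be sketched as follows. The patent's own formula is an image that did not survive extraction, so the flat-pixel-list representation and the function name here are illustrative assumptions:

```python
def gray_mean(pixels):
    """Mean gray level of an image given as a flat sequence of 0-255 values.

    Illustrative sketch: `pixels` stands in for the per-pixel gray values and
    `a` for the total pixel count named in the text.
    """
    a = len(pixels)          # total number of pixels a
    return sum(pixels) / a   # average of the per-pixel gray values
```

For example, `gray_mean([0, 100, 200])` yields 100.0, a mid-brightness image by the 0-255 convention explained below.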
Further, obtain the gray-level mean of the underwater image and judge whether it exceeds 100. If it is below 100, increase the light brightness until the gray-level mean of the real-time underwater image exceeds 100; if the brightness has been adjusted to its maximum and the gray-level mean of the real-time underwater image is still below 50, move the robot to a new position.
The gray value represents the brightness of a pixel and typically ranges from 0 to 255: in a grayscale image, 0 is black (no brightness) and 255 is white (maximum brightness). A smaller gray value therefore means a darker pixel; for example, a gray value of 50 is very dark, while 150 is fairly bright.
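The lighting rule above (raise brightness while the mean is below 100; reposition the robot if the mean stays below 50 at full brightness) can be sketched as one control step. The function and parameter names are hypothetical, not from the patent:

```python
def adjust_lighting(gray_mean, brightness, max_brightness, step=1):
    """One control step of the thresholded lighting rule (names illustrative).

    Returns (new_brightness, move_robot): brightness is raised while the gray
    mean is below 100; if brightness is already at its maximum and the mean is
    still below 50, the robot itself should be repositioned.
    """
    if gray_mean >= 100:
        return brightness, False                              # bright enough
    if brightness < max_brightness:
        return min(brightness + step, max_brightness), False  # raise the light
    return brightness, gray_mean < 50                         # already at max
```

Called once per acquisition period, this keeps the light at the lowest level that satisfies the 100-gray-level target, matching the energy-saving rationale given later in the beneficial effects.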
Further, if a target is detected in the real-time underwater image, start the CTD profiler to collect the water temperature T, water salinity Yd, and water depth Sd; transmit ultrasonic pulses with the ultrasonic generator and record the time difference between transmission and reception of the echo. After linear normalization, compute the ultrasonic transmission speed C at the robot's current position from T, Yd, and Sd, and the target distance S from C and the echo time difference.
A CTD profiler (Conductivity Temperature Depth, abbreviated CTD) is a marine instrument that measures the conductivity, temperature, and pressure of seawater. The C stands for Conductivity, from which salinity is estimated; the D stands for Depth, which is computed from pressure.
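The patent's own expressions for C and S are formula images that did not survive. As a sketch under stated assumptions: Medwin's empirical seawater sound-speed equation is used here as a stand-in for the lost speed formula, and the distance follows the standard echo-ranging relation (round trip covered at speed C, so one-way distance is half):

```python
def sound_speed(T, Yd, Sd):
    """Approximate speed of sound in seawater (m/s).

    Stand-in for the patent's lost formula: Medwin's empirical equation with
    water temperature T (deg C), salinity Yd (parts per thousand), depth Sd (m).
    """
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (Yd - 35) + 0.016 * Sd)

def target_distance(C, dt):
    """Echo ranging: the pulse travels out and back in dt seconds."""
    return C * dt / 2
```

At 10 deg C, salinity 35, depth 100 m this gives roughly 1490 m/s, so a 0.2 s round trip corresponds to a target about 150 m away.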
Further, acquire the real-time target distances and, after analysis, obtain the maximum target speed,
where i denotes the time-sequence number of a target distance, i = 1, 2, 3, 4, ..., n, and n is a positive integer.
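The maximum-target-speed formula itself is a lost image; one plausible reading consistent with the surrounding text (distances indexed by sampling period i = 1..n) is that the largest distance change between consecutive samples, divided by the sampling period, bounds the target's speed. This is an assumption, not the patent's own formula:

```python
def max_target_speed(distances, period):
    """Assumed reading of the lost formula: the largest change between
    consecutive target distances S_1..S_n over one sampling period.
    """
    changes = [abs(b - a) for a, b in zip(distances, distances[1:])]
    return max(changes) / period
```

For example, distances of 100, 96, 94 m sampled every 2 s give a maximum apparent target speed of 2 m/s.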
Further, install stress sensors on the blades of the robot's propeller and detect the propeller stress under rated conditions and the real-time propeller stress; after linear normalization, compute the robot's actual tracking speed from these stresses and the robot's rated speed.
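The actual-tracking-speed formula is likewise a lost image. As a hedged sketch of how the rated and real-time stresses could scale the rated speed: assuming hydrodynamic resistance grows with the square of speed (an assumption, not the patent's model), speed falls with the square root of the stress ratio:

```python
import math

def actual_tracking_speed(v_rated, f_rated, f_real):
    """Assumed drag model, not the patent's own formula: a robot driving
    against real-time stress f_real moves at
    v = v_rated * sqrt(f_rated / f_real),
    so quadrupled resistance halves the achievable speed.
    """
    return v_rated * math.sqrt(f_rated / f_real)
```

Under this model a robot rated at 2 m/s whose propeller stress rises from 100 N to 400 N tracks at 1 m/s, which matches the background observation that deeper, higher-pressure water slows the robot and raises its energy cost.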
Further, acquire the actual tracking speed, the real-time target distance, the target speed, and the target acceleration; after linear normalization, compute the predicted tracking time Yt and, from it, the predicted tracking distance Ys.
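The Yt and Ys formulas are lost images; a hedged pursuit sketch consistent with the inputs named above (robot speed, initial gap, target speed and acceleration) steps the robot-target gap forward until it closes. All names are illustrative:

```python
def predict_tracking(v_robot, s0, v_target, a_target, max_t=3600.0, dt=0.1):
    """Assumed pursuit model, not the patent's formula: integrate the gap
    between robot and an accelerating target; Yt is the time for the gap to
    close, Ys = v_robot * Yt the distance the robot covers doing so.
    """
    gap, v, t = s0, v_target, 0.0
    while gap > 0 and t < max_t:
        gap += (v - v_robot) * dt   # target recedes at v, robot closes at v_robot
        v += a_target * dt          # target accelerates
        t += dt
    yt = t
    ys = v_robot * yt
    return yt, ys
```

With a 2 m/s robot, a 10 m gap, and a stationary target, the gap closes in about 5 s over about 10 m; `max_t` caps the loop when the target is too fast to ever be caught.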
Further, acquire the robot's real-time charge from its on-board charge-monitoring system and, after linear normalization, compute the robot's charge-fluctuation index.
Further, acquire the predicted tracking time Yt, the predicted tracking distance Ys, and the robot's charge-fluctuation index; after linear normalization, compute the electric charge Zl required to complete tracking, where E denotes the distance of the robot from its return point and the other parameter is the period detection time.
When the charge Zl required for tracking exceeds 0.9 times the robot's real-time charge, a tracking warning is issued outward.
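The 0.9 threshold above is the one decision rule the text states completely, so it can be sketched directly; the function name is illustrative:

```python
def tracking_warning(zl, sl):
    """Warn when the charge Zl needed to finish tracking exceeds 90% of the
    robot's real-time charge Sl (threshold taken from the text)."""
    return zl > 0.9 * sl
```

A robot holding 100 units of charge warns at a predicted requirement of 95 but not at 80, leaving a 10% reserve for the return trip.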
A visual-robot underwater target identification and tracking system comprises:
a light-adjusting module, which periodically acquires real-time underwater images with an underwater camera, processes them to compute the gray-level mean of each underwater image, and adjusts the lighting according to that mean;
a target-distance analysis module, which performs target identification on the real-time underwater image and, once a target is detected, starts the CTD profiler and the ultrasonic generator to obtain the water temperature T, water salinity Yd, and water depth Sd, records the time difference between transmitting the ultrasonic pulse and receiving its echo, and obtains, after analysis and calculation, the ultrasonic transmission speed C at the robot's current position and the target distance S;
a target-tracking prediction module, which detects, with stress sensors, the propeller stress under rated conditions and the real-time propeller stress, calculates the robot's actual tracking speed, and from it the predicted tracking time Yt and predicted tracking distance Ys;
and a tracking early-warning module, which obtains the predicted tracking time Yt and predicted tracking distance Ys, calculates the electric charge Zl required for tracking, compares Zl with the robot's real-time charge, and issues the corresponding tracking warning.
(III) Beneficial effects
The visual-robot underwater target identification and tracking method and system of the invention have the following beneficial effects:
1. Real-time underwater images are periodically acquired with an underwater camera, the gray-level mean of each underwater image is computed, and the lighting is adjusted according to that mean. Automatically regulating light brightness to actual need avoids unnecessary energy waste and greatly reduces energy consumption, and real-time brightness adjustment keeps image quality stable, improving monitoring accuracy.
2. Target identification is performed on the real-time underwater image; once a target is detected, the CTD profiler and the ultrasonic generator are started to obtain the water temperature T, water salinity Yd, and water depth Sd, the time difference between transmitting the ultrasonic pulse and receiving its echo is recorded, and the ultrasonic transmission speed C at the robot's current position and the target distance S are obtained after analysis and calculation, enabling fast, real-time, and accurate ranging of the target.
3. The propeller stress under rated conditions and the real-time propeller stress are detected with stress sensors, the robot's actual tracking speed is calculated, and from it the predicted tracking time Yt and predicted tracking distance Ys, so the target's position can be determined more accurately, positioning precision is improved, and the robot can better avoid risks and track more safely.
4. The predicted tracking time Yt and predicted tracking distance Ys are obtained, the electric charge Zl required for tracking is calculated and compared with the robot's real-time charge, and the corresponding tracking warning is issued outward, so resources can be allocated reasonably, routes planned, and the accuracy and success rate of the robot's tracking task improved.
Drawings
FIG. 1 is a schematic flow chart of a visual robot underwater target recognition tracking method of the invention;
fig. 2 is a schematic structural diagram of an underwater target recognition tracking system of a vision robot according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the invention provides a visual robot underwater target recognition and tracking method, which comprises the following steps:
Step one, periodically acquiring real-time underwater images with an underwater camera, processing them to compute the gray-level mean of each underwater image, and adjusting the lighting according to that mean.
The first step comprises the following steps:
Step 101: periodically acquire real-time underwater images with the underwater camera and, with image-processing software, extract the total number of pixels a of each underwater image and the gray value of each pixel; after linear normalization, compute the gray-level mean of the underwater image as the sum of the per-pixel gray values divided by a.
Step 102: obtain the gray-level mean of the underwater image and judge whether it exceeds 100. If it is below 100, increase the light brightness until the gray-level mean of the real-time underwater image exceeds 100; if the brightness has been adjusted to its maximum and the gray-level mean is still below 50, move the robot to a new position.
The gray value represents the brightness of a pixel and typically ranges from 0 to 255: in a grayscale image, 0 is black (no brightness) and 255 is white (maximum brightness). A smaller gray value therefore means a darker pixel; for example, a gray value of 50 is very dark, while 150 is fairly bright.
In use, the contents of steps 101 and 102 are combined:
Real-time underwater images are periodically acquired with an underwater camera, the gray-level mean of each underwater image is computed, and the lighting is adjusted according to that mean. Automatically regulating light brightness to actual need avoids unnecessary energy waste and greatly reduces energy consumption, and real-time brightness adjustment keeps image quality stable, improving monitoring accuracy.
Step two, performing target identification on the real-time underwater image and, once a target is detected, starting the CTD profiler and the ultrasonic generator to obtain the water temperature T, water salinity Yd, and water depth Sd, recording the time difference between transmitting the ultrasonic pulse and receiving its echo, and obtaining, after analysis and calculation, the ultrasonic transmission speed C at the robot's current position and the target distance S.
The second step comprises the following steps:
Step 201: import images of the target from multiple angles into an underwater target-recognition model, train and test it on the sample data, and perform target recognition on the real-time underwater image with the trained model; if no target is detected, move the robot to a new position.
It should be noted that the underwater target-recognition model can draw on published work such as: binocular-vision-based underwater dynamic target recognition and positioning; active underwater target recognition with the GAF-D3Net deep-learning network; deep-learning-based underwater image target recognition and parameter extraction; binocular-vision-based real-time underwater target recognition and positioning; underwater target recognition based on self-supervised acoustic feature learning; and robustness studies of deep-learning models for underwater target recognition.
Step 202: if a target is detected in the real-time underwater image, start the CTD profiler to collect the water temperature T, water salinity Yd, and water depth Sd; transmit ultrasonic pulses with the ultrasonic generator and record the time difference between transmission and reception of the echo; after linear normalization, compute the ultrasonic transmission speed C at the robot's current position and the target distance S.
A CTD profiler (Conductivity Temperature Depth, abbreviated CTD) is a marine instrument that measures the conductivity, temperature, and pressure of seawater. The C stands for Conductivity, from which salinity is estimated; the D stands for Depth, which is computed from pressure.
In use, the contents of steps 201 and 202 are combined:
Target identification is performed on the real-time underwater image; once a target is detected, the CTD profiler and the ultrasonic generator are started to obtain the water temperature T, water salinity Yd, and water depth Sd, the time difference between transmitting the ultrasonic pulse and receiving its echo is recorded, and the ultrasonic transmission speed C at the robot's current position and the target distance S are obtained after analysis and calculation, enabling fast, real-time, and accurate ranging of the target.
Step three, detecting, with stress sensors, the propeller stress under rated conditions and the real-time propeller stress, calculating the robot's actual tracking speed, and from it the predicted tracking time Yt and predicted tracking distance Ys.
The third step comprises the following steps:
Step 301: acquire the real-time target distances and, after analysis, obtain the maximum target speed, where i denotes the time-sequence number of a target distance, i = 1, 2, 3, 4, ..., n, and n is a positive integer.
Step 302: install stress sensors on the blades of the robot's propeller and detect the propeller stress under rated conditions and the real-time propeller stress; after linear normalization, compute the robot's actual tracking speed from these stresses and the robot's rated speed.
Step 303: acquire the actual tracking speed, the real-time target distance, the target speed, and the target acceleration; after linear normalization, compute the predicted tracking time Yt and, from it, the predicted tracking distance Ys.
In use, the contents of steps 301 to 303 are combined:
The propeller stress under rated conditions and the real-time propeller stress are detected with stress sensors, the robot's actual tracking speed is calculated, and from it the predicted tracking time Yt and predicted tracking distance Ys, so the target's position can be determined more accurately, positioning precision is improved, and the robot can better avoid risks and track more safely.
Step four, obtaining the predicted tracking time Yt and predicted tracking distance Ys, calculating the electric charge Zl required for tracking, comparing Zl with the robot's real-time charge, and issuing the corresponding tracking warning.
The fourth step comprises the following steps:
Step 401: acquire the robot's real-time charge from its on-board charge-monitoring system and, after linear normalization, compute the robot's charge-fluctuation index.
Step 402: acquire the predicted tracking time Yt, the predicted tracking distance Ys, and the robot's charge-fluctuation index; after linear normalization, compute the electric charge Zl required to complete tracking, where E denotes the distance of the robot from its return point and the other parameter is the period detection time.
When the charge Zl required for tracking exceeds 0.9 times the robot's real-time charge, a tracking warning is issued outward; otherwise, the target-tracking strategy is started.
In use, the contents of steps 401 and 402 are combined:
The predicted tracking time Yt and predicted tracking distance Ys are obtained, the electric charge Zl required for tracking is calculated and compared with the robot's real-time charge, and the corresponding tracking warning is issued outward, so resources can be allocated reasonably, routes planned, and the accuracy and success rate of the robot's tracking task improved.
Referring to fig. 2, the present invention provides a visual robot underwater target recognition tracking system, comprising:
The light-adjusting module periodically acquires real-time underwater images with an underwater camera, processes them to compute the gray-level mean of each underwater image, and adjusts the lighting according to that mean.
The target-distance analysis module performs target identification on the real-time underwater image and, once a target is detected, starts the CTD profiler and the ultrasonic generator to obtain the water temperature T, water salinity Yd, and water depth Sd, records the time difference between transmitting the ultrasonic pulse and receiving its echo, and obtains, after analysis and calculation, the ultrasonic transmission speed C at the robot's current position and the target distance S.
The target-tracking prediction module detects, with stress sensors, the propeller stress under rated conditions and the real-time propeller stress, calculates the robot's actual tracking speed, and from it the predicted tracking time Yt and predicted tracking distance Ys.
The tracking early-warning module obtains the predicted tracking time Yt and predicted tracking distance Ys, calculates the electric charge Zl required for tracking, compares Zl with the robot's real-time charge, and issues the corresponding tracking warning.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any other combination. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application.
Claims (2)
1. A visual-robot underwater target identification and tracking method, characterized by comprising the following steps:
periodically acquiring real-time underwater images with an underwater camera, processing them to compute the gray-level mean of each underwater image, and adjusting the lighting according to that mean;
periodically acquiring real-time underwater images with the underwater camera and extracting, with image-processing software, the total number of pixels a of each underwater image and the gray value of each pixel, and, after linear normalization, computing the gray-level mean of the underwater image as the sum of the per-pixel gray values divided by a;
obtaining the gray-level mean of the underwater image and judging whether it exceeds 100: if it is below 100, increasing the light brightness until the gray-level mean of the real-time underwater image exceeds 100, and, if the brightness has been adjusted to its maximum and the gray-level mean is still below 50, moving the robot to a new position;
performing target identification on a real-time underwater image, starting a temperature and salt depth meter and an ultrasonic generator after the target is detected to obtain water temperature T, water salinity Yd and water depth Sd and recording the time difference between transmitting and receiving echoes Obtaining the ultrasonic transmission speed C and the target distance S of the current position of the robot after analysis and calculation;
If a target is detected in the real-time underwater image, starting the temperature-salinity-depth (CTD) sensor to collect the water temperature T, water salinity Yd and water depth Sd, transmitting ultrasonic waves with the ultrasonic generator and recording the time difference Δt between transmission and echo reception; after linear normalization processing, calculating the ultrasonic propagation speed C at the robot's current position from T, Yd and Sd by an empirical sound-speed formula, and the target distance S = C · Δt / 2;
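The patent's own sound-speed formula is embedded as an image and is not reproduced in this text; as a stand-in, the sketch below uses Mackenzie's (1981) nine-term empirical equation for the speed of sound in seawater from temperature T (°C), salinity S (‰) and depth D (m), followed by the standard two-way echo-ranging relation S = C·Δt/2 (both function names are illustrative):

```python
def sound_speed(T, S, D):
    """Speed of sound in seawater (m/s), Mackenzie (1981) empirical equation."""
    return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
            + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
            - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

def target_distance(C, dt):
    """One-way target range from the two-way echo travel time dt (s)."""
    return C * dt / 2
```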
Obtaining, from the force-sensor readings, the propeller thrust under rated conditions and the real-time propeller thrust, calculating the robot's actual tracking speed, and further calculating the predicted tracking time Yt and the predicted tracking distance Ys;
Acquiring the real-time target distance periodically and obtaining the maximum target speed through analysis, the maximum target speed being computed over the sequence of measured target distances, wherein i denotes the time-sequence index of the target distance and N is a positive integer;
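The maximum-speed formula itself is an image in the original; a plausible reading, used here only as an illustration, takes the largest range change between successive samples divided by the sampling period t0 (`max_target_speed` is a hypothetical helper):

```python
def max_target_speed(distances, t0):
    """Largest per-period change in the measured target range (m/s).

    distances: target ranges S_1..S_N sampled every t0 seconds.
    """
    return max(abs(b - a) for a, b in zip(distances, distances[1:])) / t0
```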
A force sensor is mounted on the robot's propeller; the propeller thrust under rated conditions and the real-time propeller thrust are detected and, after linear normalization processing, used together with the robot's rated speed to calculate the robot's actual tracking speed;
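The speed formula is likewise an image in the original; one simple assumption consistent with the stated inputs (rated speed, rated thrust, real-time thrust) is that speed scales linearly with the thrust ratio — this model and the function name are illustrative, not the patent's:

```python
def actual_tracking_speed(v_rated, f_rated, f_real):
    """Robot speed estimated from thrust readings.

    Assumption (not from the patent): speed scales linearly with the
    ratio of real-time thrust f_real to rated thrust f_rated.
    """
    return v_rated * (f_real / f_rated)
```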
Acquiring the actual tracking speed, the real-time target distance, and the target's speed and acceleration; after linear normalization processing, calculating the predicted tracking time Yt and, from it, the predicted tracking distance Ys;
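With the patent's formula unavailable, a straight-line pursuit model illustrates how Yt and Ys could follow from these inputs: a robot at constant speed V closes an initial range S0 on a target moving at speed v with constant acceleration a_t, i.e. solve S0 + v·t + a_t·t²/2 = V·t for the smallest positive t (all names are illustrative):

```python
import math

def predicted_tracking(S0, V, v, a_t):
    """Return (Yt, Ys) for a straight-line chase, or None if the robot
    never closes the range: S0 + v*t + a_t*t**2/2 = V*t."""
    if abs(a_t) < 1e-12:
        if V <= v:
            return None                 # constant-speed target is never caught
        t = S0 / (V - v)
    else:
        disc = (v - V) ** 2 - 2 * a_t * S0
        if disc < 0:
            return None                 # accelerating target pulls away
        roots = [(-(v - V) + s * math.sqrt(disc)) / a_t for s in (1.0, -1.0)]
        positive = [r for r in roots if r > 0]
        if not positive:
            return None
        t = min(positive)
    return t, V * t                     # predicted tracking time Yt, distance Ys
```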
Obtaining the predicted tracking time Yt and the predicted tracking distance Ys, calculating the electric quantity Zl required to complete the tracking, comparing Zl with the robot's real-time electric quantity, and issuing a corresponding tracking warning;
Acquiring the robot's real-time electric quantity from the robot's battery-monitoring system and, after linear normalization processing, calculating the robot's electric-quantity fluctuation index;
Acquiring the predicted tracking time Yt, the predicted tracking distance Ys and the robot's electric-quantity fluctuation index; after linear normalization processing, calculating the electric quantity Zl required to complete the tracking, wherein the calculation further uses E, the distance of the robot from the return point, and the periodic detection time;
When the electric quantity Zl required for tracking exceeds 0.9 times the robot's real-time electric quantity, a tracking warning is issued.
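The energy formula is also an image in the original; the sketch below assumes a simple model (time-based drain during the chase, distance-based drain over the chase distance Ys plus the return leg E, inflated by the fluctuation index Bd) and implements the explicit 0.9 warning threshold from the claim. All names and the drain coefficients are illustrative:

```python
def required_charge(Yt, Ys, Bd, E, drain_per_s, drain_per_m):
    """Estimated charge Zl needed to finish tracking and return.

    Assumed model (not the patent's): time-proportional drain during the
    chase plus distance-proportional drain over Ys and the return distance E,
    inflated by the fluctuation index Bd.
    """
    return (Yt * drain_per_s + (Ys + E) * drain_per_m) * (1 + Bd)

def should_warn(Zl, Dl):
    """Tracking warning when required charge exceeds 90% of real-time charge Dl."""
    return Zl > 0.9 * Dl
```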
2. A visual-robot underwater target identification and tracking system for implementing the method of claim 1, characterized by comprising:
a light-adjustment module, which periodically acquires real-time underwater images with the underwater camera, computes the gray-level mean of each underwater image, and adjusts the lighting according to that gray-level mean;
a target-distance analysis module, which performs target identification on the real-time underwater image and, once a target is detected, starts the temperature-salinity-depth (CTD) sensor and the ultrasonic generator to obtain the water temperature T, water salinity Yd and water depth Sd, records the time difference between ultrasonic transmission and echo reception, and obtains the ultrasonic propagation speed C at the robot's current position and the target distance S through analysis and calculation;
a target-tracking prediction module, which obtains from the force-sensor readings the propeller thrust under rated conditions and the real-time propeller thrust, calculates the robot's actual tracking speed, and further calculates the predicted tracking time Yt and the predicted tracking distance Ys;
a tracking early-warning module, which obtains the predicted tracking time Yt and the predicted tracking distance Ys, calculates the electric quantity Zl required for tracking, compares Zl with the robot's real-time electric quantity, and issues a corresponding tracking warning.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410187249.1A CN117739994B (en) | 2024-02-20 | 2024-02-20 | Visual robot underwater target identification tracking method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117739994A (en) | 2024-03-22
CN117739994B (en) | 2024-04-30
Family
ID=90281578
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410187249.1A Active CN117739994B (en) | 2024-02-20 | 2024-02-20 | Visual robot underwater target identification tracking method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117739994B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017000773A1 (en) * | 2015-06-30 | 2017-01-05 | 芋头科技(杭州)有限公司 | Camera assembly device of robot and shooting and tracking method thereof |
CN108693535A (en) * | 2018-04-03 | 2018-10-23 | 中信重工开诚智能装备有限公司 | A kind of detection system for obstacle and detection method for underwater robot |
CN112153216A (en) * | 2020-09-16 | 2020-12-29 | Oppo广东移动通信有限公司 | Electric quantity early warning method and device, terminal equipment and storage medium |
WO2021017291A1 (en) * | 2019-07-31 | 2021-02-04 | 平安科技(深圳)有限公司 | Darkflow-deepsort-based multi-target tracking detection method, device, and storage medium |
CN112883564A (en) * | 2021-02-01 | 2021-06-01 | 中国海洋大学 | Water body temperature prediction method and prediction system based on random forest |
CN113108791A (en) * | 2021-03-05 | 2021-07-13 | 深圳大学 | Navigation positioning method and navigation positioning equipment |
WO2023113058A1 (en) * | 2021-12-13 | 2023-06-22 | ㈜유시스 | Drone control method for precise landing |
CN117008622A (en) * | 2023-04-07 | 2023-11-07 | 西安工业大学 | Visual robot underwater target identification tracking method and underwater visual robot thereof |
Non-Patent Citations (2)
Title |
---|
"Visual detection and tracking system for a spherical amphibious robot";Guo, Shuxiang 等;《Sensors》;20171231;第17卷(第4期);正文第870页 * |
"欠驱动船舶路径跟踪的强化学习迭代滑模控制";沈智鹏 等;《哈尔滨工程大学学报》;20171231(第5期);正文第697-704页 * |
Also Published As
Publication number | Publication date |
---|---|
CN117739994A (en) | 2024-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108256446B (en) | Method, device and equipment for determining lane line in road | |
CN109670411B (en) | Ship point cloud depth image processing method and system based on generation countermeasure network | |
CN104034733A (en) | Service life prediction method based on binocular vision monitoring and surface crack image recognition | |
CN109613559B (en) | Device and method for distinguishing water-land boundary floaters based on vision and laser radar | |
CN109178234B (en) | Ship freeboard height measuring system and measuring method thereof | |
CN116148801B (en) | Millimeter wave radar-based target detection method and system | |
CN103940344A (en) | High-precision remote displacement sensor | |
KR101772220B1 (en) | Calibration method to estimate relative position between a multi-beam sonar and a camera | |
CN113256697B (en) | Three-dimensional reconstruction method, system, device and storage medium for underwater scene | |
CN117739994B (en) | Visual robot underwater target identification tracking method and system | |
CN115984360B (en) | Method and system for calculating length of dry beach based on image processing | |
CN116452965A (en) | Underwater target detection and recognition method based on acousto-optic fusion | |
CN115755072A (en) | Special scene positioning method and system based on binocular structured light camera | |
CN113781513B (en) | Leakage detection method and system for water supply pipeline of power plant | |
CN112907728B (en) | Ship scene restoration and positioning method and system based on camera and edge calculation | |
CN112326917B (en) | Water environment pollution traceability system | |
CN103940345B (en) | A kind of long-range displacement measurement system and method | |
CN115328131A (en) | Obstacle avoidance method and device for unmanned ship, unmanned ship and storage medium | |
KR101696088B1 (en) | Method for recognizing object by ultrasound and apparatus therefor | |
JP2020056650A (en) | Image analysis device, image analysis method, and image analysis program | |
CN113419075B (en) | Ship speed measuring method, system, device and medium based on binocular vision | |
CN112305493B (en) | Deep sea visual ranging method based on light beacon detection | |
CN113971679B (en) | Ocean tide measuring method based on computer vision and image processing | |
CN117572438B (en) | Navigation type fish shoal detection method and system | |
CN118225035A (en) | Visual ranging method, device and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||