CN116594080A - Underwater target detection system and detection method - Google Patents
Underwater target detection system and detection method
- Publication number
- CN116594080A (application CN202310868749.7A)
- Authority
- CN
- China
- Prior art keywords
- data
- camera
- image data
- acoustic detector
- underwater
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V11/00—Prospecting or detecting by methods combining techniques covered by two or more of main groups G01V1/00 - G01V9/00
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Abstract
The application relates to an underwater target detection system and detection method, belonging to the technical field of underwater detection. The underwater target detection system comprises a watertight box body, an acoustic detector, a camera, a control module and a data processing module. The watertight box body is provided with a first detection port and a second detection port; the acoustic detector is positioned in the watertight box body, arranged at the first detection port, and used for collecting coordinate data of a detection target; the camera is positioned in the watertight box body, arranged at the second detection port, and used for collecting image data of the underwater environment; the control module is in communication connection with the acoustic detector and the camera respectively and is used for controlling the acoustic detector and the camera to acquire data; the data processing module is in communication connection with the acoustic detector and the camera. According to the application, through the cooperation of the camera and the acoustic detector, underwater targets can be detected without carrying an additional positioning system, and detection efficiency and detection precision can be effectively improved, particularly in environments such as underwater caves.
Description
Technical Field
The application belongs to the technical field of underwater detection, and particularly relates to an underwater target detection system and an underwater target detection method.
Background
Currently, underwater detection devices at home and abroad fall roughly into two types: acoustic devices and optical devices. An acoustic device is typically a multi-beam probe that acquires the shape and position of an underwater object by transmitting and receiving sound waves; it must be used with a positioning system, typically a GPS-based radio positioning system, that provides its position and attitude information. An optical device is typically an underwater laser detection device that acquires the shape and position of an underwater object by transmitting and receiving laser light; it likewise must be used with a positioning system, typically one based on inertial navigation, that provides its position and attitude information. However, when the underwater space is confined and the environment is complex, as in an underwater cave, GPS signals are easily lost, the radio positioning system fails, and the acoustic equipment cannot obtain accurate detection results; moreover, the water in an underwater cave is turbid and contains large amounts of impurities and microorganisms, so the laser light is absorbed or scattered and the optical equipment cannot obtain clear detection results.
Therefore, how to provide an underwater target detection system and detection method suitable for underwater cave environments is a technical problem that urgently needs to be solved.
Disclosure of Invention
To address the defects in the prior art, the application provides an underwater target detection system and detection method that output target detection data from image data obtained by a camera and coordinate data obtained by an acoustic detector, solving the problem of inaccurate and unclear detection results caused by the dependence of current underwater detection equipment on a positioning system.
The application provides an underwater target detection system, comprising:
the watertight box body is provided with a first detection port and a second detection port;
the acoustic detector is positioned in the watertight box body, is correspondingly arranged at the first detection port and is used for acquiring coordinate data of a detection target;
the camera is positioned in the watertight box body, is correspondingly arranged at the second detection port and is used for acquiring image data of the underwater environment;
the control module is respectively in communication connection with the acoustic detector and the camera and is used for controlling the acoustic detector and the camera to acquire data;
the data processing module is in communication connection with the acoustic detector and the camera and is used for obtaining pose data of camera movement according to image data acquired by the camera and taking the pose data as the pose data of the acoustic detector movement; and fusing the pose data of the movement of the acoustic detector with the coordinate data of the detection target acquired by the acoustic detector, and outputting target detection data.
According to the underwater target detection system of the above technical scheme, through the cooperation of the camera and the acoustic detector, underwater targets can be detected without carrying an additional positioning system; in particular, in environments such as underwater caves where a positioning system is difficult to operate stably, the system can effectively improve detection efficiency and detection precision.
In some embodiments, the control module controls the camera to start acquiring image data first, and after the camera acquires the first piece of initial image data, the control module starts the acoustic detector and controls the acoustic detector and the camera to acquire data synchronously.
In some embodiments, the data processing module performs epipolar geometry processing on two adjacent pieces of image data according to a time sequence of the image data acquired by the camera, so as to obtain a pose transformation matrix of camera movement as pose data of camera movement.
In some of these embodiments, the underwater object detection system further comprises a communication module through which the control module is communicatively coupled to the acoustic detector and the camera, and the data processing module is communicatively coupled to the acoustic detector and the camera. According to the technical scheme, the data transmission and instruction control of the acoustic detector and the camera are realized through the communication module, and the integration and controllability of the underwater target detection system are improved.
In some of these embodiments, the underwater target detection system further comprises a power supply device for providing a supply of electrical energy. According to the technical scheme, the stable operation of the underwater target detection system is ensured through the power supply equipment, and the failure or the fault caused by insufficient power is avoided.
In addition, the application further provides an underwater target detection method, which uses the above underwater target detection system to detect an underwater target and comprises the following steps:
a data acquisition step, in which a camera is started to acquire image data of the underwater environment, and after the camera acquires the first piece of initial image data, an acoustic detector is immediately started and begins to acquire data synchronously with the camera;
a data processing step of processing the acquired image data to obtain pose data of the movement of the camera, wherein the pose data is used as the pose data of the movement of the acoustic detector;
a data fusion step of fusing coordinate data of a detection target acquired by an acoustic detector with pose data of the movement of the acoustic detector and outputting target detection data;
and a data splicing step, wherein the obtained target detection data are stored according to time sequence, so as to obtain final detection data.
In some embodiments, in the data processing step, epipolar geometry processing is performed on two adjacent pieces of image data according to a time sequence of the acquired image data, so as to generate a pose transformation matrix E for camera movement as pose data for camera movement.
In some of these embodiments, the step of obtaining the pose transformation matrix E by epipolar geometry processing is: calibrating a camera for acquiring image data to obtain an internal reference matrix K of the camera; the image data are arranged according to the time sequence, feature point matching is sequentially carried out on two adjacent image data, the coordinate conversion relation of the front image data and the rear image data is obtained by combining the camera internal reference matrix K, the pose conversion matrix E is obtained by calculation, and the expression of the coordinate conversion relation of the front image data and the rear image data is as follows:
(u', v', 1) K^(-T) E K^(-1) (u, v, 1)^T = 0    (1);
wherein K is an internal reference matrix of the camera; among the two adjacent image data, the image data with the preceding time is recorded as first image data, the image data with the following time is recorded as second image data, u and v are respectively the abscissa and the ordinate of the feature point in the first image data, and u 'and v' are respectively the abscissa and the ordinate of the same feature point in the second image data.
In some embodiments, in the data fusion step, the target detection data is obtained by calculating according to formula (2), where the expression of formula (2) is:
U = E · S    (2);
wherein U is target detection data, and S is coordinate data of a detection target obtained by an acoustic detector.
In some of these embodiments, the underwater target detection method further comprises a visualization processing step comprising: and carrying out visual processing on the final detection data through 3D point cloud processing software.
Based on the above scheme, the underwater target detection system and the underwater target detection method in the embodiments of the application can detect underwater targets through the cooperation of the camera and the acoustic detector without carrying an additional positioning system; in particular, in environments such as underwater caves where a positioning system is difficult to operate stably, they can effectively improve detection efficiency and detection precision.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic diagram of the internal structure of an underwater target detection system of the present application;
FIG. 2 is a schematic diagram of the structure of the underwater target detection system of the present application;
FIG. 3 is a flow chart of the underwater target detection method of the present application;
fig. 4 is a schematic diagram of the epipolar geometry process.
In the figure:
1. a watertight case; 2. an acoustic detector; 3. a camera; 4. a communication module;
5. a data processing module; 6. a control module; 7. a power supply device.
Detailed Description
The technical solutions in the embodiments will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the description of the present application, it should be understood that the terms "center," "lateral," "longitudinal," "upper," "lower," "front," "rear," "left," "right," "top," "bottom," "inner," "outer," and the like indicate or are based on the orientation or positional relationship shown in the drawings, merely to facilitate description of the application and simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operate in a particular orientation, and thus should not be construed as limiting the application.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art.
In one embodiment of the underwater object detection system of the present application, as shown in fig. 1-2, the underwater object detection system comprises a watertight box 1, an acoustic detector 2, a camera 3, a control module 6 and a data processing module 5; the watertight box body 1 is provided with a first detection port and a second detection port; the acoustic detector 2 is positioned in the watertight box body 1, arranged at the first detection port, and used for acquiring coordinate data of a detection target; the camera 3 is positioned in the watertight box body 1, arranged at the second detection port, and used for collecting image data of the underwater environment; the control module 6 is in communication connection with the acoustic detector 2 and the camera 3 respectively and is used for controlling the acoustic detector 2 and the camera 3 to acquire data; the data processing module 5 is in communication connection with the acoustic detector 2 and the camera 3, and is used for obtaining pose data of the movement of the camera 3 from the image data acquired by the camera 3 and using the pose data as the pose data of the movement of the acoustic detector 2; it then fuses the pose data of the movement of the acoustic detector 2 with the coordinate data of the detection target acquired by the acoustic detector 2 and outputs target detection data.
In the above-described exemplary embodiment, the underwater target detection system obtains pose data of the movement of the camera 3 from the image data acquired by the camera 3, uses it as the pose data of the movement of the acoustic detector 2, and then fuses it with the coordinate data of the detection target acquired by the acoustic detector 2 to output target detection data. The underwater target detection system provided by the application can detect underwater targets through the cooperation of the camera 3 and the acoustic detector 2 without carrying an additional positioning system; in particular, in environments such as underwater caves where a positioning system is difficult to operate stably, it can effectively improve detection efficiency and detection precision.
It should be noted that the control module 6 controls the camera 3 to start collecting image data first, and after the camera 3 acquires the first piece of initial image data (such as the image data 0 shown in fig. 3), the control module 6 starts the acoustic detector 2 and controls the acoustic detector 2 and the camera 3 to collect data synchronously.
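The start-up sequencing described above (camera first, acoustic detector triggered after the first frame, then synchronous acquisition) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the thread functions, queue contents, and timings are hypothetical stand-ins for real device drivers, and the control module's start signal is modeled with a `threading.Event`.

```python
import threading
import queue
import time

def camera_loop(start_sonar, frames, n_frames=3):
    """Camera starts first; after frame 0 is captured, signal the sonar to start."""
    for i in range(n_frames):
        frames.put(f"image_{i}")      # placeholder for a real capture call
        if i == 0:
            start_sonar.set()         # control module starts the acoustic detector
        time.sleep(0.01)

def sonar_loop(start_sonar, pings, n_pings=2):
    """Acoustic detector waits for the start signal, then acquires synchronously."""
    start_sonar.wait()
    for i in range(n_pings):
        pings.put(f"coords_{i}")      # placeholder for a real sonar reading
        time.sleep(0.01)

def run_acquisition():
    """Run one acquisition session and return the collected data in order."""
    start_sonar = threading.Event()
    frames, pings = queue.Queue(), queue.Queue()
    t1 = threading.Thread(target=camera_loop, args=(start_sonar, frames))
    t2 = threading.Thread(target=sonar_loop, args=(start_sonar, pings))
    t1.start(); t2.start()
    t1.join(); t2.join()
    return list(frames.queue), list(pings.queue)
```

In this sketch the sonar can never record a ping before image data 0 exists, which is the invariant the control module enforces.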
As an illustrative example, the acoustic detector 2 may employ a doppler acoustic device, the video camera 3 may employ an industrial camera, and the control module 6 may employ an STM32 control board. It should be noted that, the data processing module 5 is communicatively connected to the acoustic detector 2 and the camera 3, and the data processing module 5 may be installed in the watertight box 1 or may be located outside the watertight box 1. It can be appreciated that when the data processing module 5 is located in the watertight case 1, the data processing module 5 performs data processing by using an underwater computer; when the data processing module 5 is positioned outside the watertight box body 1, the data processing module 5 adopts an on-water computer to process data.
It should also be noted that the system can be carried by an ROV (remotely operated underwater vehicle) for operation. When the underwater environment is dim, illumination for the camera 3 can be provided by the ROV, or an LED lighting device can be arranged in the watertight box body 1 to provide illumination.
As shown in fig. 3, it should be noted that, the data processing module 5 performs epipolar geometry processing on two adjacent image data according to the time sequence of the image data acquired by the camera 3, to obtain a pose transformation matrix for moving the camera 3, as pose data for moving the camera 3. It should be noted that, the pose data of the movement of the camera 3 may be used as the pose data of the movement of the acoustic detector 2, because the camera 3 and the acoustic detector 2 are located in the same watertight box 1 and move together with the watertight box 1, the pose data of the movement of the camera 3 and the pose data of the movement of the acoustic detector 2 are the same.
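The epipolar geometry processing of adjacent frames can be illustrated with a linear eight-point estimate of the pose (essential) matrix E from matched feature points. This is a sketch under assumptions, not the patent's code: the function name is hypothetical, and a practical system would add outlier rejection (e.g. RANSAC) and recover the rotation and translation from E by SVD.

```python
import numpy as np

def essential_from_matches(K, pts1, pts2):
    """Linear eight-point estimate of the essential matrix E from pixel
    correspondences pts1/pts2 (Nx2 arrays, N >= 8), given intrinsics K.
    The result satisfies (u',v',1) K^-T E K^-1 (u,v,1)^T ~= 0 per match."""
    Kinv = np.linalg.inv(K)
    n = len(pts1)
    # normalized camera coordinates: x = K^-1 (u, v, 1)^T
    x1 = (Kinv @ np.column_stack([pts1, np.ones(n)]).T).T
    x2 = (Kinv @ np.column_stack([pts2, np.ones(n)]).T).T
    # each match contributes one row of the linear system A vec(E) = 0,
    # since x2^T E x1 = sum_ij x2_i E_ij x1_j
    A = np.array([np.outer(b, a).ravel() for a, b in zip(x1, x2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    return E / np.linalg.norm(E)  # E is defined only up to scale
```

With noise-free synthetic correspondences the recovered E drives the epipolar residuals to numerical zero, which is the check used below.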
As shown in fig. 1, it should be noted that the underwater object detection system further includes a communication module 4, the control module 6 is communicatively connected to the acoustic detector 2 and the camera 3 through the communication module 4, and the data processing module 5 is communicatively connected to the acoustic detector 2 and the camera 3 through the communication module 4; the communication module 4 is configured to receive coordinate data collected by the acoustic detector 2 and image data collected by the camera 3, and transmit the coordinate data collected by the acoustic detector 2 and the image data collected by the camera 3 to the data processing module 5, and the communication module 4 is further configured to perform instruction control on the acoustic detector 2 and the camera 3. The data transmission and instruction control of the acoustic detector 2 and the camera 3 are realized through the communication module 4, so that the integration and controllability of the underwater object detection system are improved. As an illustrative embodiment, the communications module 4 may employ an STM32 control board and a network switch; the commanded control of the communication module 4 for the acoustic detector 2 and the camera 3 includes, but is not limited to, controlling the acoustic detector 2 and the camera 3 to start acquiring data or end acquiring data.
As shown in fig. 1, it is also to be noted that the underwater object detection system further comprises a power supply device 7 for providing an electrical power supply. It should be noted that, the power supply device 7 is electrically connected to the acoustic detector 2, the camera 3 and the control module 6, and when the data processing module 5 is located in the watertight box 1, the power supply device 7 is electrically connected to the data processing module 5. By setting the power supply equipment 7, the stable operation of the underwater target detection system is ensured, and the failure or fault caused by insufficient power is avoided. As an exemplary embodiment, the power supply device 7 may employ an external power supply and a 220V to 48V power conversion module.
As shown in fig. 3, based on the above-mentioned underwater target detection system, the present application further provides an underwater target detection method, which uses the above-mentioned underwater target detection system to detect an underwater target, comprising the following steps:
a data acquisition step, in which the camera 3 is started to acquire image data of the underwater environment, and after the camera acquires the first piece of initial image data, the acoustic detector 2 is immediately started and begins to acquire coordinate data of the detection target; in this step, the control module 6 first controls the camera 3 to start collecting image data (image data 0), and then the control module 6 immediately starts the acoustic detector 2, which begins acquiring data synchronously with the camera 3. The coordinate data of the detection target acquired by the acoustic detector 2 are the absolute three-dimensional coordinates of the detection target.
A data processing step of processing the acquired image data to obtain pose data of the movement of the camera 3 as the pose data of the movement of the acoustic detector 2; in this step, the image data collected by the camera 3 is processed to obtain pose data of movement of the camera 3, because the camera 3 and the acoustic detector 2 are located in the same watertight box 1, the pose data of movement of the camera 3 can be used as pose data of movement of the acoustic detector 2.
A data fusion step of fusing coordinate data of a detection target acquired by the acoustic detector 2 with pose data of the movement of the acoustic detector 2 and outputting target detection data;
a data splicing step, namely storing the obtained target detection data according to time sequence to obtain final detection data;
as shown in fig. 3, in the data processing step, epipolar geometry processing is performed on two adjacent pieces of image data according to the time series of the acquired image data, and a pose transformation matrix E in which the camera 3 moves is generated as pose data in which the camera 3 moves.
As shown in fig. 3, it should be further noted that the steps for obtaining the pose transformation matrix E through the epipolar geometry process are as follows: calibrating a camera 3 for collecting image data to obtain an internal reference matrix K of the camera 3; the image data are arranged according to the time sequence, feature point matching is sequentially carried out on two adjacent image data, the coordinate conversion relation of the front image data and the rear image data is obtained by combining the camera internal reference matrix K, the pose conversion matrix E is obtained by calculation, and the expression of the coordinate conversion relation of the front image data and the rear image data is as follows:
(u', v', 1) K^(-T) E K^(-1) (u, v, 1)^T = 0    (1);
wherein K is an internal reference matrix of the camera 3; among the two adjacent image data, the image data with the preceding time is recorded as first image data, the image data with the following time is recorded as second image data, u and v are respectively the abscissa and the ordinate of the feature point in the first image data, and u 'and v' are respectively the abscissa and the ordinate of the same feature point in the second image data.
As shown in fig. 4, the principle of obtaining the coordinate conversion relation between the front and rear image data through epipolar geometry processing is as follows: the camera 3 photographs point P from point o, forming a corresponding pixel point p on the imaging plane; at this moment it is known that P lies on ray op, but the exact position of P cannot be determined. The camera then moves to point o' and photographs point P again, producing a new pixel point p'; if feature-point matching of the images establishes that p and p' correspond to the same point, the position of P can be determined. Points p and p' must satisfy the following relationship:

(x', y', z') [t] R (x, y, z)^T = 0    (3);

wherein (x, y, z) are the coordinates of point p in the o-xyz coordinate system; (x', y', z') are the coordinates of point p' in the o'-x'y'z' coordinate system; [t] is the translation matrix (the skew-symmetric matrix formed from the translation vector t); R is the rotation matrix. It should be noted that transforming points from one coordinate system into the other requires both a translation and a rotation, and the aforementioned pose transformation matrix E is shorthand for [t]*R, i.e. E = [t]*R. Combining the camera internal reference matrix K gives the relational expressions:

z (u, v, 1)^T = K (x, y, z)^T    (4);
z' (u', v', 1)^T = K (x', y', z')^T    (5);

The above relational expression (1) can then be obtained from expressions (3), (4) and (5).
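The derivation above can be checked numerically: constructing E = [t]×R from a known rotation and translation, the camera-frame constraint (3) and the pixel-frame constraint (1) should both evaluate to numerical zero for any point observed in both views. The intrinsics, rotation, and 3D point below are arbitrary illustrative values, not values from the patent.

```python
import numpy as np

# intrinsics K and a rigid motion (R, t) between the two camera poses
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
theta = 0.1
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.5, 0.0, 0.1])

# [t] as the skew-symmetric matrix of t, and E = [t] * R
tx = np.array([[0, -t[2], t[1]],
               [t[2], 0, -t[0]],
               [-t[1], t[0], 0]])
E = tx @ R

X = np.array([0.3, -0.2, 5.0])   # point p in the o-xyz frame
Xp = R @ X + t                   # the same point in the o'-x'y'z' frame

# constraint (3): (x', y', z') [t] R (x, y, z)^T = 0
c3 = Xp @ E @ X

# pixel coordinates via (4)/(5), then constraint (1)
p = K @ X / X[2]                 # (u, v, 1)
pp = K @ Xp / Xp[2]              # (u', v', 1)
c1 = pp @ np.linalg.inv(K).T @ E @ np.linalg.inv(K) @ p
```

Both residuals c3 and c1 are zero up to floating-point error, confirming that (1) follows from (3), (4) and (5).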
As shown in fig. 3, it should be further noted that, in the data fusion step, the target detection data is obtained by calculating according to the formula (2), where the expression of the formula (2) is:
U = E · S    (2);
where U is target detection data, and S is coordinate data of a detection target obtained by the acoustic detector 2.
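Formula (2) states the fusion compactly as a matrix product. One common concrete realization (an assumption here, not necessarily the patent's exact computation) packs each frame's rotation and translation into a 4×4 homogeneous transform, accumulates it over time, and maps each sonar coordinate into the first frame's coordinate system; storing the results in order also yields the time-ordered output of the data splicing step:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def fuse_and_stitch(poses, sonar_points):
    """Accumulate per-frame poses (R, t) and map each sonar coordinate S
    into the first frame's coordinate system, stored in time order."""
    T_acc = np.eye(4)
    stitched = []
    for (R, t), S in zip(poses, sonar_points):
        T_acc = T_acc @ to_homogeneous(R, t)  # pose of current frame w.r.t. frame 0
        S_h = np.append(S, 1.0)               # homogeneous sonar coordinate
        stitched.append((T_acc @ S_h)[:3])    # fused target detection data
    return np.array(stitched)
```

For example, with an identity first pose and a pure 1 m translation for the second frame, a target seen at (0, 0, 5) in both sonar frames maps to (0, 0, 5) and (1, 0, 5) in the common coordinate system.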
In addition, as shown in fig. 3, it should be further noted that the underwater target detection method further includes a visualization processing step, in which the final detection data are visualized using 3D point cloud processing software; in this step, as an exemplary embodiment, the 3D point cloud processing software may employ CloudCompare.
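As an illustration of handing the final detection data to such software, the fused coordinates can be written to an ASCII PLY file, a format that CloudCompare and most point-cloud tools read. The helper below is a minimal sketch (positions only, no color or normals):

```python
def write_ply(path, points):
    """Write an iterable of (x, y, z) points as a minimal ASCII PLY file."""
    points = list(points)
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
        ""])  # trailing "" ends the header with a newline
    body = "\n".join(f"{x} {y} {z}" for x, y, z in points)
    with open(path, "w") as f:
        f.write(header + body + "\n")
```

The resulting file can be opened directly in CloudCompare for inspection of the stitched detection data.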
With the underwater object detection system of the above embodiments, underwater targets can be detected through the cooperation of the camera 3 and the acoustic detector 2 without carrying an additional positioning system; in particular, in environments such as underwater caves where a positioning system is difficult to operate stably, the method can effectively improve detection efficiency and precision. The other positive technical effects of the underwater target detection system in the above embodiments also apply to the underwater target detection method and are not repeated here.
By way of illustration of various embodiments of the underwater target detection system and method of the present application, it can be seen that the underwater target detection system and method embodiments of the present application have at least one or more of the following advantages:
1. according to the underwater target detection system provided by the application, through the cooperation of the acoustic detector 2 and the camera 3, an additional positioning system is not required to be carried, so that the complexity and cost of the underwater target detection system are reduced;
2. the underwater target detection method provided by the application is applicable to various underwater environments, especially underwater cave environments with confined space and complex conditions, and can run stably without carrying an additional positioning system;
3. according to the underwater target detection method provided by the application, the positioning accuracy and stability of an underwater target detection system are improved through the epipolar geometry processing of the data processing step.
Finally, it should be noted that in the present specification the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for identical and similar parts, the embodiments may be referred to one another.
The above embodiments are only for illustrating the technical solution of the present application and not for limiting the same; while the application has been described in detail with reference to the preferred embodiments, those skilled in the art will appreciate that: modifications may be made to the specific embodiments of the present application or equivalents may be substituted for part of the technical features thereof; without departing from the spirit of the application, it is intended to cover the scope of the application as claimed.
Claims (10)
1. An underwater target detection system, comprising:
the watertight box body is provided with a first detection port and a second detection port;
the acoustic detector is positioned in the watertight box body, is correspondingly arranged at the first detection port and is used for acquiring coordinate data of a detection target;
the camera is positioned in the watertight box body, is correspondingly arranged at the second detection port and is used for acquiring image data of the underwater environment;
the control module is respectively in communication connection with the acoustic detector and the camera and is used for controlling the acoustic detector and the camera to acquire data;
the data processing module is in communication connection with the acoustic detector and the camera and is used for obtaining pose data of camera movement according to image data acquired by the camera and taking the pose data as the pose data of the acoustic detector movement; and fusing the pose data of the movement of the acoustic detector with the coordinate data of the detection target acquired by the acoustic detector, and outputting target detection data.
2. The underwater target detection system of claim 1, wherein the control module controls the camera to begin acquiring image data first, and after the camera acquires the first frame of initial image data, the control module activates the acoustic detector and controls the acoustic detector and the camera to acquire data synchronously.
3. The underwater target detection system of claim 1, wherein the data processing module performs epipolar geometry processing on each pair of adjacent frames, in the time order of the image data acquired by the camera, to obtain a pose transformation matrix of the camera movement as the pose data of the camera movement.
4. The underwater target detection system of claim 1, wherein the underwater target detection system further comprises a communication module, the control module is communicatively coupled to the acoustic detector and the camera via the communication module, and the data processing module is communicatively coupled to the acoustic detector and the camera via the communication module.
5. The underwater target detection system of claim 1 or 4, wherein the underwater target detection system further comprises a power supply device for supplying electrical energy.
6. An underwater target detection method, characterized in that the underwater target detection system according to any one of claims 1 to 5 is used for detection of an underwater target, comprising the steps of:
a data acquisition step: starting the camera to acquire image data of the underwater environment, and, immediately after the camera acquires the first frame of initial image data, starting the acoustic detector so that it acquires data synchronously with the camera;
a data processing step of processing the acquired image data to obtain pose data of the movement of the camera, wherein the pose data is used as the pose data of the movement of the acoustic detector;
a data fusion step of fusing coordinate data of a detection target acquired by an acoustic detector with pose data of the movement of the acoustic detector and outputting target detection data;
and a data splicing step, wherein the obtained target detection data are stored in time order to obtain the final detection data.
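The four steps of claim 6 can be sketched as a minimal pipeline. This is an illustrative reconstruction, not the patented implementation: the per-frame relative poses are assumed to be already estimated (the patent obtains them from the epipolar processing of claims 7 and 8), poses are modeled as 4x4 homogeneous transforms, and every name and value below is invented for the example.

```python
import numpy as np

# Hedged sketch of claim 6's four steps. Relative poses between adjacent
# frames are taken as given here; in the patent they come from epipolar
# geometry processing of the camera images.
def fuse_trajectory(relative_poses, sonar_points):
    """Accumulate per-frame relative poses and map each sonar point into the
    first frame's coordinate system; results are kept in time order."""
    T = np.eye(4)                    # cumulative pose, frame 0 is the reference
    detections = []
    for T_rel, S in zip(relative_poses, sonar_points):
        T = T @ T_rel                # data processing step: update cumulative pose
        detections.append(T @ S)     # data fusion step: sonar point -> reference frame
    return detections                # data splicing step: time-ordered detection data

# Toy trajectory: move 1 m along +x per frame; sonar sees a target 2 m ahead (+z)
step = np.eye(4)
step[0, 3] = 1.0
S = np.array([0.0, 0.0, 2.0, 1.0])   # homogeneous sonar-frame coordinates
out = fuse_trajectory([step, step], [S, S])
# out[0][:3] is (1, 0, 2) and out[1][:3] is (2, 0, 2): the same target,
# expressed in the reference frame as the detector advances
```

Accumulating the relative poses gives every sonar measurement a common reference frame, which is what allows the time-ordered detections to be spliced into one consistent data set.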
7. The underwater target detection method according to claim 6, wherein in the data processing step, epipolar geometry processing is performed on each pair of adjacent frames, in the time order of the acquired image data, and a pose transformation matrix E of the camera movement is generated as the pose data of the camera movement.
8. The underwater target detection method according to claim 6 or 7, wherein the pose transformation matrix E is obtained by epipolar geometry processing as follows: the camera acquiring the image data is calibrated to obtain its internal reference matrix K; the image data are arranged in time order and feature point matching is performed on each pair of adjacent frames; combining the matched points with the camera internal reference matrix K gives the coordinate conversion relation between the earlier and later image data, from which the pose conversion matrix E is calculated; the coordinate conversion relation between the earlier and later image data is expressed as:
(u', v', 1) K^(-T) E K^(-1) (u, v, 1)^T = 0 (1);
wherein K is the internal reference matrix of the camera; of two adjacent frames of image data, the earlier frame is recorded as the first image data and the later frame as the second image data; u and v are respectively the abscissa and ordinate of a feature point in the first image data, and u' and v' are respectively the abscissa and ordinate of the same feature point in the second image data.
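The coordinate conversion relation of claim 8 is, in standard form, the epipolar constraint between two calibrated views. The sketch below uses only NumPy with a synthetic camera motion (the intrinsic matrix, rotation, and translation are all invented for illustration): it builds an essential matrix E from a known pose change and checks that matched pixel coordinates satisfy the constraint.

```python
import numpy as np

def skew(t):
    # Skew-symmetric matrix [t]_x such that skew(t) @ v == np.cross(t, v)
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical intrinsic matrix K (focal lengths and principal point are
# illustrative, not calibrated values from the patent)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Camera motion between two adjacent frames: small yaw plus a translation
theta = 0.05
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.2, 0.0, 0.02])

E = skew(t) @ R          # essential matrix encoding the pose change

# One 3D point seen in both views
X1 = np.array([0.5, -0.3, 4.0])   # in the first camera's frame
X2 = R @ X1 + t                   # same point in the second camera's frame

p1 = K @ X1; p1 = p1 / p1[2]      # pixel (u, v, 1) in the first image
p2 = K @ X2; p2 = p2 / p2[2]      # pixel (u', v', 1) in the second image

# Epipolar constraint (1): (u', v', 1) K^-T E K^-1 (u, v, 1)^T = 0
Kinv = np.linalg.inv(K)
residual = p2 @ Kinv.T @ E @ Kinv @ p1
# residual is zero up to floating-point error for correctly matched points
```

In practice the direction is reversed: E is estimated from many such matched point pairs (for example with a five-point or eight-point solver) and then decomposed into the rotation and translation of the camera movement.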
9. The underwater target detection method of claim 6, wherein in the data fusion step, the target detection data is calculated according to the formula (2), and the expression of the formula (2) is:
(2);
wherein U is target detection data, and S is coordinate data of a detection target obtained by an acoustic detector.
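A minimal sketch of one way to realize the fusion of claim 9, under the assumption that the pose of the acoustic detector acts on the sonar coordinates S as a rigid homogeneous transform; the patent's own formula (2) is elided in this text, so the mapping below is an assumption for illustration, not the claimed expression, and all values are invented.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation R and translation t into a 4x4 rigid transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed pose of the acoustic detector at acquisition time (in the patent
# this is taken from the camera's pose data via the data processing step)
R = np.eye(3)                       # no rotation in this toy case
t = np.array([1.0, 0.0, 2.0])       # detector displaced 1 m in x, 2 m in z
T = to_homogeneous(R, t)

S = np.array([0.0, 0.5, 3.0, 1.0])  # detector-frame target coordinates (homogeneous)
U = T @ S                           # target detection data in the reference frame
# U[:3] is (1.0, 0.5, 5.0): the detector-frame point shifted by the pose
```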
10. The underwater target detection method as claimed in claim 6, wherein the underwater target detection method further comprises a visualization processing step comprising: visualizing the final detection data through 3D point cloud processing software.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310868749.7A CN116594080B (en) | 2023-07-17 | 2023-07-17 | Underwater target detection system and detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116594080A true CN116594080A (en) | 2023-08-15 |
CN116594080B CN116594080B (en) | 2023-12-01 |
Family
ID=87601245
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310868749.7A Active CN116594080B (en) | 2023-07-17 | 2023-07-17 | Underwater target detection system and detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116594080B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090103083A1 (en) * | 2007-10-23 | 2009-04-23 | Kevin Kremeyer | Acoustic and optical illumination technique for underwater characterization of objects/environment
CN102975833A (en) * | 2012-12-10 | 2013-03-20 | 上海大学 | Teleoperation unmanned submersible for detecting and disposing submarine target |
JP2018203192A (en) * | 2017-06-09 | 2018-12-27 | 株式会社荏原製作所 | Underwater robot control system and underwater robot control method |
CN109859271A (en) * | 2018-12-14 | 2019-06-07 | 哈尔滨工程大学 | A kind of combined calibrating method of Underwater Camera and Forward-looking Sonar |
CN110081881A (en) * | 2019-04-19 | 2019-08-02 | 成都飞机工业(集团)有限责任公司 | It is a kind of based on unmanned plane multi-sensor information fusion technology warship bootstrap technique |
KR20200000083A (en) * | 2018-06-22 | 2020-01-02 | 포항공과대학교 산학협력단 | Underwater navigation method and system for underwater vehicle |
CN114488164A (en) * | 2022-01-17 | 2022-05-13 | 清华大学深圳国际研究生院 | Underwater vehicle synchronous positioning and mapping method and underwater vehicle |
CN115019412A (en) * | 2022-06-01 | 2022-09-06 | 杭州电子科技大学 | Underwater AUV (autonomous underwater vehicle) submarine cable inspection system and method based on multiple sensors |
CN116045959A (en) * | 2022-12-21 | 2023-05-02 | 中国矿业大学 | Laser vision fusion-based anti-impact drilling robot synchronous positioning and map construction method |
CN116309813A (en) * | 2022-11-28 | 2023-06-23 | 北京航空航天大学 | Solid-state laser radar-camera tight coupling pose estimation method |
Non-Patent Citations (2)
Title |
---|
ANTONIO LAGUDI ET AL.: "An Alignment Method for the Integration of Underwater 3D Data Captured by a Stereovision System and an Acoustic Camera", Sensors, pages 8-15 |
ZHANG XUN ET AL.: "Research on underwater target localization method based on ranging sonar and optical vision", Ship Engineering, vol. 38, no. 5, pages 74-78 |
Also Published As
Publication number | Publication date |
---|---|
CN116594080B (en) | 2023-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110728715B (en) | Intelligent inspection robot camera angle self-adaptive adjustment method | |
US10427304B2 (en) | Robotic charger alignment | |
CN101887037B (en) | Wheel-type X-ray flaw detection robot device | |
Xing et al. | Robust RGB-D camera and IMU fusion-based cooperative and relative close-range localization for multiple turtle-inspired amphibious spherical robots | |
CN109374008A (en) | A kind of image capturing system and method based on three mesh cameras | |
KR20140049361A (en) | Multiple sensor system, and apparatus and method for three dimensional world modeling using the same | |
CN102253057B (en) | Endoscope system and measurement method using endoscope system | |
CN109931909B (en) | Unmanned aerial vehicle-based marine fan tower column state inspection method and device | |
CN107241533B (en) | A kind of battle array scanning laser imaging device and method under water | |
CN111220126A (en) | Space object pose measurement method based on point features and monocular camera | |
CN113570715B (en) | Sensor fusion-based rotary laser real-time positioning modeling system and method | |
JP2016177640A (en) | Video monitoring system | |
CN110992487A (en) | Rapid three-dimensional map reconstruction device and reconstruction method for hand-held airplane fuel tank | |
CN111077907A (en) | Autonomous positioning method of outdoor unmanned aerial vehicle | |
CN110849269A (en) | System and method for measuring geometric dimension of field corn cobs | |
CN107870335A (en) | The three-dimensional composite imaging method of EO-1 hyperion laser, system and nobody from the device that navigates | |
CN109282743A (en) | It is suitble to the laser high-speed line of deep sea in-situ measurement to scan binocular vision three-dimensional imaging device | |
CN110966921A (en) | Indoor three-dimensional scanning equipment and method | |
CN116594080B (en) | Underwater target detection system and detection method | |
CN214409706U (en) | Indoor unmanned aerial vehicle positioning system based on machine vision | |
CN110728745A (en) | Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model | |
CN211552867U (en) | Visual navigation system for assisting unmanned trolley | |
CN112798020A (en) | System and method for evaluating positioning accuracy of intelligent automobile | |
CN113324538B (en) | Cooperative target remote high-precision six-degree-of-freedom pose measurement method | |
CN113359839A (en) | Unmanned aerial vehicle perception system based on three-dimensional vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||