CN109859271B - Combined calibration method for underwater camera and forward-looking sonar - Google Patents
Abstract
The invention provides a combined calibration method for an underwater camera and a forward-looking sonar. The camera acquires images of a calibration plate underwater; the internal parameters of the camera are obtained through calibration, and the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system is deduced. The forward-looking sonar acquires acoustic echo data of the underwater environment, and the three-dimensional coordinate conversion relation between the forward-looking sonar coordinate system and the calibration plate centroid coordinate system is deduced. The joint calibration between the underwater camera and the forward-looking sonar is completed through these two relations. The invention is a non-contact calibration method that is simple, convenient, and quick to operate; it greatly reduces the random errors caused by cumbersome manual participation and calibration tools and improves calibration efficiency. Target features can be identified in the camera image, and their two-dimensional position information can be obtained in the sonar system; the relative position relation of the two sensor measurement systems can be obtained. The method overcomes drawbacks such as a cumbersome calibration measurement process and poor accuracy, and has strong operability.
Description
Technical Field
The invention belongs to the field of image processing and information fusion, and particularly relates to a combined calibration method of an underwater camera and a forward-looking sonar.
Background
For the environment perception system of an unmanned motion platform, multi-sensor data fusion can reduce the influence of external factors such as the environment on system performance. The particularity of the underwater environment is a major difficulty: radar detection systems widely used for monitoring land environments are unsuitable for underwater detection because radio waves attenuate and diffuse severely in water. When an ordinary light source is used for underwater illumination, absorption and scattering by the water medium and suspended particles cause heavy loss of light energy and serious stray scattering, greatly degrading imaging quality. Light waves are of limited use in complex marine environments and cannot meet actual underwater needs. Sound waves, by contrast, not only propagate far in water but also suit complex and changeable hydrological environments, which makes them a more ideal means of underwater detection and environment sensing.
In practical research, the combination of an underwater camera and a sonar system has been applied to technologies such as acquiring underwater robot motion information and three-dimensional reconstruction. Sonar can measure two-dimensional information of underwater objects with much higher accuracy than other distance-measuring instruments, while the underwater camera can acquire other information about the object, such as shape and size. Used together, the underwater camera and the sonar yield richer and more comprehensive information about the measured object, improve measurement accuracy, simplify information processing, and achieve good complementarity in performance. However, since the data acquired by the two sensors are based on their respective coordinate systems, the actual data fusion process requires representing the data of both sensor systems in the same coordinate system. The first task is therefore to jointly calibrate the two sensor systems to determine the relative relationship between the two coordinate systems.
Disclosure of Invention
The invention aims to provide a combined calibration method for an underwater camera and a forward-looking sonar, which uses a calibration plate as an intermediate medium to complete the conversion between the relative coordinate systems of the two systems and the real-time determination of their relative positions, and which overcomes the defects of a complicated measurement process, poor accuracy, and the like in the prior art. The purpose of the invention is realized as follows:
a combined calibration method of an underwater camera and a forward-looking sonar comprises the following specific implementation steps:
Step 1, the target optical detection module of the underwater camera acquires at least 20 calibration plate images underwater; the internal parameters of the camera are obtained through calibration; the rotation matrix and translation matrix between the calibration plate coordinate system and the camera coordinate system are calculated from the underwater camera internal parameters and the calibration plate corner positions, and the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system is deduced. The calibration plate plane rectangular coordinate system is O_b-X_bY_bZ_b, with its origin at the lower right corner of the calibration plate, the O_bX_b and O_bY_b axes along the two edges of the plate, and the O_bZ_b axis perpendicular to the plate plane; the camera coordinate system is O_c-X_cY_cZ_c.
Step 2, the acoustic detection module obtains acoustic echo data of the underwater environment through the single-beam mechanically scanned forward-looking sonar, obtains the distance and angle relation between the forward-looking sonar and the calibration plate centroid through data processing, and deduces the three-dimensional coordinate conversion relation between the forward-looking sonar coordinate system and the calibration plate centroid coordinate system. The calibration plate centroid plane rectangular coordinate system is O_b'-X_b'Y_b'Z_b', with its origin at the centroid of the calibration plate, the O_b'X_b' and O_b'Y_b' axes parallel to the two edges of the plate, and the O_b'Z_b' axis perpendicular to the plate plane; the sonar rectangular coordinate system is O_s-X_sY_sZ_s.

Step 3, the conversion between the camera coordinate system and the sonar coordinate system is calculated from the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system and the three-dimensional coordinate conversion relation from the forward-looking sonar coordinate system to the calibration plate centroid coordinate system, namely completing the combined calibration between the underwater camera and the forward-looking sonar.
The three-dimensional conversion relation between the camera coordinate system and the calibration plate plane coordinate system in step 1 is

[X_c, Y_c, Z_c]^T = R_{b→c}·[X_b, Y_b, Z_b]^T + T_{b→c}    (1)

where R_{b→c} is an orthogonal rotation matrix and T_{b→c} is a translation matrix.
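Applying this relation is a single rigid-body transform. The following is a minimal numpy sketch; the rotation and translation values are made up for illustration, not calibrated parameters:

```python
import numpy as np

def plate_to_camera(p_b, R_bc, T_bc):
    """Map a point from the calibration-plate frame to the camera frame:
    p_c = R_bc @ p_b + T_bc (rigid-body transform)."""
    p_b = np.asarray(p_b, dtype=float).reshape(3)
    return R_bc @ p_b + T_bc

# Illustrative extrinsics: a 90-degree rotation about Z and a 1 m Z offset.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T = np.array([0.0, 0.0, 1.0])
p_c = plate_to_camera([0.1, 0.0, 0.0], Rz, T)
```

A point 0.1 m along the plate's X axis lands at (0, 0.1, 1.0) in the camera frame under these illustrative extrinsics.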
The three-dimensional conversion relation between the sonar coordinate system and the calibration plate centroid coordinate system in step 2 is

[X_s, Y_s, Z_s]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [r·sinθ, r·cosθ, 0]^T    (2)

where r is the distance between the sonar and the origin of the calibration plate centroid coordinate system and θ is the corresponding angle; the two coordinate systems do not rotate relative to each other, so only the offset is considered.
Between the origin O_b and the centroid O_b', m and n are respectively the number of checkerboard squares in the horizontal and vertical directions; D_x and D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate. The three-dimensional conversion relation between the calibration plate centroid coordinate system and the calibration plate plane coordinate system is then

[X_b, Y_b, Z_b]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [m·D_x, n·D_y, 0]^T    (3)
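Since the centroid frame and the corner frame differ only by a translation of m·D_x horizontally and n·D_y vertically along parallel axes (the sign convention below is an assumption), the shift can be sketched as:

```python
import numpy as np

def centroid_to_plate(p_bprime, m, n, Dx, Dy):
    """Shift a point from the plate-centroid frame O_b' to the plate-corner
    frame O_b; the axes are parallel, so only a translation of m*Dx
    horizontally and n*Dy vertically is applied (assumed sign convention)."""
    offset = np.array([m * Dx, n * Dy, 0.0])
    return np.asarray(p_bprime, dtype=float) + offset

# Illustrative board of 30 mm squares: centroid 3 squares over, 2 squares up
# from the corner origin, so the centroid maps to (0.09, 0.06, 0) in O_b.
p_b = centroid_to_plate([0.0, 0.0, 0.0], m=3, n=2, Dx=0.03, Dy=0.03)
```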
The three-dimensional conversion relation between the camera coordinate system and the sonar coordinate system in step 3 is obtained by combining the camera–calibration-plate relation with the sonar–centroid and centroid–plate relations:

[X_c, Y_c, Z_c]^T = R_{b→c}·([X_s, Y_s, Z_s]^T − [r·sinθ, r·cosθ, 0]^T + [m·D_x, n·D_y, 0]^T) + T_{b→c}    (4)

where R_{b→c} is the orthogonal rotation matrix between the camera and the calibration plate, T_{b→c} is the translation matrix between the camera and the calibration plate, m and n are respectively the number of checkerboard squares between point O_b and point O_b' in the horizontal and vertical directions, D_x and D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate, r is the distance between the sonar and the origin of the calibration plate centroid coordinate system, and θ is the corresponding angle.
The calibration plate exists in the detection visual field range of the underwater camera and the forward-looking sonar at the same time, and the underwater camera and the forward-looking sonar collect underwater optical and underwater acoustic images of the calibration plate at the same time.
The three-dimensional coordinate conversion relation between the underwater camera and the calibration plate is obtained by the Zhang Zhengyou monocular camera calibration method; the internal and external parameters of the camera are calibrated with the Zhang Zhengyou method to obtain the effective focal length, image principal point coordinates, orthogonal rotation matrix, and translation vector of the camera.
The range, sensitivity, and gain parameters of the forward-looking sonar are set, and the forward-looking sonar receives the echo signal of the calibration plate within range to obtain the azimuth, size, and form information of the calibration plate; noise is filtered from the sonar image by median filtering, the sonar image is segmented by a region-growing method, and the centroid position and angle information of the calibration plate are extracted; the centroid refers to the center of the smallest circumscribed rectangle of the calibration plate target after image segmentation.
The invention has the following beneficial effects. It is a non-contact calibration method that is simple, convenient, and quick to operate; it greatly reduces the random errors caused by cumbersome manual participation and calibration tools and improves calibration efficiency. Target features can be identified in the camera image, and their two-dimensional position information can be obtained in the sonar system; the relative position relation of the two sensor measurement systems can be obtained. The method overcomes drawbacks such as a cumbersome calibration measurement process and poor accuracy, and has strong operability.
Drawings
FIG. 1 is a schematic view of a coordinate system according to the present invention.
FIG. 2 is a flow chart of the joint calibration method of the present invention.
Fig. 3 is an optical image of the calibration plate captured by an underwater camera in the same scene.
Fig. 4 is a calibration plate acoustic image collected by a forward-looking sonar in the same scene.
Fig. 5 shows the target distance and orientation computed by the computer from the forward-looking sonar image.
Detailed Description
The invention is further described with reference to the accompanying drawings in which:
example 1
The invention aims to provide a combined calibration method for an underwater camera and a forward-looking sonar, which uses a calibration plate as an intermediate medium to complete the conversion between the relative coordinate systems of the two systems and the real-time determination of their relative positions, and which overcomes the defects of a complicated measurement process, poor accuracy, and the like in the prior art. The purpose of the invention is realized by the following steps:
a combined calibration method of an underwater camera and a forward-looking sonar comprises the following specific implementation steps:
Step 1, the target optical detection module of the underwater camera acquires at least 20 calibration plate images underwater; the internal parameters of the camera are obtained through calibration; the rotation matrix and translation matrix between the calibration plate coordinate system and the camera coordinate system are calculated from the underwater camera internal parameters and the calibration plate corner positions, and the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system is deduced. The calibration plate plane rectangular coordinate system is O_b-X_bY_bZ_b, with its origin at the lower right corner of the calibration plate, the O_bX_b and O_bY_b axes along the two edges of the plate, and the O_bZ_b axis perpendicular to the plate plane; the camera coordinate system is O_c-X_cY_cZ_c.
Step 2, the acoustic detection module obtains acoustic echo data of the underwater environment through the single-beam mechanically scanned forward-looking sonar, obtains the distance and angle relation between the forward-looking sonar and the calibration plate centroid through data processing, and deduces the three-dimensional coordinate conversion relation between the forward-looking sonar coordinate system and the calibration plate centroid coordinate system. The calibration plate centroid plane rectangular coordinate system is O_b'-X_b'Y_b'Z_b', with its origin at the centroid of the calibration plate, the O_b'X_b' and O_b'Y_b' axes parallel to the two edges of the plate, and the O_b'Z_b' axis perpendicular to the plate plane; the sonar rectangular coordinate system is O_s-X_sY_sZ_s.

Step 3, the conversion between the camera coordinate system and the sonar coordinate system is calculated from the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system and the three-dimensional coordinate conversion relation from the forward-looking sonar coordinate system to the calibration plate centroid coordinate system, namely completing the combined calibration between the underwater camera and the forward-looking sonar.
The three-dimensional conversion relation between the camera coordinate system and the calibration plate plane coordinate system in step 1 is

[X_c, Y_c, Z_c]^T = R_{b→c}·[X_b, Y_b, Z_b]^T + T_{b→c}    (1)

where R_{b→c} is an orthogonal rotation matrix and T_{b→c} is a translation matrix.
The three-dimensional conversion relation between the sonar coordinate system and the calibration plate centroid coordinate system in step 2 is

[X_s, Y_s, Z_s]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [r·sinθ, r·cosθ, 0]^T    (2)

where r is the distance between the sonar and the origin of the calibration plate centroid coordinate system and θ is the corresponding angle; the two coordinate systems do not rotate relative to each other, so only the offset is considered.
Between the origin O_b and the centroid O_b', m and n are respectively the number of checkerboard squares in the horizontal and vertical directions; D_x and D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate. The three-dimensional conversion relation between the calibration plate centroid coordinate system and the calibration plate plane coordinate system is then

[X_b, Y_b, Z_b]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [m·D_x, n·D_y, 0]^T    (3)
The three-dimensional conversion relation between the camera coordinate system and the sonar coordinate system in step 3 is obtained by combining the camera–calibration-plate relation with the sonar–centroid and centroid–plate relations:

[X_c, Y_c, Z_c]^T = R_{b→c}·([X_s, Y_s, Z_s]^T − [r·sinθ, r·cosθ, 0]^T + [m·D_x, n·D_y, 0]^T) + T_{b→c}    (4)

where R_{b→c} is the orthogonal rotation matrix between the camera and the calibration plate, T_{b→c} is the translation matrix between the camera and the calibration plate, m and n are respectively the number of checkerboard squares between point O_b and point O_b' in the horizontal and vertical directions, D_x and D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate, r is the distance between the sonar and the origin of the calibration plate centroid coordinate system, and θ is the corresponding angle.
The calibration plate exists in the detection visual field range of the underwater camera and the forward-looking sonar at the same time, and the underwater camera and the forward-looking sonar collect underwater optical images and underwater acoustic images of the calibration plate at the same time.
The three-dimensional coordinate conversion relation between the underwater camera and the calibration plate is obtained by the Zhang Zhengyou monocular camera calibration method; the internal and external parameters of the camera are calibrated with the Zhang Zhengyou method to obtain the effective focal length, image principal point coordinates, orthogonal rotation matrix, and translation vector of the camera.
The range, sensitivity, and gain parameters of the forward-looking sonar are set, and the forward-looking sonar receives the echo signal of the calibration plate within range to obtain the azimuth, size, and form information of the calibration plate; noise is filtered from the sonar image by median filtering, the sonar image is segmented by a region-growing method, and the centroid position and angle information of the calibration plate are extracted; the centroid refers to the center of the smallest circumscribed rectangle of the calibration plate target after image segmentation.
The calibration plate is suspended 2–5 meters below the water surface and must keep a certain distance from the surface to prevent reflections from the water surface from influencing the calibration.
A hole is formed in the lower middle part of the calibration plate, and a weight tied with a rope passes through the hole so that the plate stays vertical in the water; the plate is placed directly in front of the underwater camera. The scanning angle of the forward-looking sonar is set to a forward opening-angle range of 180 degrees, the calibration plate lies within the sonar detection range, and the plate centroid and the forward-looking sonar beam lie in the same horizontal plane.
Example 2
1. The model of the underwater camera used in step (1) is Kongsberg. The distance between the calibration plate and the underwater camera is within the underwater optical visibility range, and the calibration plate is placed directly in front of the underwater camera. First, calibration plate images are obtained through the camera and the collected images are preprocessed; an optical image of the calibration plate captured by the underwater camera is shown in Fig. 3. Then a monocular camera coordinate system mathematical model is established, the camera internal parameters are calibrated, and the rotation matrix and translation matrix between the calibration plate coordinate system and the camera coordinate system are calculated by combining the underwater camera internal parameters and the calibration plate corner positions. The established monocular camera coordinate system model is a pinhole approximation model; the internal and external parameters of the camera are calibrated with the Zhang Zhengyou monocular camera calibration method, yielding the internal parameter matrix (effective focal length, image principal point coordinates, etc.), the external parameters at each position (orthogonal rotation matrix, translation vector, etc.), and in particular the orthogonal rotation matrix R_{b→c} and translation matrix T_{b→c} of the calibration plate in the camera coordinate system. The specific flow is shown in Fig. 2.
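The pinhole approximation model mentioned above maps a plate point into pixel coordinates through the intrinsic matrix and the extrinsics. A numpy sketch under illustrative parameters (in practice the real values come out of the Zhang Zhengyou calibration, not the numbers below):

```python
import numpy as np

def project_pinhole(P_b, K, R, T):
    """Project a 3-D point in the calibration-plate frame to pixel
    coordinates with the pinhole model: s*[u, v, 1]^T = K (R P_b + T)."""
    P_c = R @ np.asarray(P_b, dtype=float) + T   # plate frame -> camera frame
    uvw = K @ P_c                                # perspective projection
    return uvw[:2] / uvw[2]                      # normalize by depth

# Illustrative intrinsics: 800 px focal lengths, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
T = np.array([0.0, 0.0, 2.0])   # plate 2 m in front of the camera
uv = project_pinhole([0.1, 0.05, 0.0], K, R, T)
```

With these numbers the corner at (0.1, 0.05, 0) on the plate projects to pixel (360, 260); Zhang's method estimates K, R, and T by observing many such corners across at least 20 plate poses.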
2. The forward-looking sonar used in step (2) is a SeaKing DST dual-frequency digital mechanically scanned forward-looking sonar. The distance between the calibration plate and the forward-looking sonar is beyond the sonar's acoustic blind zone. The forward-looking sonar detection module acquires data using a single-beam mechanically scanned forward-looking sonar. The range, sensitivity, and gain parameters of the forward-looking sonar are set; the sonar receives the calibration plate echo signal within range, obtains the azimuth, size, and form information of the calibration plate, and generates a sonar image; the acoustic image of the calibration plate acquired by the forward-looking sonar is shown in Fig. 4. Noise is filtered from the sonar image by median filtering, the image is segmented by a region-growing method, and the centroid position and angle information of the calibration plate are extracted. The centroid refers to the center (point B) of the smallest circumscribed rectangle (rectangle A) of the calibration plate target after image segmentation. From this, the distance and angle between the sonar and the calibration plate centroid are obtained, giving the conversion relation between the sonar coordinate system and the calibration plate centroid coordinate system. The sonar coordinate system and the calibration plate centroid plane coordinate system do not rotate relative to each other, so only the offset is considered. As can be seen from Fig. 5, the near-point distance, near-point azimuth, centroid distance, centroid azimuth, and the distance and height of the target's detection rectangle can all be computed from the sonar image by a program.
3. In step (2), the calibration plate is suspended 3 meters below the water surface; a hole is formed in the lower middle part of the plate, and a weight tied by a rope passes through the hole to keep the plate vertical in the water. The scanning angle of the forward-looking sonar is set to 180 degrees, the calibration plate lies within the sonar detection range, the plate centroid and the forward-looking sonar beam lie in the same horizontal plane, and the sonar transmit and receive beams are perpendicular to the plate plane. Another calibration plate centroid plane rectangular coordinate system O_b'-X_b'Y_b'Z_b' is established, with its origin at the plate centroid, the O_b'X_b' and O_b'Y_b' axes parallel to the two edges of the plate, and the O_b'Z_b' axis perpendicular to the plate plane. From the existing program, the distance r from the sonar to the plate centroid (i.e., the origin of the centroid coordinate system) and its angle θ can be obtained, thereby establishing the three-dimensional coordinate conversion relation between the forward-looking sonar and the calibration plate coordinate systems.
4. The program for processing sonar data in step (ii) mainly includes the following four parts, and a specific flow is shown in fig. 2.
Generation of sonar images: the forward-looking sonar transmitting array rotates in a step-by-step fashion. When the sonar control system sends a rotation instruction, the sonar head rotates clockwise or counterclockwise by one step angle and transmits one acoustic pulse into the detection area at fixed vertical and horizontal opening angles, then receives and collects the echo data. The forward-looking sonar transmits serial data over the RS-232 protocol.
The sonar sensor receives echo signals from each scanned azimuth. If an obstacle is present in a certain direction, the sound wave returning from that direction fluctuates strongly, whereas the fluctuation from empty background, or from regions occluded by objects, is very small or yields no echo at all. On this principle, the raw sonar image can be generated from the azimuth, distance, and intensity of the collected sonar echoes.
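Assembling the raw image from per-bearing echo vectors can be sketched as follows; the function name and data layout are illustrative, not the sonar's actual serial protocol or SDK:

```python
import numpy as np

def rasterize_scan(beams, n_ranges):
    """Stack per-bearing echo-intensity vectors into a range-by-bearing
    image: each dict entry maps a bearing index to a 1-D echo array."""
    img = np.zeros((n_ranges, len(beams)))
    for col, (bearing, echo) in enumerate(sorted(beams.items())):
        img[:len(echo), col] = echo  # one column per scanned bearing
    return img

# Two beams: a strong echo at range bin 5 on bearing 0, nothing on bearing 1.
beams = {0: np.array([0, 0, 0, 0, 0, 180.0]),
         1: np.zeros(6)}
raw = rasterize_scan(beams, n_ranges=8)
```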
Sonar image preprocessing: the raw sonar image is smoothed and denoised with median filtering. Applying a 3rd-order (3×3) median filter to the raw image effectively suppresses noise while preserving the sharpness of the sonar image.
Extracting target features: the filtered image is segmented by a region-growing method, clustering pixels directly according to their similarity and connectivity.
And calculating target position information and angle information.
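The four parts above can be sketched end to end. The toy pipeline below uses synthetic echo data, a 3×3 median filter, seeded 4-connected region growing, and the axis-aligned bounding-box center as a stand-in for the minimum circumscribed rectangle; the tolerance and the range/bearing scale factors are illustrative assumptions:

```python
from collections import deque
import numpy as np

def median3(img):
    """3x3 median filter (edge rows/cols left unchanged for brevity)."""
    out = img.copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = np.median(img[i-1:i+2, j-1:j+2])
    return out

def region_grow(img, seed, tol=10):
    """Cluster 4-connected pixels whose intensity is within tol of the seed."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    base = float(img[seed])
    q = deque([seed])
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and not mask[ni, nj] \
                    and abs(float(img[ni, nj]) - base) <= tol:
                mask[ni, nj] = True
                q.append((ni, nj))
    return mask

def target_range_bearing(mask, meters_per_row, degrees_per_col):
    """Bounding-box center of the segmented target, converted to (r, theta)."""
    rows, cols = np.nonzero(mask)
    ci = (rows.min() + rows.max()) / 2.0
    cj = (cols.min() + cols.max()) / 2.0
    return ci * meters_per_row, cj * degrees_per_col

# Synthetic 20x20 echo image: a bright 4x4 target plus a speck of noise.
img = np.zeros((20, 20))
img[8:12, 10:14] = 200.0
img[5, 5] = 255.0                        # impulse noise
filtered = median3(img)                  # the isolated speck is suppressed
mask = region_grow(filtered, seed=(9, 11), tol=30)
r, theta = target_range_bearing(mask, meters_per_row=0.1, degrees_per_col=1.0)
```

On this synthetic scene the noise pixel is removed by the median filter and the segmented target's center lands at range 0.95 m, bearing 11.5 degrees under the assumed scale factors.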
5. Of the origins of the relative coordinate systems used by the camera system and the sonar system in step (3), one is at the lower right corner of the calibration plate and the other is at the calibration plate centroid.
6. In step (3), when the camera coordinate system is converted with the calibration plate coordinate system, the origin of the plate coordinate system is the lower right corner of the plate; when the sonar coordinate system is converted with the plate coordinate system, the origin of the plate coordinate system is at the plate centroid. The conversion between the camera coordinate system and the sonar coordinate system is completed through these three-dimensional coordinate system conversions, i.e., the relative positioning of the two systems. Target features can be identified in the camera image, their two-dimensional position information can be obtained in the sonar system, and the relative position relation of the two sensor measurement systems can be obtained. Substituting the sonar–centroid and centroid–plate relations into the camera–plate relation gives the three-dimensional conversion relation between the camera coordinate system and the sonar coordinate system:

[X_c, Y_c, Z_c]^T = R_{b→c}·([X_s, Y_s, Z_s]^T − [r·sinθ, r·cosθ, 0]^T + [m·D_x, n·D_y, 0]^T) + T_{b→c}
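Chaining the three relations gives a single sonar-to-camera mapping. A numpy sketch under the same assumed axis conventions as the reconstructed formulas, with illustrative parameters:

```python
import numpy as np

def sonar_to_camera(p_s, R_bc, T_bc, m, n, Dx, Dy, r, theta_deg):
    """Compose the three transforms:
    sonar -> plate centroid (subtract the [r*sin(t), r*cos(t), 0] offset),
    centroid -> plate corner (add [m*Dx, n*Dy, 0]),
    plate -> camera (apply R_bc, T_bc)."""
    t = np.radians(theta_deg)
    p_bp = np.asarray(p_s, float) - np.array([r * np.sin(t), r * np.cos(t), 0.0])
    p_b = p_bp + np.array([m * Dx, n * Dy, 0.0])
    return R_bc @ p_b + T_bc

# Sanity check: the sonar-frame point at range r, bearing theta IS the plate
# centroid, so it should land at the corner offset mapped into the camera.
p_c = sonar_to_camera(
    [3.0 * np.sin(np.radians(20)), 3.0 * np.cos(np.radians(20)), 0.0],
    R_bc=np.eye(3), T_bc=np.array([0.0, 0.0, 2.0]),
    m=3, n=2, Dx=0.03, Dy=0.03, r=3.0, theta_deg=20.0)
```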
Claims (8)
1. A combined calibration method of an underwater camera and a forward-looking sonar, characterized by comprising the following specific implementation steps:
Step 1, the target optical detection module of the underwater camera acquires at least 20 calibration plate images underwater; the internal parameters of the camera are obtained through calibration; the rotation matrix and translation matrix between the calibration plate coordinate system and the camera coordinate system are calculated from the underwater camera internal parameters and the calibration plate corner positions, and the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system is deduced; the calibration plate plane rectangular coordinate system is O_b-X_bY_bZ_b, with its origin at the lower right corner of the calibration plate, the O_bX_b and O_bY_b axes along the two edges of the plate, and the O_bZ_b axis perpendicular to the plate plane; the camera coordinate system is O_c-X_cY_cZ_c;
Step 2, the acoustic detection module obtains acoustic echo data of the underwater environment through the single-beam mechanically scanned forward-looking sonar, obtains the distance and angle relation between the forward-looking sonar and the calibration plate centroid through data processing, and deduces the three-dimensional coordinate conversion relation between the forward-looking sonar coordinate system and the calibration plate centroid coordinate system; the calibration plate centroid plane rectangular coordinate system is O_b'-X_b'Y_b'Z_b', with its origin at the centroid of the calibration plate, the O_b'X_b' and O_b'Y_b' axes parallel to the two edges of the plate, and the O_b'Z_b' axis perpendicular to the plate plane; the sonar rectangular coordinate system is O_s-X_sY_sZ_s;
And 3, calculating the conversion between the camera coordinate system and the sonar coordinate system through the three-dimensional coordinate conversion relationship from the calibration plate coordinate system to the underwater camera coordinate system and the three-dimensional coordinate conversion relationship from the forward-looking sonar coordinate system to the calibration plate centroid coordinate system, namely completing the combined calibration between the underwater camera and the forward-looking sonar.
2. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: the three-dimensional conversion relation between the camera coordinate system and the calibration plate plane coordinate system in step 1 is

[X_c, Y_c, Z_c]^T = R_{b→c}·[X_b, Y_b, Z_b]^T + T_{b→c}

where R_{b→c} is an orthogonal rotation matrix and T_{b→c} is a translation matrix.
3. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: the three-dimensional conversion relation between the sonar coordinate system and the calibration plate centroid coordinate system in step 2 is

[X_s, Y_s, Z_s]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [r·sinθ, r·cosθ, 0]^T

where r is the distance between the sonar and the origin of the calibration plate centroid coordinate system and θ is the corresponding angle.
4. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: in step 3, according to the three-dimensional coordinate conversion relation from the calibration plate coordinate system to the underwater camera coordinate system and the three-dimensional coordinate conversion relation from the forward-looking sonar coordinate system to the calibration plate centroid coordinate system, the axes of the calibration plate centroid coordinate system are parallel to those of the calibration plate coordinate system, so their rotation matrix is the identity matrix and their translation matrix is [m·D_x, n·D_y, 0]^T; between point O_b and point O_b', m and n are respectively the number of checkerboard squares in the horizontal and vertical directions, and D_x, D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate; the three-dimensional conversion relation between the calibration plate centroid coordinate system and the calibration plate plane coordinate system is then

[X_b, Y_b, Z_b]^T = [X_{b'}, Y_{b'}, Z_{b'}]^T + [m·D_x, n·D_y, 0]^T
5. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: the three-dimensional conversion relation between the camera coordinate system and the sonar coordinate system in step 3 is

[X_c, Y_c, Z_c]^T = R_{b→c}·([X_s, Y_s, Z_s]^T − [r·sinθ, r·cosθ, 0]^T + [m·D_x, n·D_y, 0]^T) + T_{b→c}

where R_{b→c} is the orthogonal rotation matrix between the camera and the calibration plate, T_{b→c} is the translation matrix between the camera and the calibration plate, m and n are respectively the number of checkerboard squares between point O_b and point O_b' in the horizontal and vertical directions, D_x and D_y are respectively the horizontal and vertical physical sizes of one checkerboard square of the calibration plate, r is the distance between the sonar and the origin of the calibration plate centroid coordinate system, and θ is the corresponding angle.
6. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: the calibration plate lies simultaneously within the detection fields of view of the underwater camera and the forward-looking sonar, and the underwater camera and the forward-looking sonar simultaneously collect underwater optical and underwater acoustic images of the calibration plate.
7. The combined calibration method of an underwater camera and a forward-looking sonar according to claim 1, wherein: the three-dimensional coordinate conversion relation between the underwater camera and the calibration plate is obtained by the Zhang Zhengyou monocular camera calibration method; the internal and external parameters of the camera are calibrated with the Zhang Zhengyou method to obtain the effective focal length, image principal point coordinates, orthogonal rotation matrix, and translation vector of the camera.
8. The method for jointly calibrating an underwater camera and a forward-looking sonar according to claim 1, wherein the method comprises the following steps: setting the range, sensitivity and gain parameters of the forward-looking sonar; the forward-looking sonar receives the echo signal of the calibration plate within its range to obtain the azimuth, size and shape information of the calibration plate; noise in the sonar image is filtered by median filtering, the sonar image is segmented by a region growing method, and the centroid position and angle information of the calibration plate are extracted; the centroid refers to the center of the minimum circumscribed rectangle of the calibration plate target after image segmentation.
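The pipeline in this claim can be sketched with SciPy, with two labeled substitutions: connected-component labeling stands in for the patent's region growing, and an axis-aligned bounding rectangle stands in for the minimum circumscribed rectangle:

```python
import numpy as np
from scipy import ndimage

def plate_centroid(sonar_img, threshold=0.5):
    """Sketch of the claim's pipeline: median-filter the sonar image, segment
    it, and take the center of the target's bounding rectangle.
    Substitutions: ndimage.label replaces region growing; an axis-aligned
    box replaces the minimum circumscribed rectangle."""
    filtered = ndimage.median_filter(sonar_img, size=3)   # speckle suppression
    labels, n = ndimage.label(filtered > threshold)       # segmentation
    if n == 0:
        return None
    # Keep the largest region, assumed to be the calibration plate.
    sizes = ndimage.sum(np.ones_like(labels), labels, range(1, n + 1))
    target = labels == (np.argmax(sizes) + 1)
    rows, cols = np.nonzero(target)
    # Center of the bounding rectangle of the segmented target.
    return ((rows.min() + rows.max()) / 2.0, (cols.min() + cols.max()) / 2.0)

img = np.zeros((20, 20))
img[5:10, 8:14] = 1.0          # synthetic bright return from the plate
c = plate_centroid(img)
```

The angle information mentioned in the claim would come from the orientation of the minimum-area rectangle (e.g. OpenCV's `minAreaRect`), which the axis-aligned simplification above does not recover.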
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811532378.0A CN109859271B (en) | 2018-12-14 | 2018-12-14 | Combined calibration method for underwater camera and forward-looking sonar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109859271A CN109859271A (en) | 2019-06-07 |
CN109859271B true CN109859271B (en) | 2022-09-27 |
Family
ID=66891130
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811532378.0A Active CN109859271B (en) | 2018-12-14 | 2018-12-14 | Combined calibration method for underwater camera and forward-looking sonar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109859271B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111487627B (en) * | 2020-04-09 | 2023-04-28 | 广东省文物考古研究所 | Underwater sonar archaeology method, device, equipment and storage medium |
CN112308929B (en) * | 2020-10-28 | 2024-03-15 | 深圳市开成亿科技有限公司 | Underwater camera shooting calibration method, underwater camera shooting calibration system and storage medium |
CN112433219B (en) * | 2020-11-03 | 2024-05-31 | 深圳市汇海潜水工程服务有限公司 | Underwater detection method, system and readable storage medium |
CN114663745B (en) * | 2022-03-04 | 2024-07-02 | 深圳鳍源科技有限公司 | Position locking method of underwater equipment, terminal equipment, system and medium |
CN115100298B (en) * | 2022-08-25 | 2022-11-29 | 青岛杰瑞工控技术有限公司 | Light-sound image fusion method for deep and open sea visual culture |
CN116594080B (en) * | 2023-07-17 | 2023-12-01 | 中国海洋大学 | Underwater target detection system and detection method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101699313A (en) * | 2009-09-30 | 2010-04-28 | 北京理工大学 | Method and system for calibrating external parameters based on camera and three-dimensional laser radar |
CN102042835A (en) * | 2010-11-05 | 2011-05-04 | 中国海洋大学 | Autonomous underwater vehicle combined navigation system |
WO2015024407A1 (en) * | 2013-08-19 | 2015-02-26 | 国家电网公司 | Power-robot-based binocular vision navigation system and method |
CN106228537A (en) * | 2016-07-12 | 2016-12-14 | 北京理工大学 | Combined calibration method for a three-dimensional laser radar and a monocular camera |
- 2018-12-14: CN application CN201811532378.0A filed (patent CN109859271B, active)
Non-Patent Citations (2)
Title |
---|
Joint calibration method for an integrated 2-D and 3-D vision sensing system; Li Lin et al.; Chinese Journal of Scientific Instrument; 2012-11-15 (No. 11); full text *
Research on underwater target positioning based on ranging sonar and optical vision; Zhang Xun et al.; Ship Engineering; 2016-05-15 (No. 05); full text *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109859271B (en) | Combined calibration method for underwater camera and forward-looking sonar | |
CN112669393B (en) | Laser radar and camera combined calibration method | |
CN106248340B (en) | A kind of wind tunnel model 3D ice shape On-line Measuring Method based on 3-D supersonic imaging technology | |
US9532029B2 (en) | 3d scanning laser systems and methods for determining surface geometry of an immersed object in a transparent cylindrical glass tank | |
CN110473260B (en) | Wave video measuring device and method | |
Aykin et al. | Forward-look 2-D sonar image formation and 3-D reconstruction | |
CN107582098B (en) | three-dimensional ultrasonic imaging method for two-dimensional ultrasonic image set reconstruction | |
JP3850541B2 (en) | Advanced measuring device | |
Negahdaripour et al. | On processing and registration of forward-scan acoustic video imagery | |
CN107909624B (en) | Method for extracting and fusing two-dimensional image from three-dimensional tomography | |
JPH05249239A (en) | Three-dimensional measurement and topography imaging sonar | |
CN207908979U (en) | A kind of target identification tracing system of unmanned boat | |
CN114488164B (en) | Synchronous positioning and mapping method for underwater vehicle and underwater vehicle | |
CN105074498A (en) | Ultrasonic diagnostic imaging system with spatial compounding of trapezoidal sector | |
CN112734921B (en) | Underwater three-dimensional map construction method based on sonar and visual image splicing | |
CN109410234A (en) | A kind of control method and control system based on binocular vision avoidance | |
CN110456362B (en) | Target acoustic imaging and speed measuring method and system based on pulse pair emission | |
CN111028337A (en) | Three-dimensional photoacoustic imaging method for improving problem of limited visual angle | |
Tang et al. | Three dimensional height information reconstruction based on mobile active sonar detection | |
CN210534857U (en) | Wave video measuring device | |
JPH04158855A (en) | Ultrasonic image display device | |
JP5082031B2 (en) | Underwater detection apparatus and method capable of calculating fish quantity information of a school of fish | |
Detry et al. | Turbid-water subsea infrastructure 3D reconstruction with assisted stereo | |
Kamgar‐Parsi et al. | High‐resolution underwater acoustic imaging with lens‐based systems | |
CN114187409A (en) | Method for building ship model based on video image and laser radar point cloud fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||