CN116642468A - Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method - Google Patents
- Publication number
- CN116642468A (application CN202310626682.6A)
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- unmanned
- data
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
- G01C11/025—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, which relates to the technical field of breakwater health monitoring and comprises the following steps: all sensors carried by the unmanned aerial vehicle and the unmanned ship are rigidly connected through a ship-borne rigid stabilized platform; the coordinate values of each sensor are measured in the object coordinate system of the integrated above-water and underwater system; the measured lever-arm values are entered into real-time acquisition software, and above-water and underwater data are collected to obtain multi-source data; the data after combined positioning and attitude determination are imported into the integrated above-water and underwater system, and the acquired multi-source data are converted into a unified local engineering coordinate system; and the multi-source data converted into the same coordinate system are fused to establish a three-dimensional model of the overall appearance of the breakwater. By fusing the multi-source data of the unmanned aerial vehicle and the unmanned ship, the invention can establish three-dimensional models presenting the overall appearance of wharves, breakwaters, underwater structures and the like.
Description
Technical Field
The invention relates to the technical field of breakwater health monitoring, and in particular to an integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship.
Background
As an important component of port infrastructure, a breakwater protects the port area from the intrusion of waves, ice, silt, storm surge and the like, and provides a safe and calm sheltered water area for the port, thereby ensuring the stable operation of ships in the port and protecting the various facilities and equipment in the port.
The working environment of a breakwater is severe, and damage to breakwaters, revetments, seawalls and other coastal infrastructure caused by storm surge and near-shore typhoon waves is frequent. Breakwater damage causes huge economic losses and directly affects the safety of port staff and the normal operation of equipment and facilities, so breakwater inspection and timely monitoring of structural health are of great economic and social significance.
Current breakwater structural health monitoring mainly relies on visual inspection, manual photography, tape measures, levels, total stations, GNSS RTK and diver-based underwater investigation, which have several obvious shortcomings: detection is labor-intensive, inefficient and risky for personnel; point- and profile-based results can hardly present the overall appearance of the structure; manual measurements are subject to human factors, so measurement errors are larger, results are coarse and the degree of digitization is low; and GNSS RTK automatic monitoring systems are expensive, and are only suitable for vertical breakwaters with few monitoring points, not for sloped breakwaters, jetties and revetments faced with armor stones.
Based on the recognized hazard of damage to breakwaters and revetments and the shortcomings of existing monitoring technology, it has been proposed to observe the above-water and underwater parts by combining an unmanned aerial vehicle and an unmanned ship, but such detection methods often suffer from the following shortcomings: the above-water and underwater coordinate references are not unified, the data precision above and below water is inconsistent, the data are discontinuous so that gaps must be filled by post-processing such as interpolation, and the operation efficiency is low.
Therefore, integrated above-water and underwater mapping is an innovation over traditional water-area mapping methods, and how to propose intelligent inspection and health monitoring solutions for breakwaters, revetments and the like based on integrated above-water and underwater mapping by unmanned aerial vehicle and unmanned ship is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides an integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship; by fusing the multi-source data of the unmanned aerial vehicle and the unmanned ship, three-dimensional models presenting the overall appearance of wharves, breakwaters, underwater structures and the like can be established.
In order to achieve the above object, the present invention provides the following technical solutions:
An integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship comprises the following steps:
all sensors carried by the unmanned aerial vehicle and the unmanned ship are rigidly connected through a ship-borne rigid stabilized platform;
the coordinate values of each sensor are measured in the object coordinate system of the integrated above-water and underwater system;
the measured lever-arm values are entered into real-time acquisition software, and above-water and underwater data are collected to obtain multi-source data;
the data after combined positioning and attitude determination are imported into the integrated above-water and underwater system, and the acquired multi-source data are converted into a unified local engineering coordinate system;
and the multi-source data converted into the same coordinate system are fused, and a three-dimensional model of the overall appearance of the breakwater is established.
Optionally, the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multibeam echo sounder and a GNSS-IMU integrated navigation device;
the laser scanner is carried on both the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multibeam echo sounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater crest structure;
the GNSS-IMU integrated navigation device is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multibeam echo sounder and the 360-degree panoramic camera.
Optionally, converting the image data of the breakwater crest structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises the following steps:
acquiring the shooting parameter information corresponding to each image captured by the 360-degree panoramic camera, wherein the shooting parameter information comprises the three-dimensional coordinates of the antenna phase center and the sensor attitude angles;
denoting the three-dimensional coordinates of the antenna phase center as (X_C, Y_C, Z_C), and the sensor attitude angles as the roll angle φ, the pitch angle θ and the yaw angle ψ, the shooting parameter information is converted into the object coordinate system by composing the rotation matrices frame by frame:
R_i^o = R_n^o · R_b^n(φ, θ, ψ) · R_s^b · R_i^s, with R_o^i = (R_i^o)^T,
wherein: R_o^i represents the rotation matrix from the object coordinate system to the image coordinate system; R_i^s represents the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b represents the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n represents the rotation matrix from the earth-centered earth-fixed coordinate system to the navigation coordinate system, which transforms the antenna phase-center coordinates (X_C, Y_C, Z_C) into the navigation coordinate system; R_b^n represents the rotation matrix from the carrier coordinate system to the navigation coordinate system, determined by the attitude angles; and R_n^o is a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system.
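The frame chain described above can be sketched numerically. The snippet below (Python with NumPy) composes the image → sensor → carrier → navigation → object rotations; the ZYX Euler convention and the identity mounting/alignment matrices are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def carrier_to_nav(roll, pitch, yaw):
    """R_b^n built from the IMU attitude angles (ZYX convention assumed)."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

def image_to_object(roll, pitch, yaw,
                    R_i_s=np.eye(3),   # image -> sensor (mounting, assumed identity)
                    R_s_b=np.eye(3),   # sensor -> carrier (mounting, assumed identity)
                    R_n_o=np.eye(3)):  # navigation -> object (fixed, assumed identity)
    """Compose image -> sensor -> carrier -> navigation -> object."""
    return R_n_o @ carrier_to_nav(roll, pitch, yaw) @ R_s_b @ R_i_s
```

Since every factor is a rotation, the composed matrix stays orthonormal, and the object-to-image matrix in the text is simply its transpose.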
Optionally, the method for acquiring the lever-arm values comprises the following steps:
acquiring IMU data and GNSS data at a plurality of sampling instants, and obtaining the a priori state information and the IMU state-increment residuals between adjacent GNSS sampling instants;
and constructing an objective function together with the lever-arm values to be optimized, and solving for the lever-arm values when the objective function converges.
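As a minimal illustration of the lever-arm solve (not the patent's actual objective function, which also incorporates the a priori state information and IMU state-increment residuals), the sketch below estimates a constant lever arm l from the linearized relation p_GNSS,k − p_IMU,k = R_k · l over several epochs:

```python
import numpy as np

def estimate_lever_arm(R_list, p_gnss, p_imu):
    """Least-squares lever arm from p_gnss_k - p_imu_k = R_k @ l."""
    A = np.vstack(R_list)               # (3K, 3) stacked attitude matrices
    b = (p_gnss - p_imu).reshape(-1)    # (3K,) stacked position differences
    lever, *_ = np.linalg.lstsq(A, b, rcond=None)
    return lever

def _rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Synthetic check: recover a known lever arm from two attitudes.
true_l = np.array([0.5, -0.2, 1.0])
R_list = [np.eye(3), _rot_z(0.7)]
p_imu = np.zeros((2, 3))
p_gnss = np.array([R @ true_l for R in R_list])
est = estimate_lever_arm(R_list, p_gnss, p_imu)
```

In a real system the attitudes R_k come from the IMU solution, and the optimization is iterated until the residuals converge.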
Optionally, after the multi-source data are acquired, the method further comprises:
providing a corresponding synchronization signal for each sensor through a hardware synchronization controller, recording the corresponding times, and establishing a time synchronization standard;
performing the data solution of the base station and the rover station with the IE (Inertial Explorer) software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain standard LAS data files.
Optionally, the point cloud data preprocessing performed on the POS data comprises point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo-signal denoising, range-noise removal and temporal denoising;
point cloud thinning reduces the density of the dense point cloud data.
Optionally, the process of the combined positioning and attitude determination solution comprises:
defining the displacement value of the unmanned aerial vehicle at the initial instant as 0 mm; the displacement of the unmanned aerial vehicle in the X, Y and Z directions at instant n then equals its coordinate value at instant n minus its coordinate value at the initial instant in the corresponding direction;
defining the velocity of the unmanned aerial vehicle at instant n as its average velocity between instants n-1 and n+1, i.e. the velocity at instant n in the X, Y or Z direction = (the three-dimensional coordinate at instant n+1 in that direction - the three-dimensional coordinate at instant n-1 in that direction) / (2 × the time interval between adjacent instants);
and defining the acceleration of the unmanned aerial vehicle at instant n as its average acceleration between instants n-1 and n+1, i.e. the acceleration at instant n in the X, Y or Z direction = (the velocity at instant n+1 in that direction - the velocity at instant n-1 in that direction) / (2 × the time interval between adjacent instants).
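The displacement, velocity and acceleration definitions above are plain central differences; a compact sketch, assuming a uniformly sampled trajectory array:

```python
import numpy as np

def displacement(pos):
    """Displacement relative to the initial epoch, as in the text."""
    return pos - pos[0]

def central_velocity(pos, dt):
    """v_n = (p_{n+1} - p_{n-1}) / (2*dt), defined for interior samples."""
    return (pos[2:] - pos[:-2]) / (2.0 * dt)

def central_acceleration(pos, dt):
    """a_n = (v_{n+1} - v_{n-1}) / (2*dt), applied to the velocity series."""
    v = central_velocity(pos, dt)
    return central_velocity(v, dt)
```

For a quadratic trajectory the central difference reproduces the true velocity and acceleration exactly, which makes it a cheap sanity check for the POS solution.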
Optionally, the control of the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, making the unmanned aerial vehicle fly above the unmanned ship synchronously, and keeping the unmanned aerial vehicle and the unmanned ship in communication connection;
and when the remaining battery level of the unmanned aerial vehicle falls below a preset threshold, controlling the unmanned aerial vehicle to reduce its flying height and land on the unmanned ship, and charging the unmanned aerial vehicle through the unmanned ship.
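A skeletal version of this follow-and-recharge logic might look as follows; the `Drone`/`Ship` interfaces and the 25 % threshold are illustrative assumptions, not part of the patent:

```python
LOW_BATTERY = 0.25  # assumed threshold (fraction of full charge)

class Ship:
    def __init__(self):
        self.position = (0.0, 0.0)
        self.docked = None

    def dock_and_charge(self, drone):
        """Secure the UAV on the landing platform and recharge it."""
        self.docked = drone
        drone.battery = 1.0

class Drone:
    def __init__(self, battery=1.0):
        self.battery = battery
        self.position = (0.0, 0.0)
        self.altitude = 30.0

    def follow(self, xy):
        """Stay directly above the ship."""
        self.position = xy

def supervise(drone, ship):
    """One control tick: follow the ship; recover when battery is low."""
    drone.follow(ship.position)
    if drone.battery < LOW_BATTERY:
        drone.altitude = 0.0          # descend to the landing platform
        ship.dock_and_charge(drone)
```

A real implementation would run this as a periodic loop over the vehicles' telemetry links rather than as direct method calls.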
Compared with the prior art, the invention provides an integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship. A three-dimensional model presenting the overall appearance of the breakwater is established through the fusion of the multi-source data of the unmanned aerial vehicle and the unmanned ship, providing a decision basis for practical engineering surveying; the system is highly integrated, which improves working efficiency; the operation is unmanned, which ensures the safety of staff; and thanks to time synchronization, the above-water and underwater data are seamlessly spliced with higher precision.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flow chart of the integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship according to the invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of protection of the present invention.
The embodiment of the invention discloses an integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship; as shown in fig. 1, the method comprises the following steps:
all sensors carried by the unmanned aerial vehicle and the unmanned ship are rigidly connected through a ship-borne rigid stabilized platform;
the coordinate values of each sensor are measured in the object coordinate system of the integrated above-water and underwater system;
the measured lever-arm values are entered into real-time acquisition software, and above-water and underwater data are collected to obtain multi-source data;
the data after combined positioning and attitude determination are imported into the integrated above-water and underwater system, and the acquired multi-source data are converted into a unified local engineering coordinate system;
and the multi-source data converted into the same coordinate system are fused, and a three-dimensional model of the overall appearance of the breakwater is established.
Further, the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multibeam echo sounder and a GNSS-IMU integrated navigation device;
the laser scanner is carried on both the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multibeam echo sounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater; the multibeam echo sounder transmits acoustic waves covering a wide sector towards the underwater area, receives them in narrow beams with a receiving transducer array, forms sounding footprints on the seabed through the orthogonal arrangement of the transmitting and receiving sectors, and processes these footprints to obtain point cloud data of the underwater topography;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater crest structure;
the GNSS-IMU integrated navigation device is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multibeam echo sounder and the 360-degree panoramic camera.
Further, converting the image data of the breakwater crest structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises the following steps:
acquiring the shooting parameter information corresponding to each image captured by the 360-degree panoramic camera, wherein the shooting parameter information comprises the three-dimensional coordinates of the antenna phase center and the sensor attitude angles;
denoting the three-dimensional coordinates of the antenna phase center as (X_C, Y_C, Z_C), and the sensor attitude angles as the roll angle φ, the pitch angle θ and the yaw angle ψ, the shooting parameter information is converted into the object coordinate system by composing the rotation matrices frame by frame:
R_i^o = R_n^o · R_b^n(φ, θ, ψ) · R_s^b · R_i^s, with R_o^i = (R_i^o)^T,
wherein: R_o^i represents the rotation matrix from the object coordinate system to the image coordinate system; R_i^s represents the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b represents the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n represents the rotation matrix from the earth-centered earth-fixed coordinate system to the navigation coordinate system, which transforms the antenna phase-center coordinates (X_C, Y_C, Z_C) into the navigation coordinate system; R_b^n represents the rotation matrix from the carrier coordinate system to the navigation coordinate system, determined by the attitude angles; and R_n^o is a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system.
Further, the method for acquiring the lever-arm values comprises the following steps:
acquiring IMU data and GNSS data at a plurality of sampling instants, and obtaining the a priori state information and the IMU state-increment residuals between adjacent GNSS sampling instants;
and constructing an objective function together with the lever-arm values to be optimized, and solving for the lever-arm values when the objective function converges.
Further, after the multi-source data are acquired, the method further comprises:
providing a corresponding synchronization signal for each sensor through a hardware synchronization controller, recording the corresponding times, and establishing a time synchronization standard;
performing the data solution of the base station and the rover station with the IE (Inertial Explorer) software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain standard LAS data files.
Further, the point cloud data preprocessing performed on the POS data comprises point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo-signal denoising, range-noise removal and temporal denoising;
point cloud thinning reduces the density of the dense point cloud data.
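These preprocessing steps can be illustrated with a small NumPy sketch; the voxel size and range limit are arbitrary example values, and production pipelines typically rely on dedicated LiDAR software rather than code like this:

```python
import numpy as np

def range_filter(points, origin, max_range):
    """Drop returns farther from the sensor than a plausible range
    (a simple form of range-noise removal)."""
    d = np.linalg.norm(points - origin, axis=1)
    return points[d <= max_range]

def voxel_thin(points, voxel=0.05):
    """Keep one point per voxel cell, thinning dense clouds."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]
```

Echo-signal and temporal denoising would add analogous masks over intensity/return number and timestamps.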
Further, linear interpolation is adopted to interpolate the low-frequency data to the timestamps of the high-frequency data; the inertial navigation attitude data are imported, and the coordinates and attitude of the center points of the laser scanner and the multibeam echo sounder at each instant are obtained according to the calibration parameters; the raw laser data and sounding data are imported and parsed to obtain the coordinates of each measured point in the sensor coordinate system; and the position and attitude of the sensor at the corresponding instant are looked up through the timestamp of each point, yielding the three angles and three translation parameters that form the rotation matrix.
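The interpolation step can be sketched as follows; this is a hypothetical example, and a real trajectory would need more careful handling of heading wrap-around than the `np.unwrap` used here:

```python
import numpy as np

def interp_pos_to_sensor(t_sensor, t_pos, xyz, rpy):
    """Linearly interpolate low-rate POS samples (position + attitude)
    onto high-rate sensor timestamps."""
    pos = np.column_stack([np.interp(t_sensor, t_pos, xyz[:, k])
                           for k in range(3)])
    att = np.column_stack([np.interp(t_sensor, t_pos, np.unwrap(rpy[:, k]))
                           for k in range(3)])
    return pos, att
```

Each laser or sounding point is then georeferenced with the pose interpolated at its own timestamp.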
Further, the process of the combined positioning and attitude determination solution comprises:
defining the displacement value of the unmanned aerial vehicle at the initial instant as 0 mm; the displacement of the unmanned aerial vehicle in the X, Y and Z directions at instant n then equals its coordinate value at instant n minus its coordinate value at the initial instant in the corresponding direction;
defining the velocity of the unmanned aerial vehicle at instant n as its average velocity between instants n-1 and n+1, i.e. the velocity at instant n in the X, Y or Z direction = (the three-dimensional coordinate at instant n+1 in that direction - the three-dimensional coordinate at instant n-1 in that direction) / (2 × the time interval between adjacent instants);
and defining the acceleration of the unmanned aerial vehicle at instant n as its average acceleration between instants n-1 and n+1, i.e. the acceleration at instant n in the X, Y or Z direction = (the velocity at instant n+1 in that direction - the velocity at instant n-1 in that direction) / (2 × the time interval between adjacent instants).
Further, the control of the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, making the unmanned aerial vehicle fly above the unmanned ship synchronously, and keeping the unmanned aerial vehicle and the unmanned ship in communication connection; specifically, the wireless communication can adopt Bluetooth, Wi-Fi, 5G, ZigBee or GPRS;
and when the remaining battery level of the unmanned aerial vehicle falls below a preset threshold, controlling the unmanned aerial vehicle to reduce its flying height and land on the unmanned ship, and charging the unmanned aerial vehicle through the unmanned ship.
Specifically, the unmanned aerial vehicle is a rotary-wing unmanned aerial vehicle whose charging port is located on the underside of the fuselage; the unmanned ship comprises a landing platform and an unmanned aerial vehicle take-off, landing and charging device arranged on the landing platform. The take-off, landing and charging device comprises a pan-tilt head arranged on the landing platform, and a charging interface, a buffer device, a fixing device and a landing plate arranged on the pan-tilt head. After the unmanned aerial vehicle has landed on the platform, the control center rotates the clamping arms so that they rise above the sliding track, and drives the longitudinal clamping motor to move the clamping rods so that the clamping arms clamp the unmanned aerial vehicle and push it into position for take-off.
Further, a digital elevation model is constructed so that the elevation information of the breakwater can be directly observed and analyzed: a land-area DEM and a water-area DEM are generated from the ground point cloud data, and the two DEMs are spliced through coordinate registration to obtain a three-dimensional model of the overall appearance of the breakwater.
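On co-registered grids, the splice reduces to filling each cell from whichever DEM has data; a minimal sketch with NaN marking void cells (the grid layout is an assumption, not specified in the patent):

```python
import numpy as np

def splice_dem(land_dem, water_dem):
    """Merge two co-registered DEM grids; NaN marks cells without data.
    Land cells take priority where both grids have values."""
    return np.where(np.isnan(land_dem), water_dem, land_dem)
```

In practice the two grids would first be resampled to a common origin and cell size during coordinate registration, with overlap cells also usable as a cross-check between the above-water and underwater data.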
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts of the embodiments may be referred to one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (8)
1. An integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, characterized by comprising the following steps:
all sensors carried by the unmanned aerial vehicle and the unmanned ship are rigidly connected through a ship-borne rigid stabilized platform;
the coordinate values of each sensor are measured in the object coordinate system of the integrated above-water and underwater system;
the measured lever-arm values are entered into real-time acquisition software, and above-water and underwater data are collected to obtain multi-source data;
the data after combined positioning and attitude determination are imported into the integrated above-water and underwater system, and the acquired multi-source data are converted into a unified local engineering coordinate system;
and the multi-source data converted into the same coordinate system are fused, and a three-dimensional model of the overall appearance of the breakwater is established.
2. The integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship according to claim 1, wherein the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multibeam echo sounder and a GNSS-IMU integrated navigation device;
the laser scanner is carried on both the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multibeam echo sounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater crest structure;
the GNSS-IMU integrated navigation device is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multibeam echo sounder and the 360-degree panoramic camera.
3. The integrated above-water and underwater scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship according to claim 2, wherein converting the image data of the breakwater crest structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises the following steps:
acquiring corresponding shooting parameter information when the 360-degree panoramic camera shoots image data, wherein the shooting parameter information comprises an antenna phase center three-dimensional coordinate and a sensor attitude angle;
the three-dimensional coordinates of the antenna phase center are denoted (X_C, Y_C, Z_C), and the sensor attitude angles comprise a roll angle φ, a pitch angle θ and a yaw angle ψ; the shooting parameter information is converted into the object coordinate system through a chain of rotation matrices,
wherein: R_o^i denotes the rotation matrix from the object coordinate system to the image coordinate system; R_i^s denotes the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b denotes the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n denotes the rotation matrix from the geocentric earth-fixed coordinate system to the navigation coordinate system, computed from (X_C, Y_C, Z_C); R_b^n denotes the rotation matrix from the carrier coordinate system to the navigation coordinate system, computed from (φ, θ, ψ); and R_n^o is a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system; the rotation part of the conversion is composed as R_o^i = (R_i^s)^T (R_s^b)^T (R_b^n)^T (R_n^o)^T, with R_e^n relating the antenna phase center position to the navigation coordinate system for the translation component.
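A minimal numerical sketch of composing the rotation-matrix chain in claim 3. The yaw-pitch-roll (Z-Y-X) rotation order is an assumed convention, and the boresight and fixed matrices are identity placeholders with made-up attitude angles, since the patent gives only the matrix definitions:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def R_carrier_to_nav(roll, pitch, yaw):
    """R_b^n from the IMU attitude angles, assuming a yaw-pitch-roll
    (Z-Y-X) order; the patent does not state its convention."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

# Placeholder calibration / fixed matrices (identity for illustration only).
R_image_to_sensor = np.eye(3)    # from camera calibration
R_sensor_to_carrier = np.eye(3)  # mounting (boresight) matrix
R_nav_to_object = np.eye(3)      # fixed matrix between nav and object frames

R_bn = R_carrier_to_nav(np.radians(2.0), np.radians(-1.5), np.radians(45.0))

# Chain: object -> navigation -> carrier -> sensor -> image
R_object_to_image = (R_image_to_sensor.T @ R_sensor_to_carrier.T
                     @ R_bn.T @ R_nav_to_object.T)
```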
4. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein the lever arm value is acquired through the following steps:
acquiring IMU data and GNSS data at a plurality of sampling moments, and obtaining prior state information and IMU state increment residuals between adjacent GNSS sampling moments;
and constructing an objective function by combining the lever arm values to be optimized, and solving the lever arm values when the objective function converges.
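Because the residual between the GNSS antenna position and the IMU-propagated position is linear in the lever arm, the converged value of an objective function like the one in claim 4 can be illustrated with a single least-squares solve. The attitude matrices, positions and lever arm value below are simulated, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
l_true = np.array([0.30, -0.10, 0.85])  # hypothetical lever arm (m)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Simulated per-epoch attitude matrices R(t) and IMU positions.
Rs = [rot_z(a) for a in np.linspace(0, 2 * np.pi, 50)]
p_imu = rng.normal(size=(50, 3))
p_gnss = np.array([p + R @ l_true for p, R in zip(p_imu, Rs)])

# Objective: sum_t || p_gnss(t) - p_imu(t) - R(t) @ l ||^2.
# Linear in l, so its minimizer comes from one stacked least-squares solve.
A = np.vstack(Rs)                 # (3T, 3)
b = (p_gnss - p_imu).reshape(-1)  # (3T,)
l_est, *_ = np.linalg.lstsq(A, b, rcond=None)
```

In the patent's formulation the objective also includes the prior state information and IMU state increment residuals, which would make it a nonlinear problem solved iteratively; the linear case above only shows the role the lever arm plays in the residual.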
5. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein after the multi-source data are acquired, the method further comprises:
providing corresponding synchronous signals for each sensor by using a hardware synchronous controller, recording corresponding time and establishing a time synchronous standard;
performing data solution of the reference station and the rover station by using IE software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain a standard LAS data file.
6. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 5, wherein the point cloud data preprocessing performed on the POS data comprises point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo signal denoising, distance noise removal and time denoising;
point cloud thinning reduces the density of the dense point cloud data.
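The thinning and distance-noise removal steps of claim 6 can be sketched as below; the voxel size and range limit are arbitrary illustration values, and a real pipeline would add the echo-signal and time denoising filters as well:

```python
import numpy as np

def voxel_thin(points, voxel=0.5):
    """Voxel-grid thinning: keep one representative point (the voxel
    centroid) per occupied cell, reducing dense point cloud data."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    n = inv.max() + 1
    counts = np.bincount(inv, minlength=n).astype(float)
    out = np.empty((n, 3))
    for d in range(3):
        out[:, d] = np.bincount(inv, weights=points[:, d], minlength=n) / counts
    return out

def range_filter(points, origin, max_range):
    """Simple distance-noise removal: drop returns beyond a plausible
    scanner range (one of several filters a real pipeline would apply)."""
    keep = np.linalg.norm(points - origin, axis=1) <= max_range
    return points[keep]
```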
7. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein the combined positioning and attitude determination calculation process comprises the following steps:
defining the displacement value of the unmanned aerial vehicle at the initial moment as 0 mm; then the displacement value of the unmanned aerial vehicle in the X, Y, Z directions at moment n = the coordinate value of the unmanned aerial vehicle in the X, Y, Z directions at moment n − the coordinate value of the unmanned aerial vehicle in the X, Y, Z directions at the initial moment;
defining the speed of the unmanned aerial vehicle at moment n as its average speed between moments n−1 and n+1, wherein the speed of the unmanned aerial vehicle in the X, Y, Z directions at moment n = (the three-dimensional space coordinates of the unmanned aerial vehicle in the X, Y, Z directions at moment n+1 − the three-dimensional space coordinates of the unmanned aerial vehicle in the X, Y, Z directions at moment n−1) / (2 × the time interval between adjacent moments);
and defining the acceleration of the unmanned aerial vehicle at moment n as its average acceleration between moments n−1 and n+1, wherein the acceleration of the unmanned aerial vehicle in the X, Y, Z directions at moment n = (the speed of the unmanned aerial vehicle in the X, Y, Z directions at moment n+1 − the speed of the unmanned aerial vehicle in the X, Y, Z directions at moment n−1) / (2 × the time interval between adjacent moments).
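The central-difference definitions of claim 7 translate directly into code; the quadratic trajectory below is a made-up test signal (for x = t², the central difference reproduces the exact derivative 2t):

```python
import numpy as np

def central_diff(series, dt):
    """Central difference: the rate at moment n computed from the
    samples at moments n-1 and n+1, divided by 2*dt."""
    return (series[2:] - series[:-2]) / (2.0 * dt)

dt = 0.1
t = np.arange(0, 1, dt)
# Synthetic trajectory: x = t^2, y = z = 0.
pos = np.stack([t**2, np.zeros_like(t), np.zeros_like(t)], axis=1)

vel = central_diff(pos, dt)  # speed at moments t[1:-1]
acc = central_diff(vel, dt)  # acceleration at moments t[2:-2]
```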
8. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein controlling the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, enabling the unmanned aerial vehicle to fly synchronously above the unmanned ship, and maintaining a communication connection between the unmanned aerial vehicle and the unmanned ship;
and when the remaining electric quantity of the unmanned aerial vehicle is lower than a preset threshold value, controlling the unmanned aerial vehicle to reduce its flying height and park on the unmanned ship, and charging the unmanned aerial vehicle through the unmanned ship.
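The low-battery docking logic of claim 8 can be sketched with hypothetical `Drone` and `Ship` objects; all class and method names here are invented for illustration and do not come from the patent:

```python
from dataclasses import dataclass

@dataclass
class Ship:
    position: tuple = (0.0, 0.0)
    deck_height: float = 2.0
    charging: bool = False
    def dock_and_charge(self, drone):
        self.charging = True
        drone.docked = True

@dataclass
class Drone:
    battery: float = 1.0          # remaining charge fraction
    survey_altitude: float = 30.0
    altitude: float = 30.0
    docked: bool = False
    target: tuple = (0.0, 0.0)
    def descend_to(self, h): self.altitude = h
    def follow(self, pos, altitude): self.target, self.altitude = pos, altitude

def supervise(drone, ship, threshold=0.25):
    """One control step: land-and-charge when the battery is below the
    preset threshold, otherwise keep flying synchronously above the ship."""
    if drone.battery < threshold:
        drone.descend_to(ship.deck_height)
        ship.dock_and_charge(drone)
    else:
        drone.follow(ship.position, altitude=drone.survey_altitude)
```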
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310626682.6A CN116642468B (en) | 2023-05-31 | 2023-05-31 | Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116642468A true CN116642468A (en) | 2023-08-25 |
CN116642468B CN116642468B (en) | 2024-05-17 |
Family
ID=87615079
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310626682.6A Active CN116642468B (en) | 2023-05-31 | 2023-05-31 | Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116642468B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117141765A (en) * | 2023-10-27 | 2023-12-01 | 奥来国信(北京)检测技术有限责任公司 | River course aviation photogrammetry device |
CN117690194A (en) * | 2023-12-08 | 2024-03-12 | 北京虹湾威鹏信息技术有限公司 | Multi-source AI biodiversity observation method and acquisition system |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101281209A (en) * | 2007-04-03 | 2008-10-08 | Sony Corporation | Inertial sensor and electrical or electronic device |
US20100152933A1 (en) * | 2008-12-11 | 2010-06-17 | Honeywell International Inc. | Apparatus and method for unmanned aerial vehicle ground proximity detection, landing and descent |
JP2012032273A (en) * | 2010-07-30 | 2012-02-16 | Ministry Of Land Infrastructure & Transport Hokkaido Regional Development Bureau | Harbor structure measuring device |
WO2015195939A1 (en) * | 2014-06-19 | 2015-12-23 | Westerngeco Llc | System and method to acquire ultra-long offset seismic data for full waveform inversion (fwi) using unmanned marine vehicle (umv) |
CN105352476A (en) * | 2015-11-23 | 2016-02-24 | 青岛秀山移动测量有限公司 | Shipborne water bank line overwater and underwater integrated measurement system integrated method |
CN105444779A (en) * | 2015-11-24 | 2016-03-30 | 山东科技大学 | Field real-time calibration method for shipborne marine and submarine integrated measurement system |
WO2017132539A1 (en) * | 2016-01-29 | 2017-08-03 | Motion Engine Inc. | System and method for determining the position of sensor elements in a sensor array |
CN107037880A (en) * | 2017-03-02 | 2017-08-11 | 深圳前海极客船长网络科技有限公司 | Space orientation attitude determination system and its method based on virtual reality technology |
CN107121064A (en) * | 2017-04-27 | 2017-09-01 | 上海华测导航技术股份有限公司 | A kind of laser scanner |
CN107883932A (en) * | 2017-11-16 | 2018-04-06 | 国家海洋局第二海洋研究所 | A kind of measuring system and method for being applicable islands and reefs and seashore |
CN108362201A (en) * | 2017-12-25 | 2018-08-03 | 中国人民解放军战略支援部队信息工程大学 | A kind of navigation sensor parameter calibration method and device based on 3 D laser scanning |
CN111311747A (en) * | 2020-01-17 | 2020-06-19 | 中国水利水电科学研究院 | Multi-sensor barrier lake region integrated three-dimensional model rapid construction method |
KR102379303B1 (en) * | 2021-09-03 | 2022-03-30 | 대한민국 | A method and system for on-site investigation of a disaster cause using a special vehicle equipped with an unmanned aerial vehicle |
CN114296057A (en) * | 2021-12-08 | 2022-04-08 | 深圳奥锐达科技有限公司 | Method, device and storage medium for calculating relative external parameter of distance measuring system |
WO2022193106A1 (en) * | 2021-03-16 | 2022-09-22 | University of Electronic Science and Technology of China | Method for fusing gps with laser radar through inertia measurement parameter for positioning |
CN115127510A (en) * | 2022-06-24 | 2022-09-30 | 哈尔滨工业大学 | Triphibian three-dimensional unmanned multi-platform linkage landslide intelligent patrol system |
CN115290055A (en) * | 2022-07-05 | 2022-11-04 | 中国科学院烟台海岸带研究所 | Coastal zone SBT-DEM construction method based on unmanned aerial vehicle and unmanned ship |
CN116026323A (en) * | 2022-12-23 | 2023-04-28 | 喻光升 | Positioning and regional error proofing method for engine oil filling machine |
CN116147622A (en) * | 2023-03-23 | 2023-05-23 | 江苏科技大学 | Combined navigation system fusion positioning method based on graph optimization |
CN116182802A (en) * | 2023-03-13 | 2023-05-30 | 水利部交通运输部国家能源局南京水利科学研究院 | Method and system for detecting artificial island facing block based on three-dimensional scanning technology |
Non-Patent Citations (1)
Title |
---|
LI QINGSONG: "Application and Discussion of Water-Land Integrated 3D Measurement Technology Based on UAV-Borne Laser and USV Multi-Beam Sounding", Water Conservancy Technical Supervision, 15 November 2021 (2021-11-15), pages 42 - 45 *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117141765A (en) * | 2023-10-27 | 2023-12-01 | 奥来国信(北京)检测技术有限责任公司 | River course aviation photogrammetry device |
CN117141765B (en) * | 2023-10-27 | 2024-01-26 | 奥来国信(北京)检测技术有限责任公司 | River course aviation photogrammetry device |
CN117690194A (en) * | 2023-12-08 | 2024-03-12 | 北京虹湾威鹏信息技术有限公司 | Multi-source AI biodiversity observation method and acquisition system |
CN117690194B (en) * | 2023-12-08 | 2024-06-07 | 北京虹湾威鹏信息技术有限公司 | Multi-source AI biodiversity observation method and acquisition system |
Also Published As
Publication number | Publication date |
---|---|
CN116642468B (en) | 2024-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116642468B (en) | Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method | |
CN102495420B (en) | Underwater object precision positioning system and method | |
CN107883932B (en) | Measurement system and method applicable to island and beach | |
CN105352476B (en) | Boat-carrying waterfront line underwater integrated measuring system integrated approach waterborne | |
CN108413926B (en) | High-precision measurement method for underwater topography elevation of pile foundation of offshore wind farm | |
CN109709574B (en) | Seabed microtopography laser scanning imaging system and three-dimensional terrain reconstruction method | |
US11733041B2 (en) | Apparatus and method for fault-proof collection of imagery for underwater survey | |
EP3382335B1 (en) | System, method and computer program product for determining a position and/or attitude of an offshore construction | |
CN115127510A (en) | Triphibian three-dimensional unmanned multi-platform linkage landslide intelligent patrol system | |
US12111155B2 (en) | Systems and methods for measuring water capacity of polar lakes | |
CN112461213B (en) | Multi-mode wave monitoring device and monitoring method | |
NL2032646B1 (en) | Method and system for monitoring local changes of underwater topography | |
CN110672075A (en) | Remote water area detection system and method based on three-dimensional stereo imaging | |
CN114442652B (en) | Port facility three-dimensional inspection method and system based on air-sea diving cross-domain cooperation | |
CN114326791A (en) | Method and system for synchronously acquiring underwater topography of river and lake surface | |
CN115290055A (en) | Coastal zone SBT-DEM construction method based on unmanned aerial vehicle and unmanned ship | |
Calantropio et al. | Photogrammetric underwater and UAS surveys of archaeological sites: The case study of the roman shipwreck of Torre Santa Sabina | |
CN112946685A (en) | Wharf berthing safety monitoring system and method based on three-dimensional laser radar | |
CN115341592B (en) | Underwater robot-based offshore wind power pile foundation scouring detection method and system | |
CN101650426A (en) | Data connection system of synthetic aperture sonar images and method thereof | |
CN111634416A (en) | Unmanned aerial vehicle group-based ship body surveying and mapping facility and working mode | |
CN113989350B (en) | Unmanned ship autonomous exploration and unknown environment three-dimensional reconstruction monitoring system | |
CN215894951U (en) | Data acquisition system capable of simultaneously obtaining overwater and underwater point cloud data | |
CN205872409U (en) | Wall removes and adsorption equipment under water | |
KR100492020B1 (en) | Remotely operated acoustic seabed topology device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||