CN116642468B - Above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and unmanned ship - Google Patents

Above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and unmanned ship

Info

Publication number: CN116642468B
Application number: CN202310626682.6A
Authority: CN (China)
Prior art keywords: unmanned aerial vehicle, unmanned ship, data, coordinate system
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN116642468A
Inventors: 赵旭, 耿宝磊, 张华庆, 陈汉宝, 金瑞佳, 王昊, 朱婷婷
Current assignee: Tianjin Research Institute for Water Transport Engineering MOT
Original assignee: Tianjin Research Institute for Water Transport Engineering MOT
Application filed by Tianjin Research Institute for Water Transport Engineering MOT
Priority to CN202310626682.6A
Publication of CN116642468A
Application granted; publication of CN116642468B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying by scanning the object
    • G01C11/04 Interpretation of pictures
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • Y02A90/30 Assessment of water resources (technologies having an indirect contribution to adaptation to climate change)

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, relating to the technical field of breakwater health monitoring, and comprising the following steps: fixedly connecting all sensors carried by the unmanned aerial vehicle and the unmanned ship through a ship-borne rigid stabilized platform; measuring the coordinate value of each sensor in the object coordinate system of the above-water and underwater integrated system; inputting the measured lever-arm values into real-time acquisition software, performing above-water and underwater data acquisition, and obtaining multi-source data; importing the data after the combined positioning and attitude-determination solution into the above-water and underwater integrated system, and converting the acquired multi-source data into a unified local engineering coordinate system; and fusing the multi-source data converted into the same coordinate system to establish a three-dimensional model of the overall appearance of the breakwater. By fusing the multi-source data of the unmanned aerial vehicle and the unmanned ship, the invention can establish a three-dimensional model presenting the overall appearance of wharfs, breakwaters, underwater structures and the like.

Description

Above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and unmanned ship
Technical Field
The invention relates to the technical field of breakwater health monitoring, in particular to an above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship.
Background
As an important component of port infrastructure, a breakwater defends the harbor area against the intrusion of waves, ice, silt, storm surge and the like, and provides a safe and stable sheltered water area for the port, thereby ensuring the stable operation of ships in the port and protecting the safety of the facilities and equipment within it.
The working environment of a breakwater is severe, and damage to breakwaters, revetments, seawalls and other coastal infrastructure caused by storm surge and near-shore typhoon-driven high waves is frequent. Breakwater damage causes huge economic losses and directly affects the safety of personnel in the port and the normal operation of its equipment and facilities, so breakwater inspection and timely monitoring of structural health are of great economic and social significance.
Current breakwater structural health monitoring relies mainly on visual inspection, manual photography, tape measures, levels, total stations, GNSS RTK, diver underwater surveys and other methods, which have several obvious shortcomings: detection efficiency is low, labor intensity is high, and personnel safety risks are high; it is difficult for point- and profile-based results to present the overall appearance of the structure; manual measurement introduces subjective factors, so measurement errors are larger, results are coarse, and the degree of digitization is low; and a GNSS RTK automatic monitoring system is costly (it is only suitable for monitoring upright breakwater bodies with few measuring points, not for the many sloping breakwaters, approach embankments and revetments faced with armor stone).
Based on this understanding of the hazards of breakwater and revetment damage and the shortcomings of existing monitoring technology, it has been proposed to observe the above-water and underwater parts by combining an unmanned aerial vehicle and an unmanned ship, but such detection methods often have the following shortcomings: the above-water and underwater coordinate references are not unified, the above-water and underwater data precision is inconsistent, the above-water and underwater data are discontinuous, data gaps require post-processing work such as interpolation, and operating efficiency is low.
Therefore, above-water and underwater integrated mapping is an innovation upon traditional water-area surveying and mapping methods, and how to propose intelligent inspection and health monitoring solutions for breakwaters, revetments and the like based on above-water and underwater integrated mapping by unmanned aerial vehicle and unmanned ship is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides an above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, in which a three-dimensional model presenting the overall appearance of wharfs, breakwaters, underwater structures and the like can be built by fusing the multi-source data of the unmanned aerial vehicle and the unmanned ship.
In order to achieve the above object, the present invention provides the following technical solution:
An above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship comprises the following steps:
fixedly connecting all sensors carried by the unmanned aerial vehicle and the unmanned ship through a ship-borne rigid stabilized platform;
measuring the coordinate value of each sensor in the object coordinate system of the above-water and underwater integrated system;
inputting the measured lever-arm values into real-time acquisition software, performing above-water and underwater data acquisition, and obtaining multi-source data;
importing the data after the combined positioning and attitude-determination solution into the above-water and underwater integrated system, and converting the acquired multi-source data into a unified local engineering coordinate system;
and fusing the multi-source data converted into the same coordinate system, and establishing a three-dimensional model of the overall appearance of the breakwater.
Optionally, the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multi-beam echosounder and GNSS-IMU integrated navigation equipment;
the laser scanner is carried on the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multi-beam echosounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater top structure;
the GNSS-IMU integrated navigation equipment is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multi-beam echosounder and the 360-degree panoramic camera.
Optionally, converting the image data of the breakwater top structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises the following steps:
acquiring the shooting parameter information corresponding to the moment at which the 360-degree panoramic camera captures the image data, wherein the shooting parameter information comprises the three-dimensional coordinates of the antenna phase center and the sensor attitude angles;
the three-dimensional coordinates of the antenna phase center are denoted (X_C, Y_C, Z_C), and the sensor attitude angles comprise the roll angle φ, the pitch angle θ and the yaw angle ψ; the shooting parameter information is converted into the object coordinate system by composing the following rotation matrices:
R_m^i, the rotation matrix from the object coordinate system to the image coordinate system; R_i^s, the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b, the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n, the rotation matrix from the Earth-centered, Earth-fixed coordinate system to the navigation coordinate system; R_b^n, the rotation matrix from the carrier coordinate system to the navigation coordinate system; and R_n^m, a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system.
Optionally, the method for acquiring the lever-arm values comprises the following steps:
acquiring IMU data and GNSS data at a plurality of sampling moments, and obtaining the prior state information and the IMU state-increment residuals between adjacent GNSS sampling moments;
and constructing an objective function together with the lever-arm values to be optimized, and solving for the lever-arm values when the objective function converges.
Optionally, after acquiring the multi-source data, the method further comprises:
providing a corresponding synchronization signal to each sensor by means of a hardware synchronization controller, recording the corresponding times and establishing a common time-synchronization reference;
performing the base-station and rover data solution using IE (Inertial Explorer) post-processing software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain a standard LAS data file.
Optionally, the point cloud data preprocessing performed on the POS data comprises point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo-signal denoising, range-based denoising and time-based denoising;
point cloud thinning reduces the density of the point cloud data.
Optionally, the process of the combined positioning and attitude-determination solution comprises:
defining the displacement of the unmanned aerial vehicle at the initial moment as 0 mm, whereby its displacement in the X, Y and Z directions at moment n = its coordinate in the X, Y and Z directions at moment n − its coordinate in the X, Y and Z directions at the initial moment;
defining the velocity of the unmanned aerial vehicle at moment n as its average velocity between moments n−1 and n+1, whereby its velocity in the X, Y and Z directions at moment n = (its three-dimensional spatial coordinate in the X, Y and Z directions at moment n+1 − its three-dimensional spatial coordinate in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent moments);
and defining the acceleration of the unmanned aerial vehicle at moment n as its average acceleration between moments n−1 and n+1, whereby its acceleration in the X, Y and Z directions at moment n = (its velocity in the X, Y and Z directions at moment n+1 − its velocity in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent image frames).
Optionally, the control of the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, making the unmanned aerial vehicle fly synchronously above the unmanned ship, and communicatively connecting the unmanned aerial vehicle with the unmanned ship;
and when the remaining battery level of the unmanned aerial vehicle falls below a preset threshold, controlling the unmanned aerial vehicle to reduce its flying height and land on the unmanned ship, which then charges the unmanned aerial vehicle.
Compared with the prior art, the invention provides an above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship: a three-dimensional model presenting the overall appearance of the breakwater is established through fusion of the multi-source data of the unmanned aerial vehicle and the unmanned ship, providing a decision-making basis for practical engineering surveying and mapping; the system is highly integrated, which improves working efficiency; it is unmanned, which ensures the safety of personnel; and with time synchronization, the above-water and underwater data are seamlessly spliced with higher precision.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of the above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and unmanned ship according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention discloses an above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, which, as shown in fig. 1, comprises the following steps:
fixedly connecting all sensors carried by the unmanned aerial vehicle and the unmanned ship through a ship-borne rigid stabilized platform;
measuring the coordinate value of each sensor in the object coordinate system of the above-water and underwater integrated system;
inputting the measured lever-arm values into real-time acquisition software, performing above-water and underwater data acquisition, and obtaining multi-source data;
importing the data after the combined positioning and attitude-determination solution into the above-water and underwater integrated system, and converting the acquired multi-source data into a unified local engineering coordinate system;
and fusing the multi-source data converted into the same coordinate system, and establishing a three-dimensional model of the overall appearance of the breakwater.
Further, the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multi-beam echosounder and GNSS-IMU integrated navigation equipment;
the laser scanner is carried on the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multi-beam echosounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater; the multi-beam echosounder transmits acoustic waves covering a wide sector toward the underwater area, receives them as narrow beams with a receiving transducer array, forms irradiation footprints on the underwater terrain through the orthogonal intersection of the transmitting and receiving sectors, and processes these footprints to obtain point cloud data of the underwater terrain;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater top structure;
the GNSS-IMU integrated navigation equipment is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multi-beam echosounder and the 360-degree panoramic camera.
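The sounding geometry described above can be sketched numerically: each beam's slant range follows from the two-way travel time and the sound speed, and the beam angle resolves it into an across-track offset and a depth. This is a straight-ray, flat-datum sketch under assumed names and an assumed 1500 m/s default sound speed, not a formula from the patent.

```python
import math

def beam_footprint(two_way_time_s, beam_angle_rad, sound_speed=1500.0):
    """Resolve one multi-beam sounding into (across-track offset, depth).

    Straight-ray sketch: slant range = c * t / 2, then project by the
    beam angle measured from the vertical (nadir).
    """
    slant_range = sound_speed * two_way_time_s / 2.0
    across = slant_range * math.sin(beam_angle_rad)
    depth = slant_range * math.cos(beam_angle_rad)
    return across, depth

# A nadir beam (angle 0) converts all of its range into depth:
# 1500 m/s * 0.02 s / 2 = 15 m.
nadir = beam_footprint(0.02, 0.0)
```

In a real survey the ray is refracted by the sound-speed profile of the water column; this sketch only shows why the transmit/receive sector intersection yields one georeferenceable footprint per beam.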
Further, converting the image data of the breakwater top structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises the following steps:
acquiring the shooting parameter information corresponding to the moment at which the 360-degree panoramic camera captures the image data, wherein the shooting parameter information comprises the three-dimensional coordinates of the antenna phase center and the sensor attitude angles;
the three-dimensional coordinates of the antenna phase center are denoted (X_C, Y_C, Z_C), and the sensor attitude angles comprise the roll angle φ, the pitch angle θ and the yaw angle ψ; the shooting parameter information is converted into the object coordinate system by composing the following rotation matrices:
R_m^i, the rotation matrix from the object coordinate system to the image coordinate system; R_i^s, the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b, the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n, the rotation matrix from the Earth-centered, Earth-fixed coordinate system to the navigation coordinate system; R_b^n, the rotation matrix from the carrier coordinate system to the navigation coordinate system; and R_n^m, a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system.
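The chain of rotations listed above can be sketched as follows. The carrier-to-navigation matrix R_b^n is built from the roll, pitch and yaw angles; the sensor mounting matrix, lever arm and antenna position are then composed to georeference a sensor-frame point. The z-y-x Euler convention, the function names and the simplification of taking the navigation frame as the object frame are all assumptions of this sketch, not the patent's exact formula.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def carrier_to_nav(roll, pitch, yaw):
    """Carrier(body)-to-navigation rotation R_b^n from the attitude
    angles (z-y-x Euler convention assumed)."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

def sensor_point_to_object(p_sensor, roll, pitch, yaw,
                           R_sensor_to_carrier, lever_arm, antenna_obj):
    """Map a point from the sensor frame into the object frame by
    chaining the rotations named in the text (navigation frame taken
    as the object frame; lever_arm is the sensor offset in the
    carrier frame, antenna_obj the antenna phase-center position)."""
    R_b_n = carrier_to_nav(roll, pitch, yaw)
    return antenna_obj + R_b_n @ (R_sensor_to_carrier @ p_sensor + lever_arm)
```

With zero attitude angles, an identity mounting matrix and a zero lever arm, a sensor-frame point is simply translated by the antenna position, which is a quick sanity check on the composition order.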
Further, the method for acquiring the lever-arm values comprises the following steps:
acquiring IMU data and GNSS data at a plurality of sampling moments, and obtaining the prior state information and the IMU state-increment residuals between adjacent GNSS sampling moments;
and constructing an objective function together with the lever-arm values to be optimized, and solving for the lever-arm values when the objective function converges.
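The patent does not spell out its objective function, but the idea of estimating a lever arm from synchronized GNSS and IMU states can be illustrated under a deliberately simplified (assumed) model: if p_gnss[k] = p_imu[k] + R_k @ l, where R_k is the carrier-to-navigation attitude at sample k, then stacking the samples yields a linear least-squares problem in the lever arm l.

```python
import numpy as np

def estimate_lever_arm(p_gnss, p_imu, R_body_to_nav):
    """Least-squares lever-arm estimate from N synchronized samples.

    Assumed model: p_gnss[k] = p_imu[k] + R_body_to_nav[k] @ l.
    Stacking the N rotation matrices gives a (3N, 3) system A @ l = b,
    whose least-squares solution minimizes the summed residuals, i.e.
    a linear special case of "objective function converges".
    """
    A = np.vstack(R_body_to_nav)                               # (3N, 3)
    b = (np.asarray(p_gnss) - np.asarray(p_imu)).reshape(-1)   # (3N,)
    l, *_ = np.linalg.lstsq(A, b, rcond=None)
    return l
```

As long as the attitudes vary across the samples, the stacked system has full rank and the lever arm is recovered exactly from noise-free data.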
Further, after acquiring the multi-source data, the method further comprises:
providing a corresponding synchronization signal to each sensor by means of a hardware synchronization controller, recording the corresponding times and establishing a common time-synchronization reference;
performing the base-station and rover data solution using IE (Inertial Explorer) post-processing software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain a standard LAS data file.
Further, the point cloud data preprocessing performed on the POS data comprises point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo-signal denoising, range-based denoising and time-based denoising;
point cloud thinning reduces the density of the point cloud data.
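Point cloud thinning as described above can be illustrated with a simple voxel-grid filter that keeps one surviving point per voxel. This is a generic sketch of the thinning step, not the patent's specific algorithm; the voxel size is an assumed parameter.

```python
import numpy as np

def voxel_thin(points, voxel_size=0.10):
    """Reduce a dense point cloud by keeping one point per voxel.

    Each point is assigned an integer voxel key by flooring its
    coordinates; np.unique then keeps the first point seen per voxel.
    """
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel_size).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return pts[np.sort(first)]
```

Larger voxel sizes thin more aggressively; the surviving points still cover the scanned surface, which is what the later DEM generation needs.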
Further, a linear interpolation method is adopted to interpolate the low-frequency data up to the high-frequency data; the inertial navigation attitude data are imported, and the coordinates and attitude of the center points of the laser scanner and the multi-beam echosounder at each moment are obtained according to the calibration parameters; the initial laser data and sounding data are imported and parsed to obtain the coordinates of each measured point in the sensor coordinate system; and the position and attitude of the sensor at the corresponding moment are looked up by time for each point, yielding the three angles and three translation parameters that form the rotation matrix.
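The interpolation step above can be sketched with NumPy: low-rate POS fixes are interpolated linearly onto the high-rate sensor timestamps, one axis at a time. Function and variable names are illustrative assumptions.

```python
import numpy as np

def interp_positions(t_sensor, t_pos, xyz_pos):
    """Linearly interpolate low-frequency POS positions (t_pos, xyz_pos)
    onto high-frequency sensor timestamps t_sensor, per axis."""
    xyz_pos = np.asarray(xyz_pos, dtype=float)
    return np.column_stack(
        [np.interp(t_sensor, t_pos, xyz_pos[:, k])
         for k in range(xyz_pos.shape[1])]
    )

# POS at 1 Hz, sensor asking for an epoch halfway between two fixes:
t_pos = np.array([0.0, 1.0, 2.0])
xyz = np.array([[0.0, 0.0, 10.0], [4.0, 0.0, 10.0], [8.0, 0.0, 10.0]])
mid = interp_positions(np.array([0.5]), t_pos, xyz)  # x halfway between 0 and 4
```

Attitude angles need more care than positions (wrap-around at ±180 degrees, or quaternion slerp), so a production pipeline would not reuse this routine verbatim for heading.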
Further, the process of the combined positioning and attitude-determination solution comprises:
defining the displacement of the unmanned aerial vehicle at the initial moment as 0 mm, whereby its displacement in the X, Y and Z directions at moment n = its coordinate in the X, Y and Z directions at moment n − its coordinate in the X, Y and Z directions at the initial moment;
defining the velocity of the unmanned aerial vehicle at moment n as its average velocity between moments n−1 and n+1, whereby its velocity in the X, Y and Z directions at moment n = (its three-dimensional spatial coordinate in the X, Y and Z directions at moment n+1 − its three-dimensional spatial coordinate in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent moments);
and defining the acceleration of the unmanned aerial vehicle at moment n as its average acceleration between moments n−1 and n+1, whereby its acceleration in the X, Y and Z directions at moment n = (its velocity in the X, Y and Z directions at moment n+1 − its velocity in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent image frames).
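The velocity and acceleration definitions above are both central differences, (x[n+1] − x[n−1]) / (2·Δt), applied first to the coordinates and then to the resulting velocities. A direct transcription (names are illustrative):

```python
import numpy as np

def central_diff(series, dt):
    """(x[n+1] - x[n-1]) / (2*dt): the averaging defined in the text,
    valid at the interior epochs of a uniformly sampled series."""
    series = np.asarray(series, dtype=float)
    return (series[2:] - series[:-2]) / (2.0 * dt)

dt = 0.5
t = np.arange(0.0, 3.0, dt)   # epochs 0, 0.5, ..., 2.5
x = t ** 2                    # X-coordinate along a test trajectory
v = central_diff(x, dt)       # equals 2*t exactly for a quadratic
a = central_diff(v, dt)       # constant second derivative, 2
```

A central difference is exact for quadratic motion, which makes the test trajectory above a convenient check; each differentiation also shortens the series by one epoch on each end.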
Further, the control of the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, making the unmanned aerial vehicle fly synchronously above the unmanned ship, and communicatively connecting the unmanned aerial vehicle with the unmanned ship; specifically, the wireless communication may use Bluetooth, Wi-Fi, 5G, ZigBee or GPRS;
and when the remaining battery level of the unmanned aerial vehicle falls below a preset threshold, controlling the unmanned aerial vehicle to reduce its flying height and land on the unmanned ship, which then charges the unmanned aerial vehicle.
Specifically, the unmanned aerial vehicle is a rotary-wing unmanned aerial vehicle whose charging port is located on the underside of the fuselage; the unmanned ship comprises a landing platform and an unmanned aerial vehicle take-off, landing and charging device arranged on the landing platform. The take-off, landing and charging device comprises a gimbal arranged on the landing platform, a charging interface arranged on the gimbal, a buffer device, a fixing device and a landing plate. After the unmanned aerial vehicle lands on the platform, the control center rotates the clamping arms until they are above the sliding track and drives the longitudinal clamping motor to move the clamping rod, so that the clamping arms clamp the unmanned aerial vehicle and push it off for take-off.
Further, by constructing a digital elevation model, the elevation information of the breakwater can be observed and analyzed directly: a land-area DEM and a water-area DEM are generated from the ground point cloud data, and the two DEMs are spliced through coordinate registration to obtain a three-dimensional model of the overall appearance of the breakwater.
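Once the two DEMs are registered onto a common georeferenced grid, splicing reduces to filling each cell from whichever model covers it. A minimal mosaic sketch, assuming NaN marks no-data cells and that the grids are already co-registered:

```python
import numpy as np

def splice_dems(dem_land, dem_water):
    """Mosaic two co-registered DEM grids: take the land-area value
    where it exists, otherwise fall back to the water-area value."""
    dem_land = np.asarray(dem_land, dtype=float)
    dem_water = np.asarray(dem_water, dtype=float)
    return np.where(np.isnan(dem_land), dem_water, dem_land)

nan = np.nan
land = np.array([[3.0, 2.5], [nan, nan]])     # laser-scanned slope above water
water = np.array([[nan, nan], [-4.0, -6.5]])  # multi-beam depths below datum
merged = splice_dems(land, water)
```

In the overlap band near the waterline, a real pipeline would blend or prioritize by data quality rather than simply preferring one source, but the cell-wise fill shows why a unified coordinate reference is the prerequisite for a seamless model.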
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. An above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship, characterized by comprising the following steps:
fixedly connecting all sensors carried by the unmanned aerial vehicle and the unmanned ship through a ship-borne rigid stabilized platform;
measuring the coordinate value of each sensor in the object coordinate system of the above-water and underwater integrated system;
inputting the measured lever-arm values into real-time acquisition software, performing above-water and underwater data acquisition, and obtaining multi-source data;
importing the data after the combined positioning and attitude-determination solution into the above-water and underwater integrated system, and converting the acquired multi-source data into a unified local engineering coordinate system;
fusing the multi-source data converted into the same coordinate system, and establishing a three-dimensional model of the overall appearance of the breakwater;
wherein the process of the combined positioning and attitude-determination solution comprises:
defining the displacement of the unmanned aerial vehicle at the initial moment as 0 mm, whereby its displacement in the X, Y and Z directions at moment n = its coordinate in the X, Y and Z directions at moment n − its coordinate in the X, Y and Z directions at the initial moment;
defining the velocity of the unmanned aerial vehicle at moment n as its average velocity between moments n−1 and n+1, whereby its velocity in the X, Y and Z directions at moment n = (its three-dimensional spatial coordinate in the X, Y and Z directions at moment n+1 − its three-dimensional spatial coordinate in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent moments);
and defining the acceleration of the unmanned aerial vehicle at moment n as its average acceleration between moments n−1 and n+1, whereby its acceleration in the X, Y and Z directions at moment n = (its velocity in the X, Y and Z directions at moment n+1 − its velocity in the X, Y and Z directions at moment n−1) / (2 × the time interval between adjacent image frames).
2. The above-water and underwater integrated scanning method based on unmanned aerial vehicle aerial photography and an unmanned ship according to claim 1, wherein the sensors comprise: a laser scanner, a 360-degree panoramic camera, a multi-beam echosounder and GNSS-IMU integrated navigation equipment;
the laser scanner is carried on the unmanned aerial vehicle and the unmanned ship and is used for acquiring spatial information data of the above-water parts of the slopes on both sides of the breakwater;
the multi-beam echosounder is mounted on the unmanned ship and is used for acquiring spatial information data of the underwater parts of the slopes on both sides of the breakwater;
the 360-degree panoramic camera is carried on the unmanned aerial vehicle and is used for acquiring image data of the breakwater top structure;
the GNSS-IMU integrated navigation equipment is carried on the unmanned aerial vehicle and is used for providing positioning, time, attitude and heading information for the laser scanner, the multi-beam echosounder and the 360-degree panoramic camera.
3. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 2, wherein converting the image data of the breakwater top structure acquired by the 360-degree panoramic camera into the object coordinate system specifically comprises:
Acquiring corresponding shooting parameter information when the 360-degree panoramic camera shoots image data, wherein the shooting parameter information comprises an antenna phase center three-dimensional coordinate and a sensor attitude angle;
The antenna phase center three-dimensional coordinate is denoted as (X_C, Y_C, Z_C), and the sensor attitude angles comprise the roll angle φ, the pitch angle θ and the yaw angle ψ; the formula for converting the shooting parameter information into the object coordinate system is as follows:
wherein: R_o^i denotes the rotation matrix from the object coordinate system to the image coordinate system; R_i^s denotes the rotation matrix from the image coordinate system to the sensor coordinate system; R_s^b denotes the rotation matrix from the sensor coordinate system to the carrier coordinate system; R_e^n denotes the rotation matrix from the Earth-centered, Earth-fixed coordinate system to the navigation coordinate system; R_b^n denotes the rotation matrix from the carrier coordinate system to the navigation coordinate system; and R_n^o is a fixed matrix used for the conversion between the navigation coordinate system and the object coordinate system.
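The composite transformation formula itself did not survive extraction, so the following is only a minimal sketch of the kind of rotation chain the listed matrices imply: building the carrier-to-navigation matrix from roll/pitch/yaw and chaining sensor → carrier → navigation → object frames. The ZYX Euler convention, the function names, and the form of the final expression are all assumptions, not the patent's formula:

```python
import numpy as np

def rot_x(a):
    """Elementary rotation about the X axis (roll)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    """Elementary rotation about the Y axis (pitch)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    """Elementary rotation about the Z axis (yaw)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def carrier_to_nav(roll, pitch, yaw):
    """R_b^n: carrier (body) frame to navigation frame, assuming the
    common ZYX (yaw-pitch-roll) convention."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

def sensor_point_to_object(x_s, antenna_xyz, roll, pitch, yaw,
                           R_s_b=np.eye(3), R_n_o=np.eye(3)):
    """Map a sensor-frame point x_s to object coordinates by chaining
    sensor -> carrier (R_s_b) -> navigation (R_b^n from attitude)
    -> object (fixed R_n_o), offset by the antenna phase center."""
    R_b_n = carrier_to_nav(roll, pitch, yaw)
    return np.asarray(antenna_xyz) + R_n_o @ (R_b_n @ (R_s_b @ np.asarray(x_s)))
```

With zero attitude angles and identity mounting matrices the chain reduces to a pure translation by the antenna phase center, which is a convenient correctness check.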
4. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein acquiring the lever arm value comprises the following steps:
acquiring IMU data and GNSS data at a plurality of sampling moments, and obtaining prior state information and IMU state-increment residuals between adjacent GNSS sampling moments;
and constructing an objective function in combination with the lever arm value to be optimized, and solving for the lever arm value at which the objective function converges.
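The patent does not spell out its objective function, so as a hedged stand-in, a simplified linear least-squares version of the idea can be written down: if p_gnss[k] and p_imu[k] are the GNSS-antenna and IMU positions at epoch k and R_b_n[k] the carrier-to-navigation rotation, the lever arm l minimizing the summed residual norm has a closed-form solution (all names here are illustrative; the claimed method is an iterative optimization over residuals, not necessarily this linear form):

```python
import numpy as np

def estimate_lever_arm(p_gnss, p_imu, R_b_n):
    """Estimate the lever arm l (carrier frame) minimizing
        sum_k || p_gnss[k] - (p_imu[k] + R_b_n[k] @ l) ||^2
    by stacking one 3x3 rotation block per epoch into a linear system.

    p_gnss, p_imu : (K, 3) GNSS-antenna and IMU-center positions
    R_b_n         : length-K sequence of (3, 3) rotation matrices
    """
    A = np.concatenate([np.asarray(R) for R in R_b_n], axis=0)    # (3K, 3)
    b = (np.asarray(p_gnss) - np.asarray(p_imu)).reshape(-1)      # (3K,)
    lever, *_ = np.linalg.lstsq(A, b, rcond=None)
    return lever
```

Because the system is linear in l, a handful of epochs with varied attitude already determines the lever arm uniquely, which is why lever-arm calibration flights deliberately include attitude changes.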
5. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein after the multi-source data are acquired, the method further comprises:
providing a corresponding synchronization signal to each sensor by using a hardware synchronization controller, recording the corresponding time, and establishing a time synchronization reference;
performing data solution of the reference station and the rover station by using IE software, and outputting POS data;
and performing point cloud data preprocessing on the POS data to obtain a standard LAS data file.
6. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 5, wherein point cloud data preprocessing is performed on the POS data, the point cloud preprocessing comprising point cloud filtering and point cloud thinning;
the point cloud filtering comprises echo-signal denoising, distance (range) noise removal and time-based denoising;
the point cloud thinning reduces dense point cloud data.
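Two of the steps above, range-noise removal and thinning, can be sketched with plain NumPy (thresholds, voxel size and function names are illustrative assumptions; the patent does not specify its filter parameters):

```python
import numpy as np

def distance_filter(points, origin=(0.0, 0.0, 0.0), r_min=0.5, r_max=200.0):
    """Distance-noise removal: drop returns whose range from the scanner
    origin falls outside [r_min, r_max]."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts - np.asarray(origin), axis=1)
    return pts[(d >= r_min) & (d <= r_max)]

def voxel_thin(points, voxel=0.1):
    """Point cloud thinning: keep one centroid per voxel of edge length
    `voxel`, reducing dense point cloud data while preserving shape."""
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel).astype(np.int64)       # integer voxel index
    _, inv, counts = np.unique(keys, axis=0,
                               return_inverse=True, return_counts=True)
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inv, pts)                           # accumulate per voxel
    return sums / counts[:, None]                       # per-voxel centroid
```

Echo-signal and time-based denoising need the per-return intensity and timestamp attributes carried in the LAS records, so they are omitted from this geometric sketch.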
7. The unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method according to claim 1, wherein controlling the unmanned aerial vehicle and the unmanned ship comprises:
controlling the unmanned ship to navigate along a preset path, making the unmanned aerial vehicle fly above the unmanned ship and follow it synchronously, the unmanned aerial vehicle being communicatively connected with the unmanned ship;
and when the remaining battery charge of the unmanned aerial vehicle falls below a preset threshold, controlling the unmanned aerial vehicle to reduce its flight altitude, land on the unmanned ship, and recharge via the unmanned ship.
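The follow/dock policy of this claim reduces to a simple threshold rule per decision step; a minimal sketch, with the threshold, altitude and all names being assumptions rather than values from the patent:

```python
from dataclasses import dataclass

@dataclass
class UavState:
    battery: float    # remaining charge fraction, 0.0 .. 1.0
    altitude: float   # height above the unmanned ship's deck, m

def coordination_step(uav, usv_waypoint, battery_threshold=0.2,
                      survey_altitude=30.0):
    """One decision step: the UAV tracks the USV's preset-path waypoint
    from above; when its charge drops below the threshold it descends,
    lands on the USV deck and recharges. Returns the chosen action and
    the horizontal target (the USV waypoint)."""
    if uav.battery < battery_threshold:
        uav.altitude = 0.0            # descend and dock on the ship deck
        action = "dock_and_charge"
    else:
        uav.altitude = survey_altitude
        action = "follow"
    return action, usv_waypoint
```

A real controller would add hysteresis around the threshold and a landing approach, but the claim only recites the threshold-triggered descend-and-charge behavior.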
CN202310626682.6A 2023-05-31 2023-05-31 Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method Active CN116642468B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310626682.6A CN116642468B (en) 2023-05-31 2023-05-31 Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method

Publications (2)

Publication Number Publication Date
CN116642468A CN116642468A (en) 2023-08-25
CN116642468B true CN116642468B (en) 2024-05-17

Family

ID=87615079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310626682.6A Active CN116642468B (en) 2023-05-31 2023-05-31 Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method

Country Status (1)

Country Link
CN (1) CN116642468B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117141765B (en) * 2023-10-27 2024-01-26 奥来国信(北京)检测技术有限责任公司 River course aviation photogrammetry device
CN117690194B (en) * 2023-12-08 2024-06-07 北京虹湾威鹏信息技术有限公司 Multi-source AI biodiversity observation method and acquisition system

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101281209A (en) * 2007-04-03 2008-10-08 索尼株式会社 Inertial sensor and electrical or electronic device
JP2012032273A (en) * 2010-07-30 2012-02-16 Ministry Of Land Infrastructure & Transport Hokkaido Regional Development Bureau Harbor structure measuring device
WO2015195939A1 (en) * 2014-06-19 2015-12-23 Westerngeco Llc System and method to acquire ultra-long offset seismic data for full waveform inversion (fwi) using unmanned marine vehicle (umv)
CN105352476A (en) * 2015-11-23 2016-02-24 青岛秀山移动测量有限公司 Shipborne water bank line overwater and underwater integrated measurement system integrated method
CN105444779A (en) * 2015-11-24 2016-03-30 山东科技大学 Field real-time calibration method for shipborne marine and submarine integrated measurement system
WO2017132539A1 (en) * 2016-01-29 2017-08-03 Motion Engine Inc. System and method for determining the position of sensor elements in a sensor array
CN107037880A (en) * 2017-03-02 2017-08-11 深圳前海极客船长网络科技有限公司 Space orientation attitude determination system and its method based on virtual reality technology
CN107121064A (en) * 2017-04-27 2017-09-01 上海华测导航技术股份有限公司 A kind of laser scanner
CN107883932A (en) * 2017-11-16 2018-04-06 国家海洋局第二海洋研究所 A kind of measuring system and method for being applicable islands and reefs and seashore
CN108362201A (en) * 2017-12-25 2018-08-03 中国人民解放军战略支援部队信息工程大学 A kind of navigation sensor parameter calibration method and device based on 3 D laser scanning
CN111311747A (en) * 2020-01-17 2020-06-19 中国水利水电科学研究院 Multi-sensor barrier lake region integrated three-dimensional model rapid construction method
KR102379303B1 (en) * 2021-09-03 2022-03-30 대한민국 A method and system for on-site investigation of a disaster cause using a special vehicle equipped with an unmanned aerial vehicle
CN114296057A (en) * 2021-12-08 2022-04-08 深圳奥锐达科技有限公司 Method, device and storage medium for calculating relative external parameter of distance measuring system
WO2022193106A1 (en) * 2021-03-16 2022-09-22 电子科技大学 Method for fusing gps with laser radar through inertia measurement parameter for positioning
CN115127510A (en) * 2022-06-24 2022-09-30 哈尔滨工业大学 Triphibian three-dimensional unmanned multi-platform linkage landslide intelligent patrol system
CN115290055A (en) * 2022-07-05 2022-11-04 中国科学院烟台海岸带研究所 Coastal zone SBT-DEM construction method based on unmanned aerial vehicle and unmanned ship
CN116026323A (en) * 2022-12-23 2023-04-28 喻光升 Positioning and regional error proofing method for engine oil filling machine
CN116147622A (en) * 2023-03-23 2023-05-23 江苏科技大学 Combined navigation system fusion positioning method based on graph optimization
CN116182802A (en) * 2023-03-13 2023-05-30 水利部交通运输部国家能源局南京水利科学研究院 Method and system for detecting artificial island facing block based on three-dimensional scanning technology

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100152933A1 (en) * 2008-12-11 2010-06-17 Honeywell International Inc. Apparatus and method for unmanned aerial vehicle ground proximity detection, landing and descent

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application and discussion of integrated water-land 3D measurement technology based on UAV-borne laser and USV multibeam; Li Qingsong; 《水利技术监督》 (Technical Supervision in Water Resources); 2021-11-15; pp. 42-45 *

Also Published As

Publication number Publication date
CN116642468A (en) 2023-08-25

Similar Documents

Publication Publication Date Title
CN116642468B (en) Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method
CN108413926B (en) High-precision measurement method for underwater topography elevation of pile foundation of offshore wind farm
CA2908738A1 (en) Underwater platform with lidar and related methods
CN109709574B (en) Seabed microtopography laser scanning imaging system and three-dimensional terrain reconstruction method
CN105352476A (en) Shipborne water bank line overwater and underwater integrated measurement system integrated method
CN105159320A (en) Underwater target detection platform system suitable for complex water area and using method thereof
US11733041B2 (en) Apparatus and method for fault-proof collection of imagery for underwater survey
EP3382335B1 (en) System, method and computer program product for determining a position and/or attitude of an offshore construction
CN108614270B (en) Underwater riprap real-time monitoring system based on three-dimensional point cloud system and working method thereof
CN115127510A (en) Triphibian three-dimensional unmanned multi-platform linkage landslide intelligent patrol system
CN114185079A (en) Underwater three-dimensional detection system
CN109737921A (en) A kind of beach topographic survey method using unmanned plane tracking flowage line
CN114326791A (en) Method and system for synchronously acquiring underwater topography of river and lake surface
KR101339678B1 (en) Calculation method of rock and non-rock area for surveying
CN114046777A (en) Underwater optical imaging system and method suitable for large-range shallow sea coral reef drawing
Calantropio et al. Photogrammetric underwater and UAS surveys of archaeological sites: The case study of the roman shipwreck of Torre Santa Sabina
CN115341592B (en) Underwater robot-based offshore wind power pile foundation scouring detection method and system
US20230400302A1 (en) Systems and methods for measuring water capacity of polar lakes
CN111634416A (en) Unmanned aerial vehicle group-based ship body surveying and mapping facility and working mode
NL2032646B1 (en) Method and system for monitoring local changes of underwater topography
CN113989350B (en) Unmanned ship autonomous exploration and unknown environment three-dimensional reconstruction monitoring system
CN115290055A (en) Coastal zone SBT-DEM construction method based on unmanned aerial vehicle and unmanned ship
CN114442652A (en) Port facility three-dimensional inspection method and system based on air-sea submarine cross-domain collaboration
KR100492020B1 (en) Remotely operated acoustic seabed topology device
CN111650593A (en) Submarine cable laying state probing system for offshore wind farm and working method of submarine cable laying state probing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant