CN113643254B - Efficient collection and processing method for laser point cloud of unmanned aerial vehicle - Google Patents

Efficient collection and processing method for laser point cloud of unmanned aerial vehicle

Info

Publication number
CN113643254B
CN113643254B (application CN202110915054.0A)
Authority
CN
China
Prior art keywords
point cloud
data
processing method
unmanned aerial vehicle
Prior art date
Legal status (assumption, not a legal conclusion): Active
Application number
CN202110915054.0A
Other languages
Chinese (zh)
Other versions
CN113643254A (en)
Inventor
毕超豪
邹伟煜
邓烨恒
宋长青
曾耀强
Current Assignee
Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd filed Critical Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202110915054.0A
Publication of CN113643254A
Application granted
Publication of CN113643254B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/001 Industrial image inspection using an image reference approach
    • G01C 21/165 Inertial navigation (dead reckoning by integrating acceleration or speed) combined with non-inertial navigation instruments
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 19/47 Determining position by combining satellite radio beacon measurements with a supplementary inertial measurement, e.g. tightly coupled inertial
    • G06F 18/24 Classification techniques
    • G06F 18/25 Fusion techniques
    • G06T 7/0006 Industrial image inspection using a design-rule based approach
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 2207/10024 Color image
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Abstract

The invention discloses an efficient collection and processing method for unmanned aerial vehicle laser point clouds. An unmanned aerial vehicle carrying a laser scanner, a GNSS high-precision positioning system, a MEMS inertial navigation system, an IMU system, a high-speed data acquisition and storage system and a full-frame camera is flown along a preset regional track with a planned route, so that high-precision point cloud data are acquired rapidly. Accurate position and attitude information at the exposure time of each nadir image is obtained from the synchronously acquired three-dimensional high-precision laser radar data and imaging technique, together with a registration algorithm between the three-dimensional high-precision laser point cloud and the orthoimage, completing rapid acquisition and precise registration of the DOM product. The method links the whole point cloud acquisition and processing workflow into a programmed pipeline, making it suitable for operation in mountainous areas; it also supports base-station-free operation based on the BeiDou satellite navigation system and a CORS network, and achieves rapid acquisition of true color three-dimensional point cloud data with integrated acquisition, processing, coloring, transmission and browsing functions.

Description

Efficient collection and processing method for laser point cloud of unmanned aerial vehicle
Technical Field
The invention relates to the technical field of data acquisition, in particular to an efficient acquisition and processing method for laser point cloud of an unmanned aerial vehicle.
Background
Laser scanning data are recorded in the form of points. Each point contains three-dimensional coordinates, and some points may also contain color information or reflection intensity information. Besides geometric position, point cloud data may thus carry color information, which is usually obtained by capturing a color image with a camera and assigning the color of the pixel at the corresponding position to the corresponding point in the point cloud. The intensity information is the echo intensity collected by the receiving unit of the laser scanner; it is related to the surface material, roughness and incidence angle of the target, as well as the emission energy of the instrument and the laser wavelength.
Efficient collection and processing of unmanned aerial vehicle laser point clouds has practical significance. Acquiring point cloud data with an unmanned aerial vehicle laser radar in a programmed manner and processing them efficiently can support deeper exploitation of laser radar point cloud applications and, combined with a GIS (geographic information system), enable cloud-based management and control of power transmission lines.
The existing unmanned aerial vehicle laser point cloud collection and processing technology is costly, involves a large amount of manual operation, has a fragmented workflow and low production efficiency, and cannot meet the requirements of large-scale application on power transmission lines.
Disclosure of Invention
The invention aims to provide an efficient collection and processing method for unmanned aerial vehicle laser point clouds that at least alleviates the problems of the prior art noted above: a fragmented data acquisition and processing workflow, low efficiency, high cost, and unsuitability for large-scale acquisition.
In order to achieve this aim, the invention provides the following technical solution: an efficient collection and processing method for unmanned aerial vehicle laser point clouds, comprising the following steps:
S1, flying an unmanned aerial vehicle carrying a laser scanner, a GNSS high-precision positioning system, a MEMS inertial navigation system, an IMU system, a high-speed data acquisition and storage system and a full-frame camera along a preset regional track with a planned route, and rapidly acquiring high-precision point cloud data;
S2, obtaining accurate position and attitude information at the exposure time of the nadir images from the synchronously acquired three-dimensional high-precision laser radar data and imaging technique, together with a registration algorithm between the three-dimensional high-precision laser point cloud and the orthoimage, and completing rapid acquisition and precise registration of the DOM product;
S3, loading the airborne data and the base station data, setting the correct base station coordinates, adopting the global image-control-free function, and performing a post-differential fusion solution of the GNSS data and the IMU system in post-processing software to obtain a high-precision pose result;
S4, preprocessing the raw three-dimensional laser point cloud data and POS data acquired during the flight, and rapidly generating a true color point cloud from the image data afterwards (an illustrative coloring sketch follows step S5 below), thereby saving processing time after the flight and increasing productivity;
S5, after the point cloud is generated, detecting through feature comparison analysis whether the point cloud has problems, automatically generating a quality inspection report and laser data with coordinates, and classifying the laser data.
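As a non-authoritative illustration of the true color coloring mentioned in step S4, the following Python sketch assigns RGB values from a georeferenced digital orthophoto (DOM) to projected point coordinates. The north-up raster layout, the array shapes and all variable and function names are assumptions made only for this example; the invention itself does not prescribe a specific implementation.

```python
import numpy as np

def colorize_points(points_xyz, dom_rgb, origin_x, origin_y, pixel_size):
    """Assign RGB from a north-up DOM raster to points in the same projected CRS.

    points_xyz : (N, 3) array of projected coordinates (X east, Y north, Z up)
    dom_rgb    : (rows, cols, 3) uint8 orthophoto array
    origin_x, origin_y : map coordinates of the raster's top-left corner
    pixel_size : ground sampling distance in meters (square pixels assumed)
    """
    rows, cols, _ = dom_rgb.shape
    col = np.floor((points_xyz[:, 0] - origin_x) / pixel_size).astype(int)
    row = np.floor((origin_y - points_xyz[:, 1]) / pixel_size).astype(int)

    inside = (row >= 0) & (row < rows) & (col >= 0) & (col < cols)
    colors = np.zeros((points_xyz.shape[0], 3), dtype=np.uint8)
    colors[inside] = dom_rgb[row[inside], col[inside]]  # nearest-pixel sampling
    return colors, inside                               # 'inside' flags points off the DOM

# Hypothetical usage: random points over a 100 m x 100 m tile with a 0.1 m GSD DOM
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = np.column_stack([rng.uniform(500000, 500100, 1000),
                           rng.uniform(2500000, 2500100, 1000),
                           rng.uniform(20, 60, 1000)])
    dom = rng.integers(0, 256, size=(1000, 1000, 3), dtype=np.uint8)
    rgb, ok = colorize_points(pts, dom, origin_x=500000.0, origin_y=2500100.0, pixel_size=0.1)
    print(rgb.shape, int(ok.sum()), "points colored")
```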
Preferably, the preprocessing in step S4 may include: decompressing the POS data, generating point cloud data files in LAS format, and converting to the required coordinate system.
Preferably, after decompression the POS data are automatically checked for file corruption and data loss.
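A minimal sketch of such an automatic check is given below. It assumes a plain-text POS file with one record per line whose first column is a GPS time stamp, and a nominal 200 Hz record rate; the file layout, the rate and the function name are illustrative assumptions rather than the actual on-board format.

```python
import numpy as np

def check_pos_file(path, nominal_rate_hz=200.0, gap_factor=3.0):
    """Flag file corruption and data gaps in a decompressed POS trajectory file.

    Assumes whitespace-separated records whose first column is GPS time in seconds.
    Returns (ok, messages).
    """
    messages = []
    try:
        times = np.loadtxt(path, usecols=0)
    except Exception as exc:            # unreadable or corrupted file
        return False, [f"file corrupted or unreadable: {exc}"]

    if times.size < 2:
        return False, ["too few POS records"]

    dt = np.diff(times)
    if np.any(dt <= 0):
        messages.append("time stamps are not strictly increasing")

    expected_dt = 1.0 / nominal_rate_hz
    for i in np.where(dt > gap_factor * expected_dt)[0]:
        messages.append(f"data gap of {dt[i]:.3f} s at t = {times[i]:.3f}")

    return len(messages) == 0, messages
```

If such a check fails, the affected strip can be scheduled for one of the supplementary or repeat flights described in the next preferred scheme.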
Preferably, a supplementary flight or a re-flight is required when any of the following problems arises:
(1) local data records of the POS system are missing;
(2) the equipment is determined, from its evaluation indexes, not to meet the requirements;
(3) local defects in the raw data quality affect the accuracy or density of the point cloud;
(4) the original photographs contain missed shots or large color differences.
Preferably, the criteria used in step S5 for detecting, through feature comparison analysis, whether the point cloud has problems include the following (an illustrative sketch is given after this list):
(1) the point cloud data coverage meets the requirements and contains no holes;
(2) the edge-matching errors between flight strips of the same sortie and of different sorties meet the requirements, i.e. the overlapped point clouds show no obvious ghosting or misalignment, the planimetric error of corresponding points is smaller than the average point spacing, and the elevation error is smaller than 0.3 m;
(3) the point cloud density meets the requirements of distance measurement, modeling and autonomous route planning, with no fewer than 30 points per square meter;
(4) the absolute accuracy of the point cloud data meets the requirements; if control point data exist, the error of comparison against the control points is within 0.3 m.
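The sketch below illustrates how the density, hole and control-point criteria could be evaluated numerically. The 1 m grid cell, the use of scipy's KD-tree and the function name are assumptions for illustration; a production check would also clip the grid to the survey polygon instead of the bounding box.

```python
import numpy as np
from scipy.spatial import cKDTree

def check_point_cloud(xyz, control_xyz=None, min_density=30.0, max_dz=0.3, cell=1.0):
    """Return quality indicators for an (N, 3) point cloud array."""
    report = {}

    # Density per square meter on a regular grid over the bounding box
    ij = np.floor((xyz[:, :2] - xyz[:, :2].min(axis=0)) / cell).astype(int)
    n_i, n_j = ij.max(axis=0) + 1
    counts = np.zeros((n_i, n_j), dtype=int)
    np.add.at(counts, (ij[:, 0], ij[:, 1]), 1)
    # Empty 1 m cells inside the bounding box (holes, or area outside the survey polygon)
    report["empty_cells"] = int((counts == 0).sum())
    report["min_density_ok"] = bool(counts[counts > 0].min() / cell**2 >= min_density)

    # Absolute accuracy against control points, if available
    if control_xyz is not None:
        _, idx = cKDTree(xyz[:, :2]).query(control_xyz[:, :2])
        dz = np.abs(xyz[idx, 2] - control_xyz[:, 2])
        report["control_dz_max"] = float(dz.max())
        report["control_ok"] = bool(dz.max() <= max_dz)

    return report
```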
Preferably, the route planning height requirements are as follows (a terrain-following sketch is given after this list):
(1) the flight height above ground for 500 kV line routes is 250 m;
(2) the flight height above ground for 110 kV and 220 kV line routes is 210 m; 110 kV lines must additionally be checked for crossings and spans, and the flight height raised where required;
(3) the route must follow the undulation of the terrain so as to maintain a constant relative altitude.
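A minimal sketch of terrain-following waypoint heights under these rules follows. The voltage-to-height table reflects the values above, while the sampled terrain profile, the function name and the 300 m cap on raising (taken from the next preferred scheme) are illustrative assumptions.

```python
# Relative flight heights above ground taken from the route planning rules above
RELATIVE_HEIGHT_M = {"500kV": 250.0, "220kV": 210.0, "110kV": 210.0}
MAX_EXTRA_RAISE_M = 300.0  # positive deviation limit when raising the height on demand

def plan_route_heights(terrain_z, voltage, extra_raise=0.0):
    """Absolute waypoint altitudes keeping a constant height above the terrain.

    terrain_z   : ground elevations (m) sampled along the planned route
    voltage     : key into RELATIVE_HEIGHT_M, e.g. "500kV"
    extra_raise : additional height (m) required for crossings/spans on 110 kV lines
    """
    if not 0.0 <= extra_raise <= MAX_EXTRA_RAISE_M:
        raise ValueError("extra raise must stay within the 0-300 m positive deviation")
    h_rel = RELATIVE_HEIGHT_M[voltage] + extra_raise
    return [z + h_rel for z in terrain_z]

# Hypothetical profile: the route crosses a 120 m ridge; relative height stays constant
profile = [35.0, 60.0, 120.0, 90.0, 40.0]
print(plan_route_heights(profile, "110kV", extra_raise=40.0))
```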
Preferably, when the flight height is raised on demand, the positive deviation is not greater than 300 m.
Preferably, the data obtained in step S2 are subjected to filtering, with Kalman filtering used to remove the random errors that occur.
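A minimal scalar Kalman filter sketch for suppressing such random errors is shown below. The random-walk state model, the noise variances and the function name are illustrative assumptions; the method itself does not fix a particular state model.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed with white noise.

    q : process noise variance, r : measurement noise variance.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q            # predict (random-walk state)
        k = p / (p + r)      # Kalman gain
        x = x + k * (z - x)  # update with measurement z
        p = (1.0 - k) * p
        estimates.append(x)
    return np.asarray(estimates)

# Hypothetical check: a noisy constant converges toward its true value of 10.0
rng = np.random.default_rng(1)
print(float(kalman_1d(10.0 + 0.1 * rng.standard_normal(200), r=0.01)[-1]))
```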
The efficient collection and processing method for unmanned aerial vehicle laser point clouds provided by the invention has the following beneficial effects:
The invention links the whole point cloud acquisition and processing workflow into a programmed pipeline. A laser scanner, a BeiDou high-precision positioning system, a MEMS inertial navigation system, a full-frame camera and other components are integrated on a vertical take-off and landing fixed-wing unmanned aerial vehicle, communicate through a cooperative interface, and are combined with an accurate route control interface, so that terrain and conductor following is realized and the method is adapted to operation in mountainous areas. Meanwhile, based on the BeiDou satellite navigation system and a CORS network, real-time kinematic (RTK) processing is realized without setting up reference station points or similar operations; the operation is simple and convenient, the measurement range is not limited by the position of a reference station, a single sortie can cover up to 200 km, the efficiency is high, data products can be generated quickly, and true color three-dimensional point cloud data integrating acquisition, processing, coloring, transmission and browsing functions can be acquired rapidly.
Drawings
Fig. 1 is a structural frame diagram of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Example 1
Referring to fig. 1, the present invention provides a technical solution: an efficient acquisition and processing method for laser point clouds of an unmanned aerial vehicle comprises the following steps:
s1, flying an unmanned aerial vehicle device carrying a laser scanner, a GNSS high-precision positioning system, an MEMS inertial navigation system, an IMU system, a high-speed data acquisition and storage system and a full-frame camera according to a preset area track, and planning a course, so as to quickly acquire high-precision point cloud data and ensure the accuracy and the comprehensiveness of the data;
s2, acquiring accurate exposure time position and attitude information of a forward shot image through three-dimensional high-precision laser radar data and an imaging technology and a three-dimensional high-precision laser point cloud and orthoimage registration algorithm which are synchronously acquired, completing quick acquisition and precision registration of a DOM result, realizing that visible light data can meet precision under the condition of no control, and saving a large amount of manpower and material resources;
s3, loading airborne data and base station data, setting correct base station coordinates, adopting a global image-control-free function, and performing fusion calculation of post-difference of GNSS data and an IMU (inertial measurement unit) system through post-processing software calculation to obtain a high-precision pose result;
and S4, preprocessing the acquired original three-dimensional laser point cloud data and POS data in the flight process, and quickly generating true color point cloud according to the image data in the later period, so that the processing time after the flight is finished is saved, the capacity is increased, and the operating efficiency is improved.
And S5, after point cloud generation, analyzing and detecting whether the point cloud has problems through characteristic comparison, automatically generating a quality programming report and laser data with coordinates, and classifying the laser data to obtain final acquired data.
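As a non-authoritative illustration of step S2, the sketch below interpolates a post-processed trajectory to the camera exposure time stamps to obtain each image's position and attitude. The linear interpolation of roll/pitch/yaw (which ignores heading wrap-around) and the array layouts are simplifying assumptions made only for this example.

```python
import numpy as np

def poses_at_exposures(traj_t, traj_xyz, traj_rpy, exposure_t):
    """Interpolate trajectory positions and roll/pitch/yaw to exposure epochs.

    traj_t     : (N,) increasing GPS times of the post-processed trajectory
    traj_xyz   : (N, 3) positions, traj_rpy : (N, 3) attitudes in degrees
    exposure_t : (M,) camera exposure time stamps
    """
    xyz = np.column_stack([np.interp(exposure_t, traj_t, traj_xyz[:, i]) for i in range(3)])
    rpy = np.column_stack([np.interp(exposure_t, traj_t, traj_rpy[:, i]) for i in range(3)])
    return xyz, rpy  # one (position, attitude) pair per image

# Hypothetical 2 s trajectory at 200 Hz with exposures every 0.5 s
t = np.linspace(0.0, 2.0, 401)
xyz = np.column_stack([t * 10.0, np.zeros_like(t), np.full_like(t, 250.0)])
rpy = np.column_stack([np.zeros_like(t), np.zeros_like(t), t * 1.5])
print(poses_at_exposures(t, xyz, rpy, np.array([0.5, 1.0, 1.5]))[0])
```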
As a preferred scheme, the preprocessing in step S4 may further include decompressing the POS data, generating point cloud data files in LAS format, and converting to the required coordinate system, which saves data processing time and facilitates subsequent processing of the data.
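A minimal sketch of the coordinate conversion and LAS generation is given below, assuming the pyproj and laspy packages are available. The target EPSG code (CGCS2000 / 3-degree Gauss-Kruger, CM 114E) and the output file name are illustrative assumptions, since the required coordinate system depends on the project.

```python
import numpy as np
import laspy
from pyproj import Transformer

def write_projected_las(lon, lat, height, out_path="cloud.las"):
    """Convert WGS 84 geographic coordinates to a projected CRS and write a LAS file."""
    # Illustrative target CRS: CGCS2000 / 3-degree Gauss-Kruger CM 114E (EPSG:4547)
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:4547", always_xy=True)
    x, y = transformer.transform(lon, lat)
    xyz = np.column_stack([x, y, height])

    header = laspy.LasHeader(point_format=3, version="1.2")
    header.offsets = xyz.min(axis=0)               # keep scaled integer coordinates in range
    header.scales = np.array([0.001, 0.001, 0.001])
    las = laspy.LasData(header)
    las.x, las.y, las.z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    las.write(out_path)
    return out_path
```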
As a preferred scheme, after decompression the POS data are automatically checked for file corruption and data loss; this self-checking function means that corrupted data can be re-collected immediately instead of being discovered only during inspection after landing, saving acquisition time when damage occurs.
As a preferred scheme, a supplementary flight or a re-flight is required when the following problems occur: local data records of the POS system are missing; the equipment is determined, from its evaluation indexes, not to meet the requirements; local defects in the raw data quality affect the accuracy or density of the point cloud; or the original photographs contain missed shots or large color differences. This ensures the completeness and accuracy of the data.
As a preferred scheme, the criteria used in step S5 for detecting, through feature comparison analysis, whether the point cloud has problems include: the point cloud data coverage meets the requirements and contains no holes; the edge-matching errors between flight strips of the same sortie and of different sorties meet the requirements, i.e. the overlapped point clouds show no obvious ghosting or misalignment, the planimetric error of corresponding points is smaller than the average point spacing, and the elevation error is smaller than 0.3 m; the point cloud density meets the requirements of distance measurement, modeling and autonomous route planning, with no fewer than 30 points per square meter; and the absolute accuracy of the point cloud data meets the requirements, with the error against control point data, if available, within 0.3 m. These criteria cover the situations that may occur during data acquisition.
As a preferred scheme, the route planning height requirements are as follows: the flight height above ground for 500 kV line routes is 250 m; the flight height above ground for 110 kV and 220 kV line routes is 210 m, 110 kV lines must additionally be checked for crossings and spans, and the flight height raised where required; and the route must follow the undulation of the terrain so as to maintain a constant relative altitude, keeping the acquisition parameters as consistent as possible.
Preferably, when the flight height is raised on demand, the positive deviation is not greater than 300 m.
As a preferred scheme, the data obtained in step S2 are subjected to filtering, with Kalman filtering used to remove the random errors that occur, further improving the accuracy of the data and bringing it closer to the actual values.
Example 2
This embodiment is a further optimization of Embodiment 1. The applicability of Kalman filtering is limited when the system model and the statistical characteristics of the noise are uncertain, and H-infinity filtering can effectively address the problems encountered by Kalman filtering: it offers high estimation accuracy and strong robustness. When the noise is colored noise with unknown statistical characteristics, or the system itself is uncertain, the result of the H-infinity filter is better and more stable than that of the Kalman filter, and its filtering performance remains good. In this case the more robust H-infinity filtering is adopted; its performance is clearly better than that of Kalman filtering, further improving the accuracy of the data and bringing it closer to the actual values. The remaining parts of this embodiment are the same as Embodiment 1 and are not repeated here.
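A minimal scalar sketch contrasting the two filters is given below. The recursion follows the standard discrete-time game-theoretic H-infinity form, and the performance bound theta, the noise values and the random-walk state model are illustrative assumptions (theta = 0 recovers the Kalman filter above).

```python
import numpy as np

def hinf_1d(measurements, q=1e-4, r=1e-2, theta=5.0, x0=0.0, p0=1.0):
    """Scalar discrete-time H-infinity filter for a random-walk state.

    theta is the performance bound; theta -> 0 recovers the Kalman filter.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        d = 1.0 - theta * p + p / r  # must stay positive for the bound to be achievable
        if d <= 0.0:
            raise ValueError("theta too large for the current error covariance")
        k = (p / d) / r              # H-infinity gain
        x = x + k * (z - x)          # measurement update (random-walk state)
        p = p / d + q                # covariance update
        estimates.append(x)
    return np.asarray(estimates)

# Hypothetical comparison on heavy-tailed (non-Gaussian) measurement noise
rng = np.random.default_rng(2)
z = 10.0 + 0.1 * rng.standard_t(df=3, size=300)
print(float(hinf_1d(z, r=0.01, theta=5.0)[-1]))
```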
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. An efficient collection and processing method for laser point clouds of an unmanned aerial vehicle, characterized by comprising the following steps:
S1, flying an unmanned aerial vehicle carrying a laser scanner, a GNSS high-precision positioning system, a MEMS inertial navigation system, an IMU system, a high-speed data acquisition and storage system and a full-frame camera along a preset regional track with a planned route, and rapidly acquiring high-precision point cloud data;
S2, obtaining accurate position and attitude information at the exposure time of the nadir images from the synchronously acquired three-dimensional high-precision laser radar data and imaging technique, together with a registration algorithm between the three-dimensional high-precision laser point cloud and the orthoimage, and completing rapid acquisition and precise registration of the DOM product;
S3, loading the airborne data and the base station data, setting the correct base station coordinates, adopting the global image-control-free function, and performing a post-differential fusion solution of the GNSS data and the IMU system in post-processing software to obtain a high-precision pose result;
S4, preprocessing the raw three-dimensional laser point cloud data and POS data acquired during the flight, and rapidly generating a true color point cloud from the image data afterwards, thereby saving processing time after the flight and increasing productivity;
S5, after the point cloud is generated, detecting through feature comparison analysis whether the point cloud has problems, automatically generating a quality inspection report and laser data with coordinates, and classifying the laser data.
2. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 1, wherein the preprocessing in step S4 comprises: decompressing the POS data, generating point cloud data files in LAS format, and converting to the required coordinate system.
3. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 2, wherein after decompression the POS data are automatically checked for file corruption and data loss.
4. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 3, wherein a supplementary flight or a re-flight is required when the following problems occur:
(1) local data records of the POS system are missing;
(2) the equipment is determined, from its evaluation indexes, not to meet the requirements;
(3) local defects in the raw data quality affect the accuracy or density of the point cloud;
(4) the original photographs contain missed shots or large color differences.
5. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 1, wherein the criteria used in step S5 for detecting, through feature comparison analysis, whether the point cloud has problems include:
(1) the point cloud data coverage meets the requirements and contains no holes;
(2) the edge-matching errors between flight strips of the same sortie and of different sorties meet the requirements, i.e. the overlapped point clouds show no obvious ghosting or misalignment, the planimetric error of corresponding points is smaller than the average point spacing, and the elevation error is smaller than 0.3 m;
(3) the point cloud density meets the requirements of distance measurement, modeling and autonomous route planning, with no fewer than 30 points per square meter;
(4) the absolute accuracy of the point cloud data meets the requirements; if control point data exist, the error of comparison against the control points is within 0.3 m.
6. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 1, wherein the route planning height requirements are as follows:
(1) the flight height above ground for 500 kV line routes is 250 m;
(2) the flight height above ground for 110 kV and 220 kV line routes is 210 m; 110 kV lines must additionally be checked for crossings and spans, and the flight height raised where required;
(3) the route must follow the undulation of the terrain so as to maintain a constant relative altitude.
7. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 6, wherein when the flight height is raised on demand, the positive deviation is not greater than 300 m.
8. The efficient collection and processing method for the laser point cloud of the unmanned aerial vehicle as claimed in claim 1, wherein the data obtained in step S2 are subjected to filtering, with Kalman filtering used to remove the random errors.
CN202110915054.0A 2021-08-10 2021-08-10 Efficient collection and processing method for laser point cloud of unmanned aerial vehicle Active CN113643254B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110915054.0A CN113643254B (en) 2021-08-10 2021-08-10 Efficient collection and processing method for laser point cloud of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110915054.0A CN113643254B (en) 2021-08-10 2021-08-10 Efficient collection and processing method for laser point cloud of unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN113643254A CN113643254A (en) 2021-11-12
CN113643254B true CN113643254B (en) 2023-01-20

Family

ID=78420578

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110915054.0A Active CN113643254B (en) 2021-08-10 2021-08-10 Efficient collection and processing method for laser point cloud of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113643254B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663403B (en) * 2022-03-25 2022-11-18 北京城建设计发展集团股份有限公司 Prefabricated part assembling surface local defect identification method based on dense scanning data
CN115205278B (en) * 2022-08-02 2023-05-02 昆山斯沃普智能装备有限公司 Electric automobile chassis scratch detection method and system
CN117128861A (en) * 2023-10-23 2023-11-28 常州市建筑材料研究所有限公司 Monitoring system and monitoring method for station-removing three-dimensional laser scanning bridge

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111458720A (en) * 2020-03-10 2020-07-28 中铁第一勘察设计院集团有限公司 Airborne laser radar data-based oblique photography modeling method for complex mountainous area
CN113189615A (en) * 2021-03-26 2021-07-30 国家电网有限公司 Method for inspecting power transmission line by using vertical take-off and landing fixed wing unmanned aerial vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wu Duansong, "A Brief Analysis of LIDAR Point Cloud Acquisition and Classification Methods" (LIDAR点云的获取及分类方法浅析), 《地球》 (The Earth), No. 04, 2019-04-08, full text *

Also Published As

Publication number Publication date
CN113643254A (en) 2021-11-12

Similar Documents

Publication Publication Date Title
CN113643254B (en) Efficient collection and processing method for laser point cloud of unmanned aerial vehicle
CN110221311B (en) Method for automatically extracting tree height of high-canopy-closure forest stand based on TLS and UAV
CN110308457B (en) Unmanned aerial vehicle-based power transmission line inspection system
CN108109437B (en) Unmanned aerial vehicle autonomous route extraction and generation method based on map features
Lo Brutto et al. UAV platforms for cultural heritage survey: first results
CN111597666B (en) Method for applying BIM to transformer substation construction process
CN108090957B (en) BIM-based terrain mapping method
Sauerbier et al. The practical application of UAV-based photogrammetry under economic aspects
CN113034470B (en) Asphalt concrete thickness nondestructive testing method based on unmanned aerial vehicle oblique photography technology
CN113012292B (en) AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography
CN108681337B (en) Unmanned aerial vehicle special for inspection of culverts or bridges and unmanned aerial vehicle inspection method
CN109883398A (en) The system and method that the green amount of plant based on unmanned plane oblique photograph is extracted
CN111189433A (en) Karst peak forest landform parameter measuring method based on unmanned aerial vehicle aerial photography
CN112033389A (en) Deformation settlement monitoring method under gully terrain condition
Barrile et al. 3D modeling with photogrammetry by UAVs and model quality verification
CN116129067A (en) Urban live-action three-dimensional modeling method based on multi-source geographic information coupling
CN112711987B (en) Double-laser-radar electric power tower three-dimensional point cloud enhancement system and method
CN108050995B (en) Oblique photography non-image control point aerial photography measurement area merging method based on DEM
CN114078211A (en) Method for intelligently detecting tree lodging based on laser radar
CN104519314A (en) Quick acquisition method of panoramic information of accident site
Amin et al. Reconstruction of 3D accident scene from multirotor UAV platform
CN205176663U (en) System of falling is being fixed a position to unmanned aerial vehicle power line based on machine vision
CN113781639B (en) Quick construction method for digital model of large-scene road infrastructure
Pagliari et al. Use of fisheye parrot bebop 2 images for 3d modelling using commercial photogrammetric software
CN115223030B (en) Pavement disease detection system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant