CN111754535A - Airborne laser point cloud overlapping degree detection method and system

Airborne laser point cloud overlapping degree detection method and system

Info

Publication number
CN111754535A
Authority
CN
China
Prior art keywords
airborne laser
point cloud
boundary
laser point
point
Prior art date
Legal status
Granted
Application number
CN202010577931.3A
Other languages
Chinese (zh)
Other versions
CN111754535B (en)
Inventor
李昊霖
陈琰如
佘毅
Current Assignee
Sichuan Surveying and Mapping Product Quality Supervision and Inspection Station, Ministry of Natural Resources
Original Assignee
Sichuan Surveying and Mapping Product Quality Supervision and Inspection Station, Ministry of Natural Resources
Priority date
Filing date
Publication date
Application filed by Sichuan Surveying and Mapping Product Quality Supervision and Inspection Station, Ministry of Natural Resources
Priority to CN202010577931.3A
Publication of CN111754535A
Application granted
Publication of CN111754535B
Legal status: Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 - Interpretation of pictures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/497 - Means for monitoring or calibrating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/13 - Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a method and a system for detecting the overlap degree of airborne laser point clouds. The detection method comprises the following steps: extracting the boundary point set of each airborne laser point cloud flight strip from the airborne laser point cloud data set; calculating the width of each flight strip; performing inward shrinking by an equal-proportion method or an integral (whole-strip) method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip; and detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect. When an overlap area exists between the effective coverage areas of adjacent strips and their boundary lines do not intersect, the overlap degree is determined to meet the standard; otherwise it does not. The invention improves the efficiency and reliability of overlap degree detection.

Description

Airborne laser point cloud overlapping degree detection method and system
Technical Field
The invention relates to the field of image processing, and in particular to a method and a system for detecting the overlap degree of airborne laser point clouds.
Background
Airborne lidar technology is widely used in the production of digital surface models (DSM), digital elevation models (DEM) and digital orthophoto maps (DOM). However, airborne lidar devices have a small field of view and a low flying height, and data acquisition is strongly affected by terrain relief, wind and specular reflection from ground objects, so areas with substandard strip overlap appear easily and directly affect the usability of the data. The point cloud overlap detection methods currently used along the flight direction, based on human-computer interaction or interval sampling, are inefficient and lack reliability and rigor, and cannot effectively control aerial survey cost and point cloud data quality.
Disclosure of Invention
Therefore, there is a need for an airborne laser point cloud overlap degree detection method and system that improve the efficiency and reliability of overlap degree detection.
To this end, the invention provides the following scheme.
An airborne laser point cloud overlap degree detection method comprises the following steps:
acquiring an airborne laser point cloud data set and the corresponding time information;
extracting the boundary point set of each airborne laser point cloud flight strip from the airborne laser point cloud data set;
screening a set of strip boundary point pairs according to the time information and the strip boundary point set, and calculating the width of each flight strip;
performing inward shrinking by an equal-proportion method or an integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip;
detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect; when an overlap area exists between the effective coverage areas of adjacent strips and the boundary lines of those areas do not intersect, the overlap degree is determined to meet the standard, otherwise it is determined not to meet the standard.
Optionally, the step of detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect, and determining that the overlap degree meets the standard when an overlap area exists and the boundary lines do not intersect, specifically comprises:
determining the adjacency relation of the flight strips within the survey area;
determining, based on the adjacency relation, the overlap area between the effective coverage area of the jth flight strip and the effective coverage area of the (j+1)th flight strip;
judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain a first judgment result;
judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain a second judgment result;
judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain a third judgment result;
judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain a fourth judgment result;
when the overlap area is not empty and the first, second, third and fourth judgment results are all yes, determining that the overlap degree meets the standard; otherwise determining that it does not.
Optionally, the step of judging whether the left boundary line of the effective coverage area of the jth flight strip does not intersect the left boundary line of the effective coverage area of the (j+1)th flight strip, to obtain the first judgment result, specifically comprises:
determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the jth strip to obtain a first inward-shrunk boundary point set, and determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the (j+1)th strip to obtain a second inward-shrunk boundary point set;
selecting one boundary point from each of the first and second inward-shrunk boundary point sets to form a boundary point pair, traversing both sets, and taking the set of point pairs that satisfy a set determination condition as the set of point pairs to be evaluated;
selecting one point pair from the set of point pairs to be evaluated, where the first point to be evaluated is a boundary point of the first inward-shrunk boundary point set and the second point to be evaluated is the corresponding boundary point of the second inward-shrunk boundary point set;
selecting the nodes adjacent to the first point to be evaluated in the first inward-shrunk boundary point set to obtain a third and a fourth point to be evaluated, and selecting the nodes adjacent to the second point to be evaluated in the second inward-shrunk boundary point set to obtain a fifth and a sixth point to be evaluated;
selecting the nodes adjacent to the third and fourth points to be evaluated in the first inward-shrunk boundary point set to obtain a seventh and an eighth point to be evaluated, and selecting the nodes adjacent to the fifth and sixth points to be evaluated in the second inward-shrunk boundary point set to obtain a ninth and a tenth point to be evaluated;
connecting the seventh, third, first, fourth and eighth points to be evaluated in sequence to obtain four straight line segments corresponding to the first inward-shrunk boundary point set, and connecting the ninth, fifth, second, sixth and tenth points to be evaluated in sequence to obtain four straight line segments corresponding to the second inward-shrunk boundary point set;
judging whether every straight line segment corresponding to the first inward-shrunk boundary point set does not intersect any straight line segment corresponding to the second inward-shrunk boundary point set, to obtain the first judgment result.
Optionally, the set determination condition is that the plane distance between the two boundary points of the pair is smaller than the threshold S_T, i.e.
√((x_k − x_m)² + (y_k − y_m)²) < S_T,
where x_k and y_k are the abscissa and ordinate of the boundary point selected from the first inward-shrunk boundary point set, x_m and y_m are the abscissa and ordinate of the boundary point selected from the second inward-shrunk boundary point set, k denotes the kth node of the first inward-shrunk boundary point set, m denotes the mth node of the second inward-shrunk boundary point set, and S_T is a plane-coordinate distance threshold.
Optionally, determining the adjacency relation of the flight strips within the survey area specifically comprises:
when the names of the flight strips within the survey area are associated with their adjacency, determining the adjacency relation of the flight strips according to the names;
when the names of the flight strips within the survey area are not associated with their adjacency, calculating the geometric center coordinates of each flight strip polygon and determining the adjacency relation of the flight strips according to the distances between the geometric center coordinates.
Optionally, performing inward shrinking by the equal-proportion method to obtain the effective coverage area of each flight strip specifically comprises:
calculating, from the strip boundary point pair set and the standard value of the minimum point cloud strip overlap ratio, the position of each laser point pair of the strip boundary point pair set after equal-proportion inward shrinking;
arranging, at those positions, the left boundary points of the inward-shrunk strip boundary point pairs in ascending order of their acquisition time attribute values to obtain a first sequence;
arranging, at those positions, the right boundary points of the inward-shrunk strip boundary point pairs in descending order of their acquisition time attribute values to obtain a second sequence;
connecting the first sequence and the second sequence in order to generate the effective coverage area of the flight strip.
Optionally, performing inward shrinking by the integral method to obtain the effective coverage area of each flight strip specifically comprises:
arranging the left boundary points of the strip boundary point set in ascending order of their acquisition time attribute values to obtain a third sequence;
arranging the right boundary points of the strip boundary point set in descending order of their acquisition time attribute values to obtain a fourth sequence;
connecting the third sequence and the fourth sequence in order to generate a polygonal area;
calculating the buffer value of each laser point cloud flight strip from the strip width and the standard value of the minimum point cloud strip overlap ratio;
shrinking the polygonal area inward by the buffer value to obtain the effective coverage area of the flight strip.
Optionally, extracting the boundary point set of each flight strip from the airborne laser point cloud data set specifically comprises:
determining, as the strip boundary point set, the set of laser points whose flight-line-edge attribute value is "yes" and whose scan angle attribute value has an absolute value greater than a set angle threshold; within the strip boundary point set, laser points with a negative scan angle attribute value are left boundary points and laser points with a positive scan angle attribute value are right boundary points.
Optionally, screening the strip boundary point pair set from the time information and the strip boundary point set, and calculating the flight strip width, specifically comprises:
determining, as the strip boundary point pair set, the set of boundary point pairs for which a left boundary point and a right boundary point of the strip boundary point set have the smallest acquisition-time interval and that interval is smaller than a set time threshold;
calculating the plane Euclidean distance of each boundary point pair in the strip boundary point pair set to obtain the strip width set;
determining the maximum, minimum and average width of the strip width set to obtain the width of the flight strip.
The invention also provides an airborne laser point cloud overlap degree detection system, comprising:
a data acquisition module, configured to acquire the airborne laser point cloud data set and the corresponding time information;
a boundary point extraction module, configured to extract the boundary point set of each flight strip from the airborne laser point cloud data set;
a first calculation module, configured to screen the strip boundary point pair set according to the time information and the strip boundary point set, and to calculate the flight strip width;
a second calculation module, configured to perform inward shrinking by the equal-proportion method or the integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip;
an overlap degree detection module, configured to detect whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect, and to determine that the overlap degree meets the standard when an overlap area exists and the boundary lines do not intersect, and otherwise that it does not.
Compared with the prior art, the invention has the following beneficial effects:
The invention provides a method and a system for detecting the overlap degree of airborne laser point clouds: the boundary point set of each flight strip is extracted from the airborne laser point cloud data set; the strip width is calculated; inward shrinking is performed by the equal-proportion method or the integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip; and the overlap area and the boundary line intersection of the effective coverage areas of adjacent strips are detected, the overlap degree being determined to meet the standard when an overlap area exists and the boundary lines do not intersect, and otherwise not. The method automatically extracts the irregular vector boundary and the effective coverage area of each flight strip, makes strip overlap detection automated, integrated, efficient and visual, and improves the efficiency and reliability of overlap degree detection.
Drawings
In order to explain the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the invention, and other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a flowchart of an airborne laser point cloud overlap detection method according to an embodiment of the present invention;
FIG. 2 is a diagram of the effective coverage area of a laser point cloud flight strip according to the present invention;
FIG. 3 is a diagram of a laser point cloud flight strip boundary according to the present invention;
FIG. 4 is a logic diagram of strip overlap detection according to the present invention;
fig. 5 is a structural diagram of an airborne laser point cloud overlapping degree detection system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a flowchart of a method for detecting an overlap degree of airborne laser point clouds according to an embodiment of the present invention.
Referring to fig. 1, the airborne laser point cloud overlap degree detection method in this embodiment comprises:
Step 101: acquiring an airborne laser point cloud data set and the corresponding time information.
Step 102: extracting the boundary point set of each airborne laser point cloud flight strip from the airborne laser point cloud data set.
Step 103: screening the strip boundary point pair set according to the time information and the strip boundary point set, and calculating the flight strip width.
Step 104: performing inward shrinking by the equal-proportion method or the integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip.
Step 105: detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect; when an overlap area exists between the effective coverage areas of adjacent strips and the boundary lines of those areas do not intersect, the overlap degree is determined to meet the standard, otherwise it is determined not to meet the standard.
Step 105 specifically comprises:
1) Determining the adjacency relation of the flight strips within the survey area. This specifically comprises:
when the names of the flight strips within the survey area are associated with their adjacency, determining the adjacency relation of the flight strips according to the names;
when the names of the flight strips within the survey area are not associated with their adjacency, calculating the geometric center coordinates of each flight strip polygon and determining the adjacency relation of the flight strips according to the distances between the geometric center coordinates.
2) Determining, based on the adjacency relation, the overlap area between the effective coverage area of the jth flight strip and the effective coverage area of the (j+1)th flight strip.
3) Judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain the first judgment result. This specifically comprises:
31) Determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the jth strip to obtain the first inward-shrunk boundary point set, and determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the (j+1)th strip to obtain the second inward-shrunk boundary point set.
32) Selecting one boundary point from each of the first and second inward-shrunk boundary point sets to form a boundary point pair, traversing both sets, and taking the set of point pairs that satisfy the set determination condition as the set of point pairs to be evaluated. The set determination condition is that the plane distance between the two boundary points of the pair is smaller than the threshold S_T, i.e.
√((x_k − x_m)² + (y_k − y_m)²) < S_T,
where x_k and y_k are the abscissa and ordinate of the boundary point selected from the first inward-shrunk boundary point set, x_m and y_m are the abscissa and ordinate of the boundary point selected from the second inward-shrunk boundary point set, k denotes the kth node of the first set, m denotes the mth node of the second set, and S_T is a plane-coordinate distance threshold.
33) Selecting one point pair from the set of point pairs to be evaluated; the first point to be evaluated is a boundary point of the first inward-shrunk boundary point set and the second point to be evaluated is the corresponding boundary point of the second inward-shrunk boundary point set.
34) Selecting the nodes adjacent to the first point to be evaluated in the first inward-shrunk boundary point set to obtain the third and fourth points to be evaluated, and selecting the nodes adjacent to the second point to be evaluated in the second inward-shrunk boundary point set to obtain the fifth and sixth points to be evaluated.
35) Selecting the nodes adjacent to the third and fourth points to be evaluated in the first inward-shrunk boundary point set to obtain the seventh and eighth points to be evaluated, and selecting the nodes adjacent to the fifth and sixth points to be evaluated in the second inward-shrunk boundary point set to obtain the ninth and tenth points to be evaluated.
36) Connecting the seventh, third, first, fourth and eighth points to be evaluated in sequence to obtain the four straight line segments corresponding to the first inward-shrunk boundary point set, and connecting the ninth, fifth, second, sixth and tenth points to be evaluated in sequence to obtain the four straight line segments corresponding to the second inward-shrunk boundary point set.
37) Judging whether every straight line segment corresponding to the first inward-shrunk boundary point set does not intersect any straight line segment corresponding to the second inward-shrunk boundary point set, to obtain the first judgment result.
4) Judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain the second judgment result. The procedure is the same as in step 3).
5) Judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain the third judgment result. The procedure is the same as in step 3).
6) Judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain the fourth judgment result. The procedure is the same as in step 3).
7) When the overlap area is not empty and the first, second, third and fourth judgment results are all yes, determining that the overlap degree meets the standard; otherwise determining that it does not.
In step 104, performing inward shrinking by the equal-proportion method to obtain the effective coverage area of each flight strip specifically comprises:
calculating, from the strip boundary point pair set and the standard value of the minimum point cloud strip overlap ratio, the position of each laser point pair of the strip boundary point pair set after equal-proportion inward shrinking;
arranging, at those positions, the left boundary points of the inward-shrunk strip boundary point pairs in ascending order of their acquisition time attribute values to obtain the first sequence;
arranging, at those positions, the right boundary points of the inward-shrunk strip boundary point pairs in descending order of their acquisition time attribute values to obtain the second sequence;
connecting the first sequence and the second sequence in order to generate the effective coverage area of the flight strip.
In step 104, performing inward shrinking by the integral method to obtain the effective coverage area of each flight strip specifically comprises:
arranging the left boundary points of the strip boundary point set in ascending order of their acquisition time attribute values to obtain the third sequence;
arranging the right boundary points of the strip boundary point set in descending order of their acquisition time attribute values to obtain the fourth sequence;
connecting the third sequence and the fourth sequence in order to generate a polygonal area;
calculating the buffer value of each laser point cloud flight strip from the strip width and the standard value of the minimum point cloud strip overlap ratio;
shrinking the polygonal area inward by the buffer value to obtain the effective coverage area of the flight strip.
Step 102 specifically comprises:
determining, as the strip boundary point set, the set of laser points whose flight-line-edge attribute value is "yes" and whose scan angle attribute value has an absolute value greater than a set angle threshold; within the strip boundary point set, laser points with a negative scan angle attribute value are left boundary points and laser points with a positive scan angle attribute value are right boundary points.
Step 103 specifically comprises:
determining, as the strip boundary point pair set, the set of boundary point pairs for which a left boundary point and a right boundary point of the strip boundary point set have the smallest acquisition-time interval and that interval is smaller than a set time threshold;
calculating the plane Euclidean distance of each boundary point pair in the strip boundary point pair set to obtain the strip width set;
determining the maximum, minimum and average width of the strip width set to obtain the width of the flight strip.
In practical application, the airborne laser point cloud overlap degree detection method of this embodiment is implemented as follows:
step 1: and traversing the laser point cloud data set, and extracting a boundary point set of the airborne laser point cloud navigation band based on the attribute values of the 'route edge' and the 'scanning angle' of the airborne laser point.
Firstly, traversing an airborne laser point cloud data set, and when the attribute value of the 'route edge' of an airborne laser point is 'yes' and the absolute value of the attribute value of the 'scanning angle' is larger than a set angle threshold AminThen, the point is extracted and put into a 'boundary point set' P ═ P of the navigation bandi}i=1,NIn the middle, if the extracted boundary points are more and dense, the "boundary point set" P can be thinned based on the acquisition time of the laser point, and generally aminMay be set to 5.
Then, classifying the boundary point set P into a left boundary point set P according to the positive and negative of the scanning angle attribute value of the point cloudLAnd "Right set of boundary points" PRIn general, a laser spot with a negative "scan angle" attribute value can be classified as a "left side boundary point set" PLThe laser points with positive attribute value of scan angle are classified into the right side boundary point set PR
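A minimal Python sketch of step 1 follows. The LaserPoint container and its field names (gps_time, scan_angle, flightline_edge) are illustrative stand-ins for the point attributes described above, not names defined by the patent.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LaserPoint:
    x: float
    y: float
    z: float
    gps_time: float        # "acquisition time" attribute
    scan_angle: float      # signed "scan angle" attribute
    flightline_edge: bool  # "flight line edge" attribute

def extract_strip_boundary_points(points: List[LaserPoint],
                                  a_min: float = 5.0) -> Tuple[List[LaserPoint], List[LaserPoint]]:
    # Keep flight-line-edge points whose |scan angle| exceeds the threshold A_min,
    # then split them by the sign of the scan angle into left / right boundary sets.
    boundary = [p for p in points if p.flightline_edge and abs(p.scan_angle) > a_min]
    p_left = [p for p in boundary if p.scan_angle < 0]   # negative scan angle: left boundary
    p_right = [p for p in boundary if p.scan_angle > 0]  # positive scan angle: right boundary
    return p_left, p_right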
Step 2: calculate the width of each airborne laser point cloud flight strip using the acquisition time information of the laser points.
From the left and right boundary point sets P_L and P_R of the point cloud flight strip obtained in step 1, the boundary point pairs whose "acquisition time" attribute values have the smallest interval, with that interval smaller than a set time threshold T_t, are extracted in turn. For each such pair the plane Euclidean distance is calculated, which is the width of the strip at that position:
W_i = √((x_iL − x_iR)² + (y_iL − y_iR)²),
where P(x_iL, y_iL) and P(x_iR, y_iR) are the left and right points of the pair. Each pair is put into the point pair set M. The strip width set W_j is then summarized by extracting its maximum width, minimum width and average width.
The set time threshold T_t mainly ensures that the connecting line of the extracted boundary point pair is perpendicular to the flight direction, so that it represents the strip width at that position; T_t can generally be set to 0.1 second.
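A sketch of step 2, reusing the LaserPoint container and the P_L / P_R lists from the previous sketch (an assumption for illustration). For each left boundary point, the right boundary point with the smallest acquisition-time gap is sought; pairs whose gap is below T_t contribute one width sample.

import math
from typing import List, Tuple

def strip_widths(p_left: List[LaserPoint], p_right: List[LaserPoint],
                 t_t: float = 0.1) -> Tuple[List[Tuple[LaserPoint, LaserPoint]], List[float]]:
    pairs, widths = [], []
    for pl in p_left:
        # right boundary point with the closest acquisition time
        pr = min(p_right, key=lambda p: abs(p.gps_time - pl.gps_time), default=None)
        if pr is not None and abs(pr.gps_time - pl.gps_time) < t_t:
            pairs.append((pl, pr))
            widths.append(math.hypot(pl.x - pr.x, pl.y - pr.y))  # plane Euclidean distance W_i
    return pairs, widths

The maximum, minimum and average strip widths are then max(widths), min(widths) and sum(widths) / len(widths).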
Step 3: shrink the boundary points of each airborne laser point cloud flight strip inward by the equal-proportion method or the integral method, and calculate the effective coverage area of the flight strip.
1) Calculating the effective coverage area of the flight strip by the equal-proportion inward shrinking method:
First, the standard value k of the minimum point cloud strip overlap ratio is obtained from the applicable standards, specifications and technical design documents; k is generally 13%.
Then, the position of the inward-shrunk laser point pair P(x'_iL, y'_iL) and P(x'_iR, y'_iR) is calculated from the original pair (the shrink formula appears only as an image in the original), where P(x_iL, y_iL) and P(x_iR, y_iR) are the point pairs of the boundary point pair set M from step 2, as shown in fig. 2.
Next, the inward-shrunk left boundary points are arranged in ascending order, and the inward-shrunk right boundary points in descending order, of the "acquisition time" attribute values of the laser points.
Finally, the sorted left and right inward-shrunk boundary points are connected in sequence to generate a polygon that tightly wraps the whole strip; this polygon is the boundary of the jth laser point cloud flight strip after inward shrinking.
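A hedged sketch of the equal-proportion inward shrink for one boundary point pair. Because the patent's shrink formula is given only as an image, this sketch assumes each point of the pair is moved toward the other endpoint by k/2 of the pair's length, so the strip loses half of the required overlap ratio on each side; treat the factor as an assumption rather than the patent's exact formula.

def shrink_pair(pl: LaserPoint, pr: LaserPoint, k: float = 0.13):
    # Assumed equal-proportion shrink: each endpoint moves toward the other by k/2
    # of the pair length, giving the retracted pair (x'_L, y'_L), (x'_R, y'_R).
    f = k / 2.0
    dx, dy = pr.x - pl.x, pr.y - pl.y
    return (pl.x + f * dx, pl.y + f * dy), (pr.x - f * dx, pr.y - f * dy)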
2) Calculating the effective coverage area of the flight strip by the integral method:
First, using the "acquisition time" attribute values of the laser points, the left boundary point set P_L from step 1 is arranged in ascending order and the right boundary point set P_R in descending order.
Then, the sorted left and right boundary points are connected in sequence to generate a polygon WR_j that tightly wraps the whole strip; WR_j is the boundary of the jth laser point cloud flight strip, as shown in fig. 3.
Finally, the buffer value WB_j of the jth laser point cloud flight strip is calculated from the strip width and the standard overlap ratio k (the formula appears only as an image in the original), and the strip boundary WR_j is shrunk inward by WB_j to obtain the effective coverage area of the strip based on the standard overlap value k. The width used is the average width calculated in step 2; when calculating the buffer value WB_j, the maximum or minimum strip width from step 2 may be used instead, as required.
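A sketch of the integral (whole-strip) shrink using a negative polygon buffer. The patent does not name a buffering tool and its buffer formula is not reproduced above; here the shapely library is used, and the buffer value WB_j is assumed to be half the standard overlap ratio k times the strip width. Both choices are illustrative.

from shapely.geometry import Polygon

def strip_effective_area(p_left, p_right, strip_width: float, k: float = 0.13) -> Polygon:
    left = sorted(p_left, key=lambda p: p.gps_time)                   # ascending acquisition time
    right = sorted(p_right, key=lambda p: p.gps_time, reverse=True)   # descending acquisition time
    ring = [(p.x, p.y) for p in left + right]                         # strip boundary polygon WR_j
    wb = 0.5 * k * strip_width                                        # assumed buffer value WB_j
    return Polygon(ring).buffer(-wb)                                  # negative buffer = inward shrink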
Step 4: use the spatial relationship of the strip effective coverage areas to detect whether the overlap degree of adjacent airborne laser point cloud flight strips meets the standard.
First, when the strip names are associated with the strip adjacency, the adjacency relation between the strips within the survey area is determined from the strip names; when the strip names are not associated with the strip adjacency, the geometric center coordinates of each strip polygon are calculated and the strip adjacency relation is determined from the distances between these geometric center coordinates.
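A hedged sketch of the centroid-based adjacency determination: each strip is paired with the strip whose polygon centroid is nearest. The patent only says adjacency is determined according to the centroid distances, so the nearest-centroid pairing is one plausible reading, and the strip-name keys are illustrative.

from typing import Dict
from shapely.geometry import Polygon

def nearest_neighbour_strips(strips: Dict[str, Polygon]) -> Dict[str, str]:
    centres = {name: poly.centroid for name, poly in strips.items()}
    neighbours = {}
    for name, c in centres.items():
        # pick the other strip whose centroid is closest in the plane
        _, other = min((c.distance(c2), n2) for n2, c2 in centres.items() if n2 != name)
        neighbours[name] = other
    return neighbours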
Then, the overlap area of the effective coverage areas of the jth and (j+1)th adjacent strips is calculated as the intersection of the two effective coverage areas; the calculation is the same whether the effective coverage areas were obtained with the equal-proportion shrink or with the integral shrink of step 3.
After that, it is detected whether the left and right boundary lines of the effective coverage area of the jth strip from step 3 intersect the left and right boundary lines of the effective coverage area of the (j+1)th strip, four pairs of boundary lines in total. The steps for judging whether a pair of boundary lines intersects are given below, taking the detection of the intersection relationship of one pair of boundary lines as an example.
a) Starting from the starting end of each of the two boundary lines, the nodes of the line are acquired in sequence and put into the corresponding inward-shrunk boundary point set; alternatively, the sorted inward-shrunk boundary point sets from step 3 may be used directly.
b) Points are taken in turn from the first inward-shrunk boundary point set and from the second inward-shrunk boundary point set, and all possible point pairs are traversed. Every pair that satisfies the plane-distance condition
√((x_k − x_m)² + (y_k − y_m)²) < S_T
is put into the set of point pairs to be evaluated A_PP_{j,j+1}; if the two points of a pair coincide, the corresponding boundary lines can immediately be judged to intersect and no further steps need to be executed. Here x_k and y_k are the abscissa and ordinate of the point from the first inward-shrunk boundary point set, x_m and y_m are the abscissa and ordinate of the point from the second inward-shrunk boundary point set, k denotes the kth node of the first set, m denotes the mth node of the second set, and S_T is a plane-coordinate distance threshold. S_T can be set adaptively from the average spacing of adjacent nodes in the two boundary point sets, for example to twice the average adjacent-node spacing, or it can simply be set to a larger constant value such as 50 meters.
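A sketch of the step b) screening under the plane-distance reading of the condition above. The inward-shrunk boundary point sets are assumed to be lists of (x, y) tuples; the early return mirrors the rule that coincident points mean the boundary lines intersect.

import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def candidate_pairs(set1: List[Point], set2: List[Point],
                    s_t: float = 50.0) -> Optional[List[Tuple[int, int]]]:
    pairs = []
    for k, (xk, yk) in enumerate(set1):
        for m, (xm, ym) in enumerate(set2):
            d = math.hypot(xk - xm, yk - ym)
            if d == 0.0:
                return None                    # coincident nodes: boundary lines intersect
            if d < s_t:
                pairs.append((k, m))           # indices of a point pair to be evaluated
    return pairs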
c) A point pair to be evaluated is taken from A_PP_{j,j+1}: the first point to be evaluated, from the first inward-shrunk boundary point set, and the second point to be evaluated, from the second inward-shrunk boundary point set. From the corresponding inward-shrunk boundary point sets, the two nodes preceding and the two nodes following each point of the pair are then acquired: the seventh, third, fourth and eighth points to be evaluated around the first point, and the ninth, fifth, sixth and tenth points to be evaluated around the second point, so that along the two boundary lines the node orders are seventh, third, first, fourth, eighth and ninth, fifth, second, sixth, tenth respectively.
d) For the four straight line segments obtained by connecting the seventh, third, first, fourth and eighth points of the first inward-shrunk boundary point set, it is judged in turn whether each of them intersects any of the four straight line segments obtained by connecting the ninth, fifth, second, sixth and tenth points of the second inward-shrunk boundary point set. The judgment for one pair of straight line segments is made as follows: when d1·d2 < 0 and d3·d4 < 0, the pair of straight line segments intersects, i.e. the corresponding boundary lines intersect. Here d1 and d2 are the cross products of the vector along one of the two segments with the vectors constructed from its start point to the two endpoints of the other segment, and d3 and d4 are the corresponding cross products with the roles of the two segments exchanged; this is the standard straddle test for segment intersection (the individual point symbols appear only as images in the original).
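A sketch of the straight-line-segment intersection test of step d), written as the standard cross-product straddle test that matches the d1·d2 < 0 and d3·d4 < 0 condition above. Segment endpoints are (x, y) tuples.

def cross(o, a, b):
    # 2D cross product of the vectors o->a and o->b
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_intersect(p1, p2, q1, q2) -> bool:
    d1 = cross(q1, q2, p1)   # side of p1 relative to segment q1-q2
    d2 = cross(q1, q2, p2)   # side of p2 relative to segment q1-q2
    d3 = cross(p1, p2, q1)   # side of q1 relative to segment p1-p2
    d4 = cross(p1, p2, q2)   # side of q2 relative to segment p1-p2
    return d1 * d2 < 0 and d3 * d4 < 0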
Finally, when the overlap area of the effective coverage areas of two adjacent flight strips is not empty and none of the four pairs of boundary lines of the two effective coverage areas intersect, the overlap degree of the adjacent strips meets the strip overlap requirement; otherwise it does not. The detection logic is shown in fig. 4.
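An end-to-end sketch of the step 4 decision for one pair of adjacent strips, assuming the effective coverage areas are available as shapely Polygons and their left and right boundary lines as shapely LineStrings. "Not intersecting" is read here as crosses() being False for all four boundary-line pairs, which is one way to express the node-wise test above.

from shapely.geometry import Polygon, LineString

def overlap_meets_standard(area_j: Polygon, area_j1: Polygon,
                           left_j: LineString, right_j: LineString,
                           left_j1: LineString, right_j1: LineString) -> bool:
    if area_j.intersection(area_j1).is_empty:
        return False                              # adjacent effective areas do not overlap at all
    for a in (left_j, right_j):
        for b in (left_j1, right_j1):
            if a.crosses(b):                      # any boundary-line crossing fails the check
                return False
    return True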
The invention also provides an airborne laser point cloud overlap degree detection system; fig. 5 is a structural diagram of the system provided by this embodiment. Referring to fig. 5, the airborne laser point cloud overlap degree detection system comprises:
a data acquisition module 501, configured to acquire the airborne laser point cloud data set and the corresponding time information;
a boundary point extraction module 502, configured to extract the boundary point set of each flight strip from the airborne laser point cloud data set;
a first calculation module 503, configured to screen the strip boundary point pair set according to the time information and the strip boundary point set, and to calculate the flight strip width;
a second calculation module 504, configured to perform inward shrinking by the equal-proportion method or the integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip;
an overlap degree detection module 505, configured to detect whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect, and to determine that the overlap degree meets the standard when an overlap area exists and the boundary lines do not intersect, and otherwise that it does not.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (10)

1. An airborne laser point cloud overlap degree detection method, characterized by comprising:
acquiring an airborne laser point cloud data set and the corresponding time information;
extracting the boundary point set of each airborne laser point cloud flight strip from the airborne laser point cloud data set;
screening a strip boundary point pair set according to the time information and the strip boundary point set, and calculating the width of each flight strip;
performing inward shrinking by an equal-proportion method or an integral method, based on the strip boundary point set, the strip width and the strip boundary point pair set, to obtain the effective coverage area of each flight strip;
detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect; when an overlap area exists between the effective coverage areas of adjacent strips and the boundary lines of those areas do not intersect, determining that the overlap degree meets the standard, and otherwise determining that it does not.
2. The airborne laser point cloud overlap degree detection method of claim 1, wherein detecting whether the effective coverage areas of adjacent flight strips overlap and whether their boundary lines intersect, and determining that the overlap degree meets the standard when an overlap area exists and the boundary lines do not intersect and otherwise that it does not, specifically comprises:
determining the adjacency relation of the flight strips within the survey area;
determining, based on the adjacency relation, the overlap area between the effective coverage area of the jth flight strip and the effective coverage area of the (j+1)th flight strip;
judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain a first judgment result;
judging whether the left boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain a second judgment result;
judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the left boundary line of the effective coverage area of the (j+1)th strip, to obtain a third judgment result;
judging whether the right boundary line of the effective coverage area of the jth strip does not intersect the right boundary line of the effective coverage area of the (j+1)th strip, to obtain a fourth judgment result;
when the overlap area is not empty and the first, second, third and fourth judgment results are all yes, determining that the overlap degree meets the standard, and otherwise determining that it does not.
3. The method for detecting the overlap degree of an airborne laser point cloud according to claim 2, wherein the step of judging whether the left boundary line of the effective coverage area of the jth airborne laser point cloud flight strip does not intersect the left boundary line of the effective coverage area of the (j+1)th airborne laser point cloud flight strip, to obtain a first judgment result, specifically comprises:
determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the jth airborne laser point cloud flight strip to obtain a first shrunk boundary point set, and determining the inward-shrunk boundary point set of the left boundary line of the effective coverage area of the (j+1)th airborne laser point cloud flight strip to obtain a second shrunk boundary point set;
selecting one boundary point from the first shrunk boundary point set and one from the second shrunk boundary point set to form a boundary point pair, and, after traversing the first and second shrunk boundary point sets, determining the set of boundary point pairs that satisfy a set judgment condition as the point pair set to be evaluated;
selecting one point pair to be evaluated from the point pair set to be evaluated, wherein the first point to be evaluated of the pair is a boundary point in the first shrunk boundary point set and the second point to be evaluated of the pair is the corresponding boundary point in the second shrunk boundary point set;
selecting the adjacent nodes of the first point to be evaluated in the first shrunk boundary point set to obtain a third point to be evaluated and a fourth point to be evaluated, and selecting the adjacent nodes of the second point to be evaluated in the second shrunk boundary point set to obtain a fifth point to be evaluated and a sixth point to be evaluated;
selecting the respective adjacent nodes of the third and fourth points to be evaluated in the first shrunk boundary point set to obtain a seventh point to be evaluated and an eighth point to be evaluated, and selecting the respective adjacent nodes of the fifth and sixth points to be evaluated in the second shrunk boundary point set to obtain a ninth point to be evaluated and a tenth point to be evaluated;
connecting the seventh, third, first, fourth and eighth points to be evaluated in order to obtain four straight line segments corresponding to the first shrunk boundary point set, and connecting the ninth, fifth, second, sixth and tenth points to be evaluated in order to obtain four straight line segments corresponding to the second shrunk boundary point set;
and judging whether each straight line segment corresponding to the first shrunk boundary point set does not intersect any straight line segment corresponding to the second shrunk boundary point set, to obtain the first judgment result.
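The local crossing test of claim 3 reduces to checking the four straight line segments on each shrunk boundary against one another. A minimal sketch follows, using a standard orientation-based segment intersection test; the chain layouts in the comments mirror the connection order described above, while the function names are illustrative only and collinear or touching special cases are deliberately ignored.

# A minimal sketch (assumed data layout) of the local crossing test in claim 3:
# around each candidate boundary point pair, four consecutive segments are built
# on each shrunk boundary and tested pairwise for intersection.
from typing import List, Tuple

Point = Tuple[float, float]

def _ccw(a: Point, b: Point, c: Point) -> bool:
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    # Proper segment-segment intersection test via orientation (CCW) checks.
    return (_ccw(p1, q1, q2) != _ccw(p2, q1, q2)
            and _ccw(p1, p2, q1) != _ccw(p1, p2, q2))

def local_chains_cross(chain_a: List[Point], chain_b: List[Point]) -> bool:
    # chain_a: e.g. [P7, P3, P1, P4, P8] on the first shrunk boundary;
    # chain_b: e.g. [P9, P5, P2, P6, P10] on the second shrunk boundary.
    segs_a = list(zip(chain_a[:-1], chain_a[1:]))
    segs_b = list(zip(chain_b[:-1], chain_b[1:]))
    return any(segments_intersect(a1, a2, b1, b2)
               for a1, a2 in segs_a for b1, b2 in segs_b)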
4. The method according to claim 3, wherein the set judgment condition is:
√((x_k − x_m)² + (y_k − y_m)²) ≤ S_T
wherein x_k and y_k are respectively the abscissa and the ordinate of the boundary point selected from the first shrunk boundary point set, x_m and y_m are respectively the abscissa and the ordinate of the boundary point selected from the second shrunk boundary point set, k denotes the kth node in the first shrunk boundary point set, m denotes the mth node in the second shrunk boundary point set, and S_T is a plane coordinate distance threshold.
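A boundary point pair therefore enters the point pair set to be evaluated only if the planar distance between its two points stays within S_T. A one-function sketch of this screening, with illustrative names not taken from the patent:

# A minimal sketch of the candidate-pair screening implied by the condition above.
import math

def pair_to_evaluate(xk: float, yk: float, xm: float, ym: float, s_t: float) -> bool:
    # Keep the pair only if the planar coordinate distance is within the threshold.
    return math.hypot(xk - xm, yk - ym) <= s_t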
5. The method for detecting the overlap degree of an airborne laser point cloud according to claim 2, wherein determining the adjacency relation of each airborne laser point cloud flight strip within the survey area specifically comprises:
when the names of the airborne laser point cloud flight strips within the survey area are associated with the adjacency of the flight strips, determining the adjacency relation of each flight strip within the survey area according to the names;
and when the names of the airborne laser point cloud flight strips within the survey area are not associated with the adjacency of the flight strips, calculating the geometric center coordinates of each flight strip polygon, and determining the adjacency relation of each flight strip within the survey area according to the distances between the geometric center coordinates.
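As an illustration of the fallback branch, adjacency can be inferred from centroid distances. The sketch below assumes a dictionary mapping a strip identifier to its geometric center and an assumed distance bound max_gap (for example on the order of one strip width); none of these names or thresholds come from the patent.

# A minimal sketch of the centroid-distance fallback in claim 5.
import math
from typing import Dict, List, Tuple

def adjacent_pairs(centroids: Dict[str, Tuple[float, float]],
                   max_gap: float) -> List[Tuple[str, str]]:
    ids = sorted(centroids)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            # Strips whose polygon centroids lie within the assumed bound are
            # treated as adjacent.
            if math.dist(centroids[a], centroids[b]) <= max_gap:
                pairs.append((a, b))
    return pairs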
6. The method for detecting the overlap degree of an airborne laser point cloud according to claim 1, wherein performing the shrinking by the equal-proportion method to obtain the effective coverage area of each airborne laser point cloud flight strip specifically comprises:
calculating the position of each laser point in the flight strip boundary point pair set after equal-proportion shrinking, according to the flight strip boundary point pair set and the standard value of the minimum point cloud flight strip overlap ratio;
based on these positions, arranging the left boundary points of the shrunk flight strip boundary point pairs in ascending order of their acquisition time attribute values to obtain a first sequence;
based on these positions, arranging the right boundary points of the shrunk flight strip boundary point pairs in descending order of their acquisition time attribute values to obtain a second sequence;
and connecting the first sequence and the second sequence in order to generate the effective coverage area of the airborne laser point cloud flight strip.
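A minimal sketch of the equal-proportion shrinking follows. The exact shrink fraction is not spelled out in this claim, so the sketch assumes each point of a boundary point pair is moved toward the opposite point by half of the minimum overlap ratio r, narrowing the strip by r in total; treat that formula as an assumption made purely for illustration.

# A minimal sketch of per-pair equal-proportion shrinking (assumed fraction r/2 per side).
from typing import List, Tuple

Point = Tuple[float, float]

def shrink_pair(left: Point, right: Point, r: float) -> Tuple[Point, Point]:
    dx, dy = right[0] - left[0], right[1] - left[1]
    new_left = (left[0] + dx * r / 2.0, left[1] + dy * r / 2.0)
    new_right = (right[0] - dx * r / 2.0, right[1] - dy * r / 2.0)
    return new_left, new_right

def shrink_pairs(pairs: List[Tuple[Point, Point]], r: float) -> List[Tuple[Point, Point]]:
    # Apply the same proportional shrink to every boundary point pair of the strip.
    return [shrink_pair(left, right, r) for left, right in pairs]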
7. The method for detecting the overlap degree of an airborne laser point cloud according to claim 1, wherein performing the shrinking by the integral method to obtain the effective coverage area of each airborne laser point cloud flight strip specifically comprises:
arranging the left boundary points in the boundary point set of the airborne laser point cloud flight strip in ascending order of their acquisition time attribute values to obtain a third sequence;
arranging the right boundary points in the boundary point set of the airborne laser point cloud flight strip in descending order of their acquisition time attribute values to obtain a fourth sequence;
connecting the third sequence and the fourth sequence in order to generate a polygonal area;
calculating a buffer value for each airborne laser point cloud flight strip according to the width of the flight strip and the standard value of the minimum point cloud flight strip overlap ratio;
and shrinking the polygonal area inward based on the buffer value to obtain the effective coverage area of the airborne laser point cloud flight strip.
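Using the shapely library, the inward shrinking of the strip polygon can be expressed as a negative buffer. The buffer formula below (half of strip width times minimum overlap ratio) is an assumption made for this sketch, not a value taken from the patent.

# A minimal sketch of the integral-method shrinking via a negative buffer.
from shapely.geometry import Polygon

def shrink_strip_polygon(strip_polygon: Polygon,
                         strip_width: float,
                         min_overlap_ratio: float) -> Polygon:
    buffer_value = strip_width * min_overlap_ratio / 2.0   # assumed formula
    # A negative buffer contracts the polygon inward by buffer_value.
    return strip_polygon.buffer(-buffer_value)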
8. The method for detecting the overlap degree of an airborne laser point cloud according to claim 1, wherein extracting the boundary point set of each airborne laser point cloud flight strip based on the airborne laser point cloud data set specifically comprises:
determining, as the boundary point set of the airborne laser point cloud flight strip, the set of laser points whose edge-of-flight-line attribute value is true and whose scan angle attribute value has an absolute value greater than a set angle threshold; wherein a laser point in the boundary point set whose scan angle attribute value is negative is a left boundary point, and a laser point in the boundary point set whose scan angle attribute value is positive is a right boundary point.
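A short sketch of this extraction on assumed per-point attribute arrays (scan angle in degrees and an edge-of-flight-line flag, as typically read from a LAS file) is given below; the array names and threshold parameter are illustrative.

# A minimal sketch of the boundary-point extraction in claim 8.
import numpy as np

def extract_boundary_masks(scan_angle: np.ndarray,
                           edge_of_flight_line: np.ndarray,
                           angle_threshold: float):
    # Boundary points: edge-of-flight-line flag set and |scan angle| above threshold.
    boundary = (edge_of_flight_line.astype(bool)
                & (np.abs(scan_angle) > angle_threshold))
    left = boundary & (scan_angle < 0)    # negative scan angle -> left boundary point
    right = boundary & (scan_angle > 0)   # positive scan angle -> right boundary point
    return left, right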
9. The method for detecting the overlap degree of an airborne laser point cloud according to claim 1, wherein screening the flight strip boundary point pair set according to the time information and the boundary point set of the airborne laser point cloud flight strip, and calculating the width of the airborne laser point cloud flight strip, specifically comprises:
determining, as the flight strip boundary point pair set, the set of pairs formed by a left boundary point and a right boundary point in the boundary point set of the airborne laser point cloud flight strip whose acquisition time attribute values have the smallest interval, that interval also being smaller than a set time threshold;
calculating the planar Euclidean distance of each boundary point pair in the flight strip boundary point pair set to obtain a flight strip width set;
and determining the maximum width, the minimum width and the average width of the flight strip width set to obtain the width of the airborne laser point cloud flight strip.
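A compact sketch of this pairing and width computation on assumed NumPy arrays follows: each left boundary point is matched to the right boundary point nearest in acquisition time, pairs exceeding the time threshold are dropped, and the planar distances of the remaining pairs give the width statistics. All names are placeholders introduced for this illustration.

# A minimal sketch of the pairing and width computation in claim 9.
import numpy as np

def strip_widths(left_xyt: np.ndarray, right_xyt: np.ndarray, time_t: float):
    # left_xyt, right_xyt: arrays of shape (N, 3) / (M, 3) holding x, y, gps_time.
    idx = np.abs(right_xyt[:, 2][None, :] - left_xyt[:, 2][:, None]).argmin(axis=1)
    dt = np.abs(right_xyt[idx, 2] - left_xyt[:, 2])
    keep = dt < time_t                     # discard pairs over the time threshold
    d = np.hypot(right_xyt[idx, 0] - left_xyt[:, 0],
                 right_xyt[idx, 1] - left_xyt[:, 1])[keep]
    return d.max(), d.min(), d.mean()      # maximum, minimum and average width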
10. An airborne laser point cloud overlap degree detection system, comprising:
a data acquisition module configured to acquire an airborne laser point cloud data set and corresponding time information;
a boundary point extraction module configured to extract the boundary point set of each airborne laser point cloud flight strip based on the airborne laser point cloud data set;
a first calculation module configured to screen the flight strip boundary point pair set according to the time information and the boundary point set of the airborne laser point cloud flight strip, and to calculate the width of the airborne laser point cloud flight strip;
a second calculation module configured to perform the shrinking by an equal-proportion method or an integral method, based on the boundary point set of the airborne laser point cloud flight strip, the width of the airborne laser point cloud flight strip and the flight strip boundary point pair set, to obtain the effective coverage area of each airborne laser point cloud flight strip;
and an overlap degree detection module configured to detect whether an overlapping area exists between the effective coverage areas of adjacent airborne laser point cloud flight strips and whether the boundary lines of those effective coverage areas intersect, to determine that the overlap degree meets the standard when an overlapping area exists and the boundary lines do not intersect, and otherwise to determine that the overlap degree does not meet the standard.
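A hypothetical skeleton of the module structure in claim 10 is sketched below; the class and method names are placeholders standing in for the modules recited above and are not defined by the patent.

# A hypothetical module skeleton wiring the four processing modules in sequence.
from abc import ABC, abstractmethod
from typing import Any

class OverlapDetectionSystem(ABC):
    @abstractmethod
    def acquire(self) -> Any: ...                                      # data acquisition module
    @abstractmethod
    def extract_boundaries(self, data: Any) -> Any: ...                # boundary point extraction module
    @abstractmethod
    def pair_and_width(self, boundaries: Any) -> Any: ...              # first calculation module
    @abstractmethod
    def effective_coverage(self, boundaries: Any, pairs_width: Any) -> Any: ...  # second calculation module
    @abstractmethod
    def check_overlap(self, coverage: Any) -> bool: ...                # overlap degree detection module

    def run(self) -> bool:
        data = self.acquire()
        boundaries = self.extract_boundaries(data)
        pairs_width = self.pair_and_width(boundaries)
        coverage = self.effective_coverage(boundaries, pairs_width)
        return self.check_overlap(coverage)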
CN202010577931.3A 2020-06-23 2020-06-23 Airborne laser point cloud overlapping degree detection method and system Active CN111754535B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010577931.3A CN111754535B (en) 2020-06-23 2020-06-23 Airborne laser point cloud overlapping degree detection method and system

Publications (2)

Publication Number Publication Date
CN111754535A true CN111754535A (en) 2020-10-09
CN111754535B (en) 2024-05-31

Family

ID=72674921

Country Status (1)

Country Link
CN (1) CN111754535B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101159066A (en) * 2007-11-20 2008-04-09 中交第二公路勘察设计研究院有限公司 Highway measuring and setting method based on three-dimensional airborne LIDAR
CN103471567A (en) * 2013-09-03 2013-12-25 中国科学院遥感与数字地球研究所 Checking method of aerophotography flight quality
CN106680798A (en) * 2017-01-23 2017-05-17 辽宁工程技术大学 Airborne LIDAR air strips overlay region redundancy identification and elimination method
US20180313942A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Calibration of laser sensors
CN109032165A (en) * 2017-07-21 2018-12-18 广州极飞科技有限公司 The generation method and device in unmanned plane course line
CN110008207A (en) * 2019-03-28 2019-07-12 武汉大学 Airborne lidar point cloud data loophole rapid detection method based on density histogram

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LIU Rundong et al.: "An automatic checking algorithm for the overlap of airborne LiDAR point cloud strips", Geospatial Information, vol. 16, no. 11, 20 November 2018 (2018-11-20), pages 42-43 *
LI Haolin et al.: "Flight overlap checking for the A3 digital aerial camera", Remote Sensing Information, vol. 30, no. 06, 31 December 2015 (2015-12-31), pages 58-62 *
YANG Peiyi et al.: "Research on methods for checking elevation drift between airborne LiDAR flight strips", Geospatial Information, vol. 17, no. 09, 30 September 2019 (2019-09-30), pages 38-40 *
WANG Qianghui: "Research on quality inspection methods for raw airborne LiDAR point cloud data", Journal of Geomatics, vol. 43, no. 05, 5 October 2018 (2018-10-05), pages 35-37 *

Similar Documents

Publication Publication Date Title
US20230099113A1 (en) Training method and apparatus for a target detection model, target detection method and apparatus, and medium
Awrangjeb et al. Automatic detection of residential buildings using LIDAR data and multispectral imagery
CN105260737B (en) A kind of laser scanning data physical plane automatization extracting method of fusion Analysis On Multi-scale Features
CN109829908B (en) Binocular image-based method and device for detecting safety distance of ground object below power line
CN104536009A (en) Laser infrared composite ground building recognition and navigation method
Cheng et al. Building boundary extraction from high resolution imagery and lidar data
CN103714541A (en) Method for identifying and positioning building through mountain body contour area constraint
EP4086846A1 (en) Automatic detection of a calibration standard in unstructured lidar point clouds
CN107679458B (en) Method for extracting road marking lines in road color laser point cloud based on K-Means
dos Santos et al. Extraction of building roof boundaries from LiDAR data using an adaptive alpha-shape algorithm
CN107885224A (en) Unmanned plane barrier-avoiding method based on tri-item stereo vision
Hu et al. A fast and simple method of building detection from LiDAR data based on scan line analysis
CN111487643B (en) Building detection method based on laser radar point cloud and near-infrared image
KR101549155B1 (en) Method of automatic extraction of building boundary from lidar data
EP2677462B1 (en) Method and apparatus for segmenting object area
Axelsson et al. Roof type classification using deep convolutional neural networks on low resolution photogrammetric point clouds from aerial imagery
CN105956544A (en) Remote sensing image road intersection extraction method based on structural index characteristic
Orthuber et al. 3D building reconstruction from lidar point clouds by adaptive dual contouring
Kim et al. Tree and building detection in dense urban environments using automated processing of IKONOS image and LiDAR data
CN111754556A (en) Incremental unmanned aerial vehicle aerial photography overlapping degree detection method and system
Li et al. A fast obstacle detection method by fusion of double-layer region growing algorithm and Grid-SECOND Detector
CN111754535A (en) Airborne laser point cloud overlapping degree detection method and system
Özcan et al. Building detection using local features and DSM data
Abdullah et al. Automatic segmentation of LiDAR point cloud data at different height levels for 3D building extraction
CN111736136B (en) Airborne laser point cloud aerial photography vulnerability detection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant