CN117741662A - Array interference SAR point cloud fusion method based on double observation visual angles - Google Patents

Array interference SAR point cloud fusion method based on double observation visual angles

Info

Publication number
CN117741662A
Authority
CN
China
Prior art keywords
point cloud
point
clouds
observation
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311754350.2A
Other languages
Chinese (zh)
Inventor
郭其昌
梁兴东
李焱磊
刘云龙
卜祥玺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS filed Critical Aerospace Information Research Institute of CAS
Priority to CN202311754350.2A priority Critical patent/CN117741662A/en
Publication of CN117741662A publication Critical patent/CN117741662A/en
Pending legal-status Critical Current


Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides an array interferometric SAR point cloud fusion method based on dual observation viewing angles, which belongs to the technical field of radar and comprises: step 1, point cloud generation; step 2, point cloud filtering; step 3, point cloud matching; and step 4, point cloud fusion and false point cloud elimination. By fusing the multipath results obtained under two different observation viewing angles, the method improves the reconstruction quality of the building point cloud and, at the same time, effectively suppresses the false point clouds generated between buildings and the ground by secondary scattering.

Description

Array interference SAR point cloud fusion method based on double observation visual angles
Technical Field
The invention belongs to the technical field of radar, and particularly relates to an array interferometric SAR point cloud fusion method based on dual observation viewing angles.
Background
Array interferometric synthetic aperture radar (Array-InSAR), an extension of interferometric synthetic aperture radar (InSAR, Synthetic Aperture Radar Interferometry), obtains resolution in the target elevation direction by collecting data at different positions along the elevation direction. It can effectively solve the layover problem faced by conventional InSAR and therefore has important application value and potential in mountain-area mapping, urban planning and related fields.
Multipath is one of the common problems in microwave imaging. In synthetic aperture radar imaging, the imaging algorithm generally considers only single scattering from a target and ignores the influence of multiple-scattering echoes between targets, so false targets caused by multipath often appear in the imaging results, such as the false targets caused by multiple scattering between a bridge and the water surface in typical harbour-bridge SAR imagery. When an airborne array interferometric SAR system performs urban three-dimensional reconstruction, the receiving antenna receives multiple-scattering echo signals produced by multipath propagation of electromagnetic waves between typical man-made structures (buildings) and the ground. After signal processing, these echoes appear in the point cloud result as obvious false targets (targets appearing at positions where no real target exists), and such false point cloud targets hinder building identification, modeling and related tasks.
Disclosure of Invention
Aiming at the problem that, in airborne array interferometric SAR three-dimensional reconstruction of urban areas, false point clouds caused by secondary-scattering multipath between buildings and the ground degrade the quality of point cloud reconstruction, the invention provides an array interferometric SAR point cloud fusion method based on dual observation viewing angles.
The method is realized by the following technical scheme:
an array interference SAR point cloud fusion method based on double observation angles comprises the following steps:
step 1, performing observation under two viewing angles, obtaining the radar echo signals of the respective viewing angles, and obtaining the point cloud result under each viewing angle with an array interferometric SAR point cloud reconstruction algorithm;
step 2, performing point cloud filtering on the point clouds of the different observation viewing angles with a point cloud filtering method based on spatial distribution, so as to improve the point cloud quality under each viewing angle;
step 3, performing point cloud matching with a point cloud matching method based on the iterative closest point: the point cloud result obtained under observation viewing angle A is taken as the reference point cloud set, and the point cloud obtained under observation viewing angle B is matched to the reference point cloud set; after the matching is completed, the point cloud results of the two observation viewing angles overlap in position;
step 4, retrieving the reference point cloud set point by point and judging whether the point cloud result under observation viewing angle B contains points whose distance to the retrieved point is smaller than the set threshold; if such points exist, the retrieved point and the qualifying points of the point cloud set under observation viewing angle B are retained; otherwise, the retrieval continues with the next point; this operation is repeated until all points of the reference point cloud set have been retrieved; the points retained after the retrieval of all reference points are the fused point cloud result of the two observation viewing angles.
Further, in step 1, a wavenumber-domain (ωk) imaging algorithm with motion compensation based on positioning and orientation system (POS) data is adopted to obtain the multi-channel two-dimensional SAR imaging results, and image registration is then carried out on the two-dimensional images of the multiple channels; the images of all channels are first coarsely registered using the POS data and the corresponding radar parameters, and the multi-channel SAR complex images are then finely registered using the maximum interferometric-phase signal-to-noise-ratio criterion; next, the interferometric phase information is used to compensate the phase of each channel so that it meets the basic requirements of tomographic processing; after the amplitude and phase compensation is completed, a compressed sensing algorithm is adopted for elevation reconstruction to obtain the point cloud result in slant range-elevation coordinates; baseline calibration is completed with calibration points to obtain accurate baseline parameters; finally, coordinate transformation is carried out to convert the point cloud from the slant range-elevation coordinate system to the geodetic coordinate system, giving the final point cloud result.
Further, in step 2, a K-neighborhood statistical analysis is performed on each point of the point cloud, and the average distance from the point to its K nearest neighbors is calculated; assuming that the resulting distances follow a Gaussian distribution whose shape depends on the mean and the standard deviation, points whose average distance lies outside the given threshold range are removed, comprising:
step (2.1), calculating the distances d to the K nearest neighbors of each point, and calculating their sum D;
step (2.2), adding up the sums D obtained for all points to obtain the total D2 of the distance sums of all points;
step (2.3), calculating the mean value D3 = D2/M of the distance sums, where M is the total number of points in the point cloud;
step (2.4), calculating the standard deviation std(D) of the distance sums D of all points;
step (2.5), obtaining the outlier distance threshold D3 + sig × std(D) of the noise points;
step (2.6), processing every point of the point cloud and eliminating points whose distance sum exceeds the outlier distance threshold.
Further, in step 3, the iterative closest point (ICP) registration algorithm registers the point clouds with an iterative strategy; assuming that the point cloud P obtained under observation viewing angle B is to be registered onto the reference point cloud Q, the algorithm comprises the following steps:
step (3.1), first finding the corresponding point pairs between P and Q: all points in P are traversed and the following operation is performed; assuming the current traversal has reached point p_i, p_i is placed into the space formed by all points of Q and the point q_i of Q nearest to p_i is searched; if the distance between the two points satisfies ||p_i − q_i|| < ξ, (p_i, q_i) is taken as a corresponding point pair, where ξ is the preset distance threshold; after all points of P have been traversed, all corresponding point pairs are found, and the set of corresponding point pairs is Corr = {(p_i, q_i) | p_i ∈ P, q_i ∈ Q, ||p_i − q_i|| < ξ}, where i denotes the point index;
step (3.2), assuming the required transformation matrix is R, an error l_i = ||q_i − R·p_i||² can be constructed for every corresponding point pair (p_i, q_i) ∈ Corr; for all corresponding point pairs, an overall error function E(R) = Σ_{i=1}^{N} ||q_i − R·p_i||² is constructed by summing the errors of the individual points, where N is the number of corresponding point pairs; by minimizing the error function the problem is converted into a least-squares problem, from which the transformation matrix R is obtained;
step (3.3), after the transformation matrix R is obtained, the point cloud set P is transformed to obtain the position-updated point cloud set P′; steps (3.1) to (3.3) are repeated until the termination condition is met, and the iteration is stopped.
Further, in step 4, the multipath distribution characteristics between the building and the ground are analyzed first; when the difference between the two observation viewing angles is large and the slopes of the lines along which the multipath points are distributed differ obviously, the point clouds of the two observation viewing angles are fused using this multipath distribution characteristic, so that the false point clouds are removed. The specific implementation is as follows: the point cloud set obtained under observation viewing angle A is taken as the reference point cloud set and is retrieved point by point; for every retrieved point, a search is performed in the point cloud set obtained under observation viewing angle B to judge whether any point exists within the sphere centered on the retrieved point with radius l; if such points exist, the information of the found points and of the retrieved point is stored in array C; if no point exists within that range, the retrieval continues with the next point; this operation is repeated until the retrieval and search of all points are completed; after the above steps are completed, the points stored in array C constitute the fused point cloud of the two observation viewing angles.
The beneficial effects are that:
1. The method provided by the invention addresses point cloud fusion and multipath suppression in urban point cloud reconstruction with an airborne array interferometric SAR system. By exploiting the fact that the multipath point clouds are distributed differently under different observation viewing angles, fusing the point clouds of two observation viewing angles improves the quality of the building point cloud and, at the same time, effectively suppresses the false point clouds formed by secondary scattering between the building and the ground.
2. The method processes only the point cloud data and does not involve the signal processing chain; therefore, in the overall processing flow of the airborne array interferometric SAR, only a fusion step applied to the generated point cloud results is required, and the original signal processing steps remain unchanged.
Drawings
FIG. 1 is a flow chart of the array interferometric SAR point cloud fusion method based on dual observation viewing angles of the present invention;
FIG. 2 is a flow chart of a point cloud generation process of an airborne array interferometric SAR system;
FIG. 3 is a schematic diagram of point cloud filtering of an airborne array interferometric SAR system;
FIG. 4 is a schematic diagram of a point cloud matching method based on the ICP method;
FIG. 5 is a schematic diagram of the secondary scattering multipath distribution of an airborne array interferometric SAR system;
fig. 6a, fig. 6b, and fig. 6c are the processing results of the dual-viewing-angle array interferometric SAR point cloud fusion experiment data; fig. 6a is the point cloud result of viewing angle A, fig. 6b is the point cloud result of viewing angle B, and fig. 6c is the fused point cloud result.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings.
As shown in fig. 1, the array interferometric SAR point cloud fusion method based on dual observation viewing angles of the present invention comprises the following steps:
and step 1, generating point cloud. The method for generating the point cloud of the airborne array interference SAR system is shown in figure 2 and mainly comprises operations such as multi-channel two-dimensional SAR imaging, multi-channel two-dimensional SAR image registration, multi-channel inter-amplitude-phase compensation, elevation reconstruction, baseline calibration, coordinate conversion and the like. In order to obtain a two-dimensional SAR image result, a wk imaging algorithm based on data motion compensation of a Positioning and Orientation System (POS) is adopted to obtain a multi-channel two-dimensional SAR imaging result, and then image registration is carried out on the two-dimensional imaging result among multiple channels. Firstly, performing coarse registration on images of all channels by using POS data and corresponding radar parameters, and then performing fine registration on the multi-channel SAR complex images by using a maximum interference phase signal-to-noise ratio criterion. Then, the interference phase information is used for compensating the phase of each channel so as to enable the phase to meet the basic requirements of tomography. After the amplitude and phase compensation is completed, a compressed sensing algorithm is adopted to conduct high Cheng Chongjian, and a point cloud result under the oblique elevation coordinate is obtained. And (5) finishing baseline calibration by using the calibration points to obtain accurate baseline parameters. And finally, carrying out coordinate transformation, and converting the point cloud from the oblique elevation coordinate system to the geodetic coordinate system, so as to obtain the final point cloud.
Step 2, point cloud filtering. The generated point cloud is usually affected by noise and contains many noise points; to improve the point cloud quality, point cloud filtering is required, and the principle of the spatial-distribution-based point cloud filtering method is shown in fig. 3. A K-neighborhood statistical analysis is performed on each point, and the average distance from the point to its K nearest neighbors is calculated. Assuming that the resulting distances follow a Gaussian distribution whose shape depends on the mean and the standard deviation, points whose average distance lies outside the given threshold are removed. The method requires two parameters: the multiple sig of the standard deviation and the number K of neighboring points. The calculation proceeds as follows: (1) the distances d to the K nearest neighbors of each point are calculated, together with their sum D; (2) the sums D of all points are added to obtain the total D2 of the distance sums; (3) the mean value D3 = D2/M of the distance sums is calculated, where M is the total number of points; (4) the standard deviation std(D) of the distance sums D of all points is calculated; (5) the outlier distance threshold D3 + sig × std(D) of the noise points is obtained; (6) every point is processed and points whose distance sum exceeds the outlier distance threshold are rejected. By default, sig = 1 and K = 6.
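A minimal sketch of the filtering procedure of steps (1)-(6), assuming the point cloud is given as a numpy array and using a KD-tree for the K-neighbor search; the function name spatial_filter and the use of scipy are illustrative choices, while the defaults sig = 1 and K = 6 follow the description above.

```python
import numpy as np
from scipy.spatial import cKDTree

def spatial_filter(points, k=6, sig=1.0):
    """Remove outlier points whose summed K-neighbor distance exceeds the threshold.

    points : (M, 3) array of point cloud coordinates
    """
    tree = cKDTree(points)
    # Query k+1 neighbors because the nearest neighbor of each point is the point itself.
    dists, _ = tree.query(points, k=k + 1)
    d_sum = dists[:, 1:].sum(axis=1)        # step (1): sum of K neighbor distances per point
    d3 = d_sum.mean()                       # steps (2)-(3): mean of the per-point sums
    threshold = d3 + sig * d_sum.std()      # steps (4)-(5): outlier distance threshold
    return points[d_sum <= threshold]       # step (6): keep only the non-outlier points
```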
Step 3, point cloud matching. The point clouds generated under different observation viewing angles are offset in position, and the point clouds of the different observation viewing angles must be registered before the fusion processing so that the point clouds of the same region under different observation viewing angles overlap in position. To complete the registration, the invention adopts a point cloud matching method based on the iterative closest point (ICP) algorithm, whose basic principle is shown in fig. 4. The ICP registration algorithm registers the point clouds with an iterative strategy; assuming that the point cloud set P obtained under observation viewing angle B is to be registered onto the reference point cloud set Q, the ICP algorithm operates as follows: (1) first, the corresponding point pairs between P and Q are found: all points in P are traversed and the following operation is performed; assuming the current traversal has reached point p_i, p_i is placed into the space formed by all points of Q and the point q_i of Q nearest to p_i is searched; if ||p_i − q_i|| < ξ, (p_i, q_i) is taken as a corresponding point pair, where ξ is the preset distance threshold. After all points of P have been traversed, all corresponding point pairs are found; the set of corresponding point pairs is Corr = {(p_i, q_i) | p_i ∈ P, q_i ∈ Q, ||p_i − q_i|| < ξ}, where i denotes the point index. (2) Assuming the required transformation matrix is R, an error l_i = ||q_i − R·p_i||² can be constructed for every (p_i, q_i) ∈ Corr; for all corresponding point pairs, summing the errors of the individual points gives the overall error function E(R) = Σ_{i=1}^{N} ||q_i − R·p_i||², where N is the number of corresponding point pairs. By minimizing this error function the problem is converted into a least-squares problem, from which the transformation matrix R is obtained. (3) After the transformation matrix R is obtained, the point cloud set P is transformed to obtain the position-updated point cloud set P′, and steps (1) to (3) are repeated until the termination condition is met, whereupon the iteration stops.
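The following is a minimal ICP sketch following steps (1)-(3): threshold-gated nearest-neighbor correspondences, a closed-form least-squares rigid transform obtained by SVD, and iteration until the error change is small. The SVD (Kabsch) solution, the explicit translation vector t and the default threshold xi = 5.0 m are assumptions of this example; the patent text refers only to an overall transformation matrix R.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_register(P, Q, xi=5.0, max_iter=50, tol=1e-6):
    """Register point set P (viewing angle B) onto reference set Q (viewing angle A).

    Returns the transformed copy of P together with the accumulated rotation R and translation t.
    """
    P_cur = P.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    tree = cKDTree(Q)
    prev_err = np.inf
    for _ in range(max_iter):
        # Step (3.1): corresponding pairs (p_i, q_i) with ||p_i - q_i|| < xi.
        d, idx = tree.query(P_cur)
        mask = d < xi
        src, dst = P_cur[mask], Q[idx[mask]]
        if len(src) < 3:
            break
        # Step (3.2): least-squares rigid transform (SVD / Kabsch solution).
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - mu_s).T @ (dst - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        # Step (3.3): update P, accumulate the transform, then check convergence.
        P_cur = P_cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.mean(np.sum((P_cur[mask] - dst) ** 2, axis=1))
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return P_cur, R_total, t_total
```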
Step 4, point cloud fusion. Because the point clouds of the multipath formed by secondary scattering between a building and the ground are distributed differently under different observation viewing angles, the multipath point clouds can be removed by fusing the point clouds of the different viewing angles. The multipath distribution between the building and the ground is analyzed first, as shown in fig. 5. Taking the dihedral formed between a building and the ground as an example, the propagation path r_{t,a,b,r} of a class-I secondary scattering of the electromagnetic wave under observation viewing angle A (the propagation path transmitting antenna - ground - building - receiving antenna is defined as class-I secondary scattering) is expressed as:
In the above formula, y_t, y, h_t, h, y_r and h_r denote the horizontal coordinate of the transmitting antenna, the horizontal coordinate of the ground scattering point, the height of the transmitting antenna, the height of the building scattering point, the horizontal coordinate of the receiving antenna and the height of the receiving antenna, respectively. According to the geometry of the antenna array, y_r = y_t + l_k and h_r = h_t, where l_k denotes the distance between the k-th receiving antenna and the transmitting antenna; substituting these relations into the above formula gives:
According to the geometric relationship, y_t = −r_0·sinθ and h_r = r_0·cosθ; substituting these into the above formula gives:
Using a Taylor series approximation, the above formula can be approximated as:
In the above formulas, the subscripts t, a, b and r denote the transmitting antenna unit, the ground scattering point, the building facade scattering point and the receiving antenna unit, respectively. r_0 is the distance from the platform to the base corner of the building, and y and h are the horizontal coordinate of point a and the height of point b, respectively, with the coordinate origin set at the base corner of the dihedral; θ is the look-down angle corresponding to the dihedral base corner under observation viewing angle A.
The propagation path length of the class-II secondary scattering of the electromagnetic wave (the propagation path transmitting antenna - building - ground - receiving antenna is defined as class-II secondary scattering) is:
Considering that the electromagnetic wave undergoes mainly specular reflection, the corresponding term on the right-hand side of the above formula is approximately 0, and the secondary-scattering propagation distance is then approximately:
In echo processing, the corresponding term on the right-hand side of the above formula can be compensated by multi-channel correction and therefore does not affect the three-dimensional reconstruction result; in this case, the echo s_r produced by the secondary scattering is equivalent to:
where σ_1 and σ_2 denote the radar echo amplitudes caused by the two types of secondary scattering and exp(·) denotes the exponential function. This is equivalent to the following: in slant range-elevation coordinates, within the slant-range resolution cell of the building base corner there are two point targets whose elevations are −h·sinθ·cosθ and h·sinθ·cosθ, respectively, i.e. the false point targets generated by the two types of secondary scattering are distributed symmetrically about point O in elevation. In horizontal-height coordinates, the two false point targets formed by the secondary scattering are distributed on a straight line l that passes through the point (x_o, y_o, h_o) and whose slope k is determined by the look-down angle θ. The equation of the line is:
h − h_o = k(y − y_o)
In the same way, the equation of the straight line l′ along which the multipath point cloud is distributed under observation viewing angle B can be obtained as:
h − h_o = k′(y − y_o)
wherein,and θ' is the lower viewing angle corresponding to the dihedral base angle under the observation viewing angle B.
As can be seen from the linear equations, the point clouds generated by the multipath between the building and the ground are distributed differently under different observation viewing angles: the false point clouds lie near a straight line whose slope is related to the look-down angle of the observation, so different look-down angles yield different line slopes, and the false point clouds formed by the multipath therefore do not overlap in position. When the difference between the two observation viewing angles is large and the line slopes of the multipath distributions differ obviously, this characteristic can be used to fuse the point clouds of the two observation viewing angles and thereby remove the false point clouds. The specific implementation is as follows: the point cloud obtained under observation viewing angle A is taken as the reference point cloud and is retrieved point by point; for every retrieved point, a search is performed in the point cloud obtained under observation viewing angle B to judge whether any point exists within the sphere centered on the retrieved point with radius l (set to 2 m in this process); if such points exist, the information of the found points and of the retrieved point is stored in array C; if no point exists within that range, the retrieval continues with the next point; this operation is repeated until the retrieval and search of all points are completed.
After the above steps are completed, the points stored in array C constitute the point cloud obtained by fusing the two observation viewing angles.
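A minimal sketch of this fusion step, assuming the two registered point clouds are given as numpy arrays; the sphere search uses a KD-tree ball query, the 2 m radius follows the value stated above, and the function name fuse_point_clouds is illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def fuse_point_clouds(cloud_a, cloud_b, radius=2.0):
    """Keep only points of view A that have at least one view-B point within `radius`,
    together with those matching view-B points (the array C of the description)."""
    tree_b = cKDTree(cloud_b)
    kept_a, kept_b_idx = [], set()
    for p in cloud_a:                               # point-by-point retrieval of the reference set
        idx = tree_b.query_ball_point(p, r=radius)  # sphere search in the view-B cloud
        if idx:                                     # a supporting point exists in view B
            kept_a.append(p)
            kept_b_idx.update(idx)
    fused = np.vstack([np.asarray(kept_a).reshape(-1, 3),
                       cloud_b[sorted(kept_b_idx)]])
    return fused                                    # fused point cloud with false points removed
```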
To illustrate the performance of the proposed algorithm, real data processing results are presented in fig. 6a, fig. 6b and fig. 6c. Fig. 6a and fig. 6b are the point cloud results generated from data acquired by the airborne array interferometric SAR system along two flight routes with different look-down angles; the results show that the slopes of the straight lines along which the multipath point clouds lie differ between the two viewing angles. The point cloud after the fusion processing is shown in fig. 6c. The result shows that the false point clouds (the multipath generated by secondary scattering between the building and the ground) are suppressed in the point cloud fused from the two observation viewing angles. Therefore, by fusing the point cloud data of two different observation viewing angles, the false point clouds generated by secondary scattering between the ground and buildings in an array interferometric SAR system can be removed, improving the quality of the point clouds generated by the system.

Claims (5)

1. An array interference SAR point cloud fusion method based on double observation angles is characterized by comprising the following steps:
step 1, performing observation under two viewing angles, obtaining the radar echo signals of the respective viewing angles, and obtaining the point cloud result under each viewing angle with an array interferometric SAR point cloud reconstruction algorithm;
step 2, performing point cloud filtering on the point clouds of the different observation viewing angles with a point cloud filtering method based on spatial distribution, so as to improve the point cloud quality under each viewing angle;
step 3, performing point cloud matching with a point cloud matching method based on the iterative closest point: the point cloud result obtained under observation viewing angle A is taken as the reference point cloud set, and the point cloud obtained under observation viewing angle B is matched to the reference point cloud set; after the matching is completed, the point cloud results of the two observation viewing angles overlap in position;
step 4, retrieving the reference point cloud set point by point and judging whether the point cloud result under observation viewing angle B contains points whose distance to the retrieved point is smaller than the set threshold; if such points exist, the retrieved point and the qualifying points of the point cloud set under observation viewing angle B are retained; otherwise, the retrieval continues with the next point; this operation is repeated until all points of the reference point cloud set have been retrieved; the points retained after the retrieval of all reference points are the fused point cloud result of the two observation viewing angles.
2. The array interference SAR point cloud fusion method based on double observation angles according to claim 1, wherein in step 1, a wavenumber-domain (ωk) imaging algorithm with motion compensation based on positioning and orientation system (POS) data is adopted to obtain the multi-channel two-dimensional SAR imaging results, and image registration is then carried out on the two-dimensional images of the multiple channels; the images of all channels are first coarsely registered using the POS data and the corresponding radar parameters, and the multi-channel SAR complex images are then finely registered using the maximum interferometric-phase signal-to-noise-ratio criterion; next, the interferometric phase information is used to compensate the phase of each channel so that it meets the basic requirements of tomographic processing; after the amplitude and phase compensation is completed, a compressed sensing algorithm is adopted for elevation reconstruction to obtain the point cloud result in slant range-elevation coordinates; baseline calibration is completed with calibration points to obtain accurate baseline parameters; finally, coordinate transformation is carried out to convert the point cloud from the slant range-elevation coordinate system to the geodetic coordinate system, giving the final point cloud result.
3. The array interference SAR point cloud fusion method based on double observation angles according to claim 2, wherein in step 2, a K-neighborhood statistical analysis is performed on each point of the point cloud, and the average distance from the point to its K nearest neighbors is calculated; assuming that the resulting distances follow a Gaussian distribution whose shape depends on the mean and the standard deviation, points whose average distance lies outside the given threshold range are removed, comprising:
step (2.1), calculating the distances d to the K nearest neighbors of each point, and calculating their sum D;
step (2.2), adding up the sums D obtained for all points to obtain the total D2 of the distance sums of all points;
step (2.3), calculating the mean value D3 = D2/M of the distance sums, where M is the total number of points in the point cloud;
step (2.4), calculating the standard deviation std(D) of the distance sums D of all points;
step (2.5), obtaining the outlier distance threshold D3 + sig × std(D) of the noise points;
step (2.6), processing every point of the point cloud and eliminating points whose distance sum exceeds the outlier distance threshold.
4. The array interference SAR point cloud fusion method based on double observation angles according to claim 3, wherein in step 3, the iterative closest point (ICP) registration algorithm registers the point clouds with an iterative strategy; assuming that the point cloud P obtained under observation viewing angle B is to be registered onto the reference point cloud Q, the algorithm comprises the following steps:
step (3.1), first finding the corresponding point pairs between P and Q: all points in P are traversed and the following operation is performed; assuming the current traversal has reached point p_i, p_i is placed into the space formed by all points of Q and the point q_i of Q nearest to p_i is searched; if the distance between the two points satisfies ||p_i − q_i|| < ξ, (p_i, q_i) is taken as a corresponding point pair, where ξ is the preset distance threshold; after all points of P have been traversed, all corresponding point pairs are found, and the set of corresponding point pairs is Corr = {(p_i, q_i) | p_i ∈ P, q_i ∈ Q, ||p_i − q_i|| < ξ}, where i denotes the point index;
step (3.2), assuming the required transformation matrix is R, an error l_i = ||q_i − R·p_i||² can be constructed for every corresponding point pair (p_i, q_i) ∈ Corr; for all corresponding point pairs, an overall error function E(R) = Σ_{i=1}^{N} ||q_i − R·p_i||² is constructed by summing the errors of the individual points, where N is the number of corresponding point pairs; by minimizing the error function the problem is converted into a least-squares problem, from which the transformation matrix R is obtained;
step (3.3), after the transformation matrix R is obtained, the point cloud set P is transformed to obtain the position-updated point cloud set P′; steps (3.1) to (3.3) are repeated until the termination condition is met, and the iteration is stopped.
5. The array interference SAR point cloud fusion method based on double observation angles according to claim 4, wherein in step 4, the multipath distribution characteristics between the building and the ground are analyzed first; when the difference between the two observation viewing angles is large and the slopes of the lines along which the multipath points are distributed differ obviously, the point clouds of the two observation viewing angles are fused using this multipath distribution characteristic, so that the false point clouds are removed; the specific implementation is as follows: the point cloud set obtained under observation viewing angle A is taken as the reference point cloud set and is retrieved point by point; for every retrieved point, a search is performed in the point cloud set obtained under observation viewing angle B to judge whether any point exists within the sphere centered on the retrieved point with radius l; if such points exist, the information of the found points and of the retrieved point is stored in array C; if no point exists within that range, the retrieval continues with the next point; this operation is repeated until the retrieval and search of all points are completed; after the above steps are completed, the points stored in array C constitute the fused point cloud of the two observation viewing angles.
CN202311754350.2A 2023-12-20 2023-12-20 Array interference SAR point cloud fusion method based on double observation visual angles Pending CN117741662A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311754350.2A CN117741662A (en) 2023-12-20 2023-12-20 Array interference SAR point cloud fusion method based on double observation visual angles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311754350.2A CN117741662A (en) 2023-12-20 2023-12-20 Array interference SAR point cloud fusion method based on double observation visual angles

Publications (1)

Publication Number Publication Date
CN117741662A true CN117741662A (en) 2024-03-22

Family

ID=90276989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311754350.2A Pending CN117741662A (en) 2023-12-20 2023-12-20 Array interference SAR point cloud fusion method based on double observation visual angles

Country Status (1)

Country Link
CN (1) CN117741662A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110058237A (en) * 2019-05-22 2019-07-26 中南大学 InSAR point Yun Ronghe and three-dimensional deformation monitoring method towards High-resolution SAR Images
CN110658530A (en) * 2019-08-01 2020-01-07 北京联合大学 Map construction method and system based on double-laser-radar data fusion and map
CN112348864A (en) * 2020-11-11 2021-02-09 湖南大学 Three-dimensional point cloud automatic registration method for laser contour features of fusion line
CN112669359A (en) * 2021-01-14 2021-04-16 武汉理工大学 Three-dimensional point cloud registration method, device, equipment and storage medium
WO2022165876A1 (en) * 2021-02-06 2022-08-11 湖南大学 Wgan-based unsupervised multi-view three-dimensional point cloud joint registration method
CN113419242A (en) * 2021-06-22 2021-09-21 中国科学院空天信息创新研究院 Chromatographic SAR whole scene spot cloud acquisition method and device thereof
CN115616505A (en) * 2022-09-08 2023-01-17 中国测绘科学研究院 Three-dimensional point cloud registration method for array interference synthetic aperture radar
CN116299454A (en) * 2023-02-28 2023-06-23 中国电子科技集团公司第十四研究所 Three-dimensional imaging method of space target ISAR based on multi-view fusion
CN116740151A (en) * 2023-06-16 2023-09-12 中南大学 InSAR point cloud registration method and terminal equipment
CN116823895A (en) * 2023-06-25 2023-09-29 东南大学 Variable template-based RGB-D camera multi-view matching digital image calculation method and system
CN117148352A (en) * 2023-10-31 2023-12-01 中国科学院空天信息创新研究院 Array interference SAR three-dimensional imaging method with angle uniqueness constraint

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
XIAOHUA TONG: "Automatic Registration of Very Low Overlapping Array InSAR Point Clouds in Urban Scenes", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, vol. 40, 21 February 2022 (2022-02-21) *
XIAOWAN LI: "Fourfold Bounce Scattering-Based Reconstruction of Building Backs Using Airborne Array TomoSAR Point Clouds", REMOTE SENSING, vol. 14, no. 8, 17 April 2022 (2022-04-17) *
朱庆; 李世明; 胡翰; 钟若飞; 吴波; 谢林甫: "Review of multi-point-cloud data fusion methods for three-dimensional city modeling", Geomatics and Information Science of Wuhan University, no. 12, 25 September 2018 (2018-09-25) *
李晓婉: "High-precision three-dimensional reconstruction method for TomoSAR point clouds in mountainous areas based on geometry-constrained moving least squares", Journal of Radars, vol. 11, no. 3, 11 May 2022 (2022-05-11) *

Similar Documents

Publication Publication Date Title
CN110244302B (en) Three-dimensional transformation method for image pixel coordinates of foundation synthetic aperture radar
CN103345757B (en) Optics under multilevel multi-feature constraint and SAR image autoegistration method
CN111948654B (en) Airborne tomography SAR three-dimensional point cloud generation method
CN101738614B (en) Method for estimating target rotation of inverse synthetic aperture radar based on time-space image sequence
CN109188433B (en) Control point-free dual-onboard SAR image target positioning method
CN110488288B (en) Airborne SAR high-resolution tomography method
CN108872985A (en) A kind of near field circumference SAR rapid three dimensional imaging process
Dong et al. Radargrammetric DSM generation in mountainous areas through adaptive-window least squares matching constrained by enhanced epipolar geometry
CN114035188B (en) High-precision monitoring method and system for glacier flow velocity of ground-based radar
RU2444750C2 (en) Method of determining elevation coordinate of low-flying target
CN113176544B (en) Mismatching correction method for slope radar image and terrain point cloud
CN113534110A (en) Static calibration method for multi-laser radar system
CN112986996A (en) Multi-source SAR satellite combined three-dimensional positioning method based on geometric entropy
CN117741662A (en) Array interference SAR point cloud fusion method based on double observation visual angles
CN110276240B (en) SAR image building wall window information extraction method
CN113610902B (en) Ground-based real aperture radar and point cloud data mapping registration method
Song et al. Registration for 3D LiDAR datasets using pyramid reference object
CN115616505A (en) Three-dimensional point cloud registration method for array interference synthetic aperture radar
Chen et al. 3d map building based on stereo vision
RU2406071C1 (en) Method of mobile object navigation
CN115393543A (en) Array SAR point cloud three-dimensional reconstruction method based on building multiple scattering
CN113534130A (en) Multi-station radar multi-target data association method based on sight angle
Xu et al. Registration of airborne LiDAR bathymetry seafloor point clouds based on the adaptive matching of corresponding points
Linglin et al. A fast SAR image position algorithm for maritime target location
Liu et al. Multiple natural features fusion for on-site calibration of LiDAR boresight angle misalignment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination