CN116047465A - Distance and angle measurement method for laser imaging radar to space non-cooperative target - Google Patents

Distance and angle measurement method for laser imaging radar to space non-cooperative target

Info

Publication number
CN116047465A
CN116047465A
Authority
CN
China
Prior art keywords
target
point cloud
point
voxel
laser imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310134586.XA
Other languages
Chinese (zh)
Inventor
王凤香 (Wang Fengxiang)
黄庚华 (Huang Genghua)
刘鸿彬 (Liu Hongbin)
舒嵘 (Shu Rong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Institute of Technical Physics of CAS
Original Assignee
Shanghai Institute of Technical Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Institute of Technical Physics of CAS filed Critical Shanghai Institute of Technical Physics of CAS
Priority to CN202310134586.XA priority Critical patent/CN116047465A/en
Publication of CN116047465A publication Critical patent/CN116047465A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/50 Systems of measurement based on relative movement of target
    • G01S17/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a distance and angle measurement method for a laser imaging radar observing a space non-cooperative target, and relates in particular to the technical field of aerospace. The method comprises the following steps: S1, performing denoising and uniform-sampling preprocessing on the target three-dimensional point cloud data acquired by the laser imaging radar; S2, segmenting the preprocessed point cloud to distinguish the different components of the target and extract the target region of interest; S3, identifying the target body: the target point cloud is divided into clusters; the three-dimensional center and flatness features of each cluster are calculated, regions such as solar panels with larger area and flatness are removed by comparing centers and flatness, and the point cloud of the body region is extracted; S4, estimating the centroid; S5, calculating the distance and the angles. The invention solves the problem that existing measurement systems cannot realize tracking measurement of non-cooperative targets, can serve as a mutual backup with existing ranging and angle-measuring systems, and improves system reliability and redundancy.

Description

Distance and angle measurement method for laser imaging radar to space non-cooperative target
Technical Field
The invention relates to the technical field of aerospace, and in particular to a distance and angle measurement method for a laser imaging radar observing a space non-cooperative target.
Background
During space rendezvous and docking, parameters such as the relative distance, azimuth angle and elevation angle between the tracking spacecraft and the target must be measured in the medium-to-short-range phase and used as key parameters for controlling the tracking spacecraft's approach to the target. As one of the important measurement sensors in a space rendezvous and docking measurement system, the laser imaging radar can directly image the target in three dimensions and obtain the required relative motion parameters such as distance, angle and attitude by processing the three-dimensional image. It is largely insensitive to ambient illumination and therefore has great potential in space applications.
At present, rendezvous and docking measurement technology for cooperative targets is mature: the target carries cooperative equipment such as a radar transponder and corner reflectors, and at medium and short range a combined measurement system consisting of a laser rangefinder and an optical imaging sensor completes the measurement of relative distance and bearing. For non-cooperative targets, a combination of a scanning laser rangefinder and an optical camera is currently in common use. The laser rangefinder and the camera are mounted on a structural frame that can be scanned about two axes; an infrared or visible-light camera performs target search and azimuth measurement and then directs the laser rangefinder to point at the target and measure the relative distance. A measurement system of this kind must point accurately at the target and must have tracking capability to keep the laser rangefinder directed at the target. However, the laser rangefinder has only single-point ranging capability and the optical camera is sensitive to lighting, so the measurement capability is limited.
Disclosure of Invention
The purpose of the invention is to provide a distance and angle measurement method for a laser imaging radar observing a space non-cooperative target, so as to solve the problem that existing measurement systems cannot realize tracking measurement of non-cooperative targets.
To achieve the above purpose, the technical scheme of the invention is as follows: a distance and angle measurement method for a laser imaging radar to a space non-cooperative target comprises the following steps:
S1, performing denoising and uniform-sampling preprocessing on the target three-dimensional point cloud data acquired by the laser imaging radar;
S2, segmenting the preprocessed point cloud to distinguish the different components of the target and extract the target region of interest;
S3, identifying the target body: the target point cloud is divided into clusters; the three-dimensional center and flatness features of each cluster are calculated, and the solar panel regions with larger area and flatness are removed by comparing centers and flatness, so that the point cloud of the body region is extracted;
S4, centroid estimation: after the target body point cloud q is identified, its centroid coordinates $(X_q, Y_q, Z_q)$ are calculated as

$$X_q = \frac{1}{n}\sum_{i=1}^{n} x_i,\qquad Y_q = \frac{1}{n}\sum_{i=1}^{n} y_i,\qquad Z_q = \frac{1}{n}\sum_{i=1}^{n} z_i$$

where n is the number of points in the body point cloud q and $(x_i, y_i, z_i)$ are the coordinates of the three-dimensional points in the body point cloud;
S5, distance and angle calculation: after the target centroid is extracted, the distance d between the target centroid and the measurement origin of the laser imaging radar along the line of sight, the azimuth angle $\theta_x$ and the elevation angle $\theta_z$ are calculated as

$$d = \sqrt{X_q^2 + Y_q^2 + Z_q^2}$$

$$\theta_x = \arctan\!\left(\frac{X_q}{Y_q}\right)$$

$$\theta_z = \arctan\!\left(\frac{Z_q}{\sqrt{X_q^2 + Y_q^2}}\right)$$

with the Y axis of the radar measurement coordinate system taken along the line of sight.
Further, the specific steps of the preprocessing in step S1 are as follows:
S1.1, denoising based on local distance statistical analysis is performed on the target three-dimensional point cloud data acquired by the laser imaging radar to remove discrete spatial noise points from the measured point cloud; the neighborhood of each point is statistically analyzed and the average distance from each point to its neighboring points is calculated; the mean and standard deviation of the distribution of these average distances are then computed, and points whose average distance exceeds the threshold defined by the global mean and standard deviation are regarded as outliers and removed;
S1.2, the point cloud is uniformly sampled with a voxel-filtering-based downsampling method; the point cloud is divided into a three-dimensional voxel grid at a fixed spacing; one point is retained in each voxel, the point closest to the voxel center being chosen as the retained point, and after all voxels are processed a uniformly sampled point cloud is obtained.
Further, the segmentation method in step S2 is as follows: the point cloud is voxelized in three dimensions, each voxel is then scanned for connected-component labeling, and finally the regions possessing the connectivity property are grouped into one class; the method specifically comprises the following steps:
S2.1, three-dimensional voxelization: the maximum and minimum values of the target point cloud data P along the three coordinate directions X, Y and Z are calculated, and the smallest cuboid enclosing all points is determined; the voxel size d is set, and the smallest cuboid is divided accordingly into a set C of n voxels; finally, for each point p(x, y, z) in the point cloud data the voxel index C(i, j, k) containing it is determined, and blank voxels are deleted; the voxel of each point is determined as follows: for a point p(x, y, z) ∈ P in the point cloud data, the voxel index of p is (i, j, k), where

$$i = \left\lfloor \frac{x - x_{\min}}{d} \right\rfloor,\qquad j = \left\lfloor \frac{y - y_{\min}}{d} \right\rfloor,\qquad k = \left\lfloor \frac{z - z_{\min}}{d} \right\rfloor$$

in the above formula, d is the voxel size, $x_{\min}$ is the minimum value of the point cloud data in the X direction, $y_{\min}$ is the minimum value in the Y direction, and $z_{\min}$ is the minimum value in the Z direction;
S2.2, connectivity checking: for each $C_k \in C$ (k = 1, 2, …, n), check whether there are connected voxels in its 6-neighborhood or 24-neighborhood; if there are, the points within the voxel and the points within the neighboring voxels are grouped into one class and labeled with the class number, otherwise the voxel is regarded as a new class; all voxel grids are traversed until every voxel is labeled.
Compared with the prior art, the beneficial effects of this scheme are as follows:
The scheme realizes ranging and angle measurement of a space non-cooperative target based on a laser imaging radar; through denoising, uniform sampling and clustering-based segmentation, the target point cloud of the region of interest is extracted accurately, so that high-precision measurement of the target centroid is achieved. Compared with the prior art, the laser imaging radar can measure the distance and the angle of the target simultaneously and with high precision, can serve as a mutual backup with existing ranging and angle-measuring systems, and improves system reliability and redundancy.
Drawings
FIG. 1 is a flow chart of the distance and angle measurement method for a laser imaging radar to a space non-cooperative target according to the present invention;
FIG. 2 is a schematic diagram of the ranging and angle measurement of the target by the laser imaging radar in this embodiment;
FIG. 3 shows the target point cloud data used in this embodiment;
FIG. 4 shows the identification result of the target body region and the estimated centroid position in this embodiment.
Detailed Description
The invention is described in further detail below by way of a specific embodiment:
Embodiment
The invention takes a common artificial-satellite configuration with solar panels as an embodiment and simulates the ranging and angle-measurement process of a laser imaging radar on the target using a multi-line laser imaging radar simulation technique. FIG. 2 shows a schematic diagram of the measurement process. In the simulation design, the laser imaging radar images the target from a distance of 200 m and generates target three-dimensional image data that conforms to the real physical process according to characteristics such as the incidence angle and the target materials.
As shown in fig. 1, a method for measuring distance and angle of a laser imaging radar to a space non-cooperative target comprises the following steps:
S1, performing denoising and uniform-sampling preprocessing on the target three-dimensional point cloud data acquired by the laser imaging radar; the specific preprocessing steps are as follows:
S1.1, an instantaneous three-dimensional point cloud image of the target is generated by a simulation program adapted to the laser imaging radar shown in FIG. 2; denoising based on local distance statistical analysis is performed on the target three-dimensional point cloud data to remove discrete spatial noise points from the measured point cloud; the neighborhood of each point is statistically analyzed and the average distance from each point to its neighboring points is calculated; the mean and standard deviation of the distribution of these average distances are computed, and points whose average distance exceeds the threshold defined by the global mean and standard deviation (in this embodiment, points beyond two times the deviation are regarded as noise points) are treated as outliers and removed.
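The statistical outlier removal of step S1.1 can be sketched in a few lines; the sketch below is illustrative only, and the neighborhood size k = 20 and the factor of 2 on the standard deviation are assumed values rather than parameters taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_statistical_outliers(points, k=20, std_ratio=2.0):
    """points: (N, 3) array of XYZ coordinates; returns the inlier points."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)         # column 0 is the point itself
    mean_dist = dists[:, 1:].mean(axis=1)          # average distance to the k neighbors
    mu, sigma = mean_dist.mean(), mean_dist.std()  # distribution of the average distances
    keep = mean_dist <= mu + std_ratio * sigma     # points beyond the threshold are outliers
    return points[keep]
```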
S1.2, to reduce the influence of point cloud density on the target centroid estimation, the point cloud is uniformly sampled with a voxel-filtering-based downsampling method; in this embodiment 1000 points of target data are obtained after sampling. FIG. 3 shows the original target three-dimensional point cloud data (1) and the preprocessed point cloud data (2). The point cloud is divided into a three-dimensional voxel grid at a fixed spacing; one point is retained in each voxel, the point closest to the voxel center being chosen as the retained point, and after all voxels are processed a uniformly sampled point cloud is obtained.
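A minimal sketch of the voxel-filtering downsampling of step S1.2, keeping in each occupied voxel the point closest to the voxel center; the voxel size is a free parameter here and would in practice be tuned so that roughly 1000 points remain, as in this embodiment.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """points: (N, 3) array; returns one point per occupied voxel."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(np.int64)  # voxel index per point
    centers = mins + (idx + 0.5) * voxel_size                      # center of each point's voxel
    dist = np.linalg.norm(points - centers, axis=1)
    best = {}                                                      # voxel -> (point index, distance)
    for n, (key, d) in enumerate(zip(map(tuple, idx), dist)):
        if key not in best or d < best[key][1]:
            best[key] = (n, d)
    return points[[n for n, _ in best.values()]]
```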
S2, segmenting the preprocessed point cloud to distinguish the different components of the target and extract the target region of interest; the segmentation method is as follows: the point cloud is voxelized in three dimensions, each voxel is then scanned for connected-component labeling, and finally the regions possessing the connectivity property are grouped into one class; in this embodiment voxels of size 0.1 m are used and blank voxels are deleted; the specific steps are as follows:
S2.1, three-dimensional voxelization: the maximum and minimum values of the target point cloud data P along the three coordinate directions X, Y and Z are calculated, and the smallest cuboid enclosing all points is determined; the voxel size d is set, and the smallest cuboid is divided accordingly into a set C of n voxels; finally, for each point p(x, y, z) in the point cloud data the voxel index C(i, j, k) containing it is determined, and blank voxels are deleted; the voxel of each point is determined as follows: for a point p(x, y, z) ∈ P in the point cloud data, the voxel index of p is (i, j, k), where

$$i = \left\lfloor \frac{x - x_{\min}}{d} \right\rfloor,\qquad j = \left\lfloor \frac{y - y_{\min}}{d} \right\rfloor,\qquad k = \left\lfloor \frac{z - z_{\min}}{d} \right\rfloor$$

in the above formula, d is the voxel size, $x_{\min}$ is the minimum value of the point cloud data in the X direction, $y_{\min}$ is the minimum value in the Y direction, and $z_{\min}$ is the minimum value in the Z direction;
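A short sketch of the voxelization of step S2.1, using the index formula above with d = 0.1 m as in this embodiment; only occupied voxels are stored in a dictionary, so blank voxels simply never appear (a convenience assumed for the illustration).

```python
import numpy as np
from collections import defaultdict

def voxelize(points, d=0.1):
    """Returns a dict mapping voxel index (i, j, k) -> list of point indices."""
    mins = points.min(axis=0)                              # (x_min, y_min, z_min)
    ijk = np.floor((points - mins) / d).astype(np.int64)   # i = floor((x - x_min) / d), etc.
    voxels = defaultdict(list)
    for n, key in enumerate(map(tuple, ijk)):
        voxels[key].append(n)
    return voxels
```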
S2.2, connectivity checking: for each $C_k \in C$ (k = 1, 2, …, n), check whether there are connected voxels in its 6-neighborhood or 24-neighborhood; if there are, the points within the voxel and the points within the neighboring voxels are grouped into one class and labeled with the class number, otherwise the voxel is regarded as a new class; all voxel grids are traversed until every voxel is labeled.
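The connectivity check of step S2.2 can be illustrated as a breadth-first search over the occupied voxels; the sketch below uses only the 6-neighborhood and assumes the voxel dictionary produced by the voxelization sketch above.

```python
from collections import deque

def label_connected_voxels(voxels):
    """voxels: dict of (i, j, k) -> point indices; returns (i, j, k) -> class number."""
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    labels, current = {}, 0
    for seed in voxels:
        if seed in labels:
            continue
        current += 1                              # start a new class
        labels[seed] = current
        queue = deque([seed])
        while queue:
            i, j, k = queue.popleft()
            for di, dj, dk in neighbors:          # connected voxels in the 6-neighborhood
                nb = (i + di, j + dj, k + dk)
                if nb in voxels and nb not in labels:
                    labels[nb] = current
                    queue.append(nb)
    return labels
```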
S3, identifying the target body: after connectivity labeling, the target point cloud is divided into clusters according to connectivity. The three-dimensional center and flatness features of each cluster are calculated, and regions such as solar panels with larger area and flatness are removed by comparing centers and flatness, so that the point cloud of the body region is extracted.
The extraction method is as follows: let the target be divided into k classes; the barycentric coordinates $C_i(X, Y, Z)$, i = 1, 2, …, k, of each class are calculated as
$$X = \frac{1}{n_i}\sum_{j=1}^{n_i} x_j,\qquad Y = \frac{1}{n_i}\sum_{j=1}^{n_i} y_j,\qquad Z = \frac{1}{n_i}\sum_{j=1}^{n_i} z_j$$

where $n_i$ is the number of points in class i and $(x_j, y_j, z_j)$ are the points of that class.
Then the aspect ratio of each class is calculated: the covariance matrix of each class's point cloud is computed and eigen-decomposed, the eigenvalues $(d_1, d_2, d_3)$ are sorted in descending order, and the aspect ratio F is

$$F = d_1 / d_2$$

In this embodiment, when F < 1.5 and the centroid lies near the center of the bounding box, the cluster is determined to be the body point cloud. FIG. 4 shows the target body-region point cloud extracted in this embodiment.
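A sketch of the body-selection rule of step S3: for each cluster the centroid and the eigenvalues of the covariance matrix are computed, the aspect ratio F = d1/d2 is formed, and clusters with F < 1.5 whose centroid lies near the center of the overall bounding box are kept; the 10 % tolerance used to define "near the center" is an assumption made only for this illustration.

```python
import numpy as np

def select_body_points(clusters, f_max=1.5, tol=0.1):
    """clusters: list of (Ni, 3) arrays; returns the points of the retained body clusters."""
    all_pts = np.vstack(clusters)
    lo, hi = all_pts.min(axis=0), all_pts.max(axis=0)
    box_center, box_size = 0.5 * (lo + hi), hi - lo
    body = []
    for pts in clusters:
        centroid = pts.mean(axis=0)
        eig = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))[::-1]   # d1 >= d2 >= d3
        F = eig[0] / (eig[1] + 1e-12)                            # aspect ratio d1 / d2
        near_center = np.all(np.abs(centroid - box_center) <= tol * box_size)
        if F < f_max and near_center:
            body.append(pts)
    return np.vstack(body) if body else np.empty((0, 3))
```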
S4, centroid estimation: after the target body point cloud q is identified, its centroid coordinates $(X_q, Y_q, Z_q)$ are calculated as

$$X_q = \frac{1}{n}\sum_{i=1}^{n} x_i,\qquad Y_q = \frac{1}{n}\sum_{i=1}^{n} y_i,\qquad Z_q = \frac{1}{n}\sum_{i=1}^{n} z_i$$

where n is the number of points in the body point cloud q and $(x_i, y_i, z_i)$ are the coordinates of the three-dimensional points in the body point cloud.
S5, distance and angle calculation: after the target centroid is extracted, the distance d between the target centroid and the measurement origin of the laser imaging radar, the azimuth angle $\theta_x$ and the elevation angle $\theta_z$ of the line of sight are calculated as

$$d = \sqrt{X_q^2 + Y_q^2 + Z_q^2}$$

$$\theta_x = \arctan\!\left(\frac{X_q}{Y_q}\right)$$

$$\theta_z = \arctan\!\left(\frac{Z_q}{\sqrt{X_q^2 + Y_q^2}}\right)$$
In this embodiment, the calculated target distance is d = 199.54 m, the target azimuth angle is $\theta_x$ = 3.83°, and the elevation angle is $\theta_z$ = 2.23°.
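Steps S4 and S5 reduce to a few lines once the body point cloud is available. In the sketch below the Y axis of the radar measurement frame is assumed to lie along the boresight, matching the angle formulas as reconstructed above; since the original equation images are not reproduced in this text, the exact axis convention is an assumption.

```python
import numpy as np

def range_and_angles(body_points):
    """body_points: (N, 3) array of the body point cloud in the radar measurement frame."""
    Xq, Yq, Zq = body_points.mean(axis=0)                    # centroid of the body point cloud (S4)
    d = np.sqrt(Xq**2 + Yq**2 + Zq**2)                       # distance from the measurement origin
    theta_x = np.degrees(np.arctan2(Xq, Yq))                 # azimuth angle
    theta_z = np.degrees(np.arctan2(Zq, np.hypot(Xq, Yq)))   # elevation (high-low) angle
    return d, theta_x, theta_z
```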
The foregoing is merely an embodiment of the present invention; common knowledge such as specific structures and characteristics well known in the art is not described in detail here. It should be noted that a person skilled in the art can make modifications and improvements without departing from the structure of the present invention, and these should also be regarded as falling within the protection scope of the present invention without affecting the effect of its implementation or the utility of the patent. The protection scope of this application is subject to the content of the claims, and the description of the specific embodiments in the specification may be used to interpret the content of the claims.

Claims (3)

1. A distance and angle measurement method for a laser imaging radar to a space non-cooperative target, characterized in that the method comprises the following steps:
S1, performing denoising and uniform-sampling preprocessing on the target three-dimensional point cloud data acquired by the laser imaging radar;
S2, segmenting the preprocessed point cloud to distinguish the different components of the target and extract the target region of interest;
S3, identifying the target body: the target point cloud is divided into clusters; the three-dimensional center and flatness features of each cluster are calculated, and the solar panel regions with larger area and flatness are removed by comparing centers and flatness, so that the point cloud of the body region is extracted;
S4, centroid estimation: after the target body point cloud q is identified, its centroid coordinates $(X_q, Y_q, Z_q)$ are calculated as

$$X_q = \frac{1}{n}\sum_{i=1}^{n} x_i,\qquad Y_q = \frac{1}{n}\sum_{i=1}^{n} y_i,\qquad Z_q = \frac{1}{n}\sum_{i=1}^{n} z_i$$

where n is the number of points in the body point cloud q and $(x_i, y_i, z_i)$ are the coordinates of the three-dimensional points in the body point cloud;
S5, distance and angle calculation: after the target centroid is extracted, the distance d between the target centroid and the measurement origin of the laser imaging radar, the azimuth angle $\theta_x$ and the elevation angle $\theta_z$ of the line of sight are calculated as

$$d = \sqrt{X_q^2 + Y_q^2 + Z_q^2}$$

$$\theta_x = \arctan\!\left(\frac{X_q}{Y_q}\right)$$

$$\theta_z = \arctan\!\left(\frac{Z_q}{\sqrt{X_q^2 + Y_q^2}}\right)$$
2. The distance and angle measurement method for a laser imaging radar to a space non-cooperative target according to claim 1, characterized in that the specific steps of the preprocessing in step S1 are as follows:
S1.1, denoising based on local distance statistical analysis is performed on the target three-dimensional point cloud data acquired by the laser imaging radar to remove discrete spatial noise points from the measured point cloud; the neighborhood of each point is statistically analyzed and the average distance from each point to its neighboring points is calculated; the mean and standard deviation of the distribution of these average distances are then computed, and points whose average distance exceeds the threshold defined by the global mean and standard deviation are regarded as outliers and removed;
S1.2, the point cloud is uniformly sampled with a voxel-filtering-based downsampling method; the point cloud is divided into a three-dimensional voxel grid at a fixed spacing; one point is retained in each voxel, the point closest to the voxel center being chosen as the retained point, and after all voxels are processed a uniformly sampled point cloud is obtained.
3. The distance and angle measurement method for a laser imaging radar to a space non-cooperative target according to claim 1, characterized in that the segmentation method in step S2 is as follows: the point cloud is voxelized in three dimensions, each voxel is then scanned for connected-component labeling, and finally the regions possessing the connectivity property are grouped into one class; the method specifically comprises the following steps:
S2.1, three-dimensional voxelization: the maximum and minimum values of the target point cloud data P along the three coordinate directions X, Y and Z are calculated, and the smallest cuboid enclosing all points is determined; the voxel size d is set, and the smallest cuboid is divided accordingly into a set C of n voxels; finally, for each point p(x, y, z) in the point cloud data the voxel index C(i, j, k) containing it is determined, and blank voxels are deleted; the voxel of each point is determined as follows: for a point p(x, y, z) ∈ P in the point cloud data, the voxel index of p is (i, j, k), where

$$i = \left\lfloor \frac{x - x_{\min}}{d} \right\rfloor,\qquad j = \left\lfloor \frac{y - y_{\min}}{d} \right\rfloor,\qquad k = \left\lfloor \frac{z - z_{\min}}{d} \right\rfloor$$

in the above formula, d is the voxel size, $x_{\min}$ is the minimum value of the point cloud data in the X direction, $y_{\min}$ is the minimum value in the Y direction, and $z_{\min}$ is the minimum value in the Z direction;
S2.2, connectivity checking: for each $C_k \in C$ (k = 1, 2, …, n), check whether there are connected voxels in its 6-neighborhood or 24-neighborhood; if there are, the points within the voxel and the points within the neighboring voxels are grouped into one class and labeled with the class number, otherwise the voxel is regarded as a new class; all voxel grids are traversed until every voxel is labeled.
CN202310134586.XA 2023-02-17 2023-02-17 Distance and angle measurement method for laser imaging radar to space non-cooperative target Pending CN116047465A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310134586.XA CN116047465A (en) 2023-02-17 2023-02-17 Distance and angle measurement method for laser imaging radar to space non-cooperative target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310134586.XA CN116047465A (en) 2023-02-17 2023-02-17 Distance and angle measurement method for laser imaging radar to space non-cooperative target

Publications (1)

Publication Number Publication Date
CN116047465A true CN116047465A (en) 2023-05-02

Family

ID=86129943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310134586.XA Pending CN116047465A (en) 2023-02-17 2023-02-17 Distance and angle measurement method for laser imaging radar to space non-cooperative target

Country Status (1)

Country Link
CN (1) CN116047465A (en)

Similar Documents

Publication Publication Date Title
CN110796728B (en) Non-cooperative spacecraft three-dimensional reconstruction method based on scanning laser radar
WO2016106955A1 (en) Laser infrared composite ground building recognition and navigation method
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
Park et al. Radar localization and mapping for indoor disaster environments via multi-modal registration to prior LiDAR map
CN113359097A (en) Millimeter wave radar and camera combined calibration method
CN110081881A (en) It is a kind of based on unmanned plane multi-sensor information fusion technology warship bootstrap technique
CN114296056A (en) Laser radar external parameter calibration method, device, equipment and storage medium
CN112365592B (en) Local environment feature description method based on bidirectional elevation model
JP7386136B2 (en) Cloud height measurement device, measurement point determination method, and cloud type determination method
CN115272616A (en) Indoor scene three-dimensional reconstruction method, system, device and storage medium
CN113253289A (en) Unmanned aerial vehicle detection tracking system implementation method based on combination of laser radar and vision
CN114035188B (en) High-precision monitoring method and system for glacier flow velocity of ground-based radar
CN107765257A (en) A kind of laser acquisition and measuring method based on the calibration of reflected intensity accessory external
CN109785388B (en) Short-distance accurate relative positioning method based on binocular camera
CN116047465A (en) Distance and angle measurement method for laser imaging radar to space non-cooperative target
CN113124821B (en) Structure measurement method based on curved mirror and plane mirror
CN114910892A (en) Laser radar calibration method and device, electronic equipment and storage medium
Zhang et al. Vision-based uav positioning method assisted by relative attitude classification
Ma et al. A novel method for measuring drogue-UAV relative pose in autonomous aerial refueling based on monocular vision
CN114742141A (en) Multi-source information data fusion studying and judging method based on ICP point cloud
CN109781259B (en) Method for accurately measuring infrared spectrum of small aerial moving target through spectrum correlation
CN113095324A (en) Classification and distance measurement method and system for cone barrel
Su et al. Accurate Pose Tracking for Uncooperative Targets via Data Fusion of Laser Scanner and Optical Camera
Kim et al. Digital surface model generation for drifting Arctic sea ice with low-textured surfaces based on drone images
Witzgall et al. Recovering spheres from 3D point data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination