CN109598787B - Target three-dimensional reconstruction method based on ISAR image - Google Patents


Publication number
CN109598787B
Authority
CN
China
Prior art keywords
matrix
images
isar
point
pixel
Prior art date
Legal status
Active
Application number
CN201811488292.2A
Other languages
Chinese (zh)
Other versions
CN109598787A (en)
Inventor
Zhang Lei (张磊)
Xu Tao (许涛)
Zhang Man (张曼)
Zhou Yejian (周叶剑)
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University
Priority to CN201811488292.2A
Publication of CN109598787A
Application granted
Publication of CN109598787B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; remote sensing
    • G06T 2207/10044 Radar image

Abstract

The invention discloses a target three-dimensional reconstruction method based on ISAR images, which mainly solves the problems of complex operation, heavy computation and low efficiency in the prior art. The implementation scheme is as follows: preprocess the n ISAR images by the statistical filtering DespecKS method to obtain n preprocessed ISAR images; extract feature points from the n preprocessed ISAR images with the corner tracking algorithm KLT and track them to obtain a feature point coordinate matrix W; take the feature point coordinate matrix W as input and obtain the target three-dimensional reconstruction result with the factorization method OFM. The method has the advantages of low computational load, high efficiency and high precision, and can be used for attitude estimation and target recognition.

Description

Target three-dimensional reconstruction method based on ISAR image
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a method for three-dimensional reconstruction of a target, which can be used for attitude estimation and target recognition.
Background
With the development of ISAR imaging technology, research on target three-dimensional structure, building on traditional two-dimensional imaging, is receiving increasing attention. The three-dimensional structure of a target contains richer contour information, making attitude estimation and target recognition easier to realize, which is of great practical significance for spatial situation awareness and space safety. Three-dimensional reconstruction technology is an important research topic in the field of image processing.
At present, three-dimensional reconstruction methods for an ISAR image mainly include two types:
One is the structure-from-motion method SFM, which performs three-dimensional reconstruction from an unordered collection of images, extracting image features with the scale-invariant feature transform (SIFT) algorithm and performing bundle adjustment (BA) to obtain a sparse three-dimensional point cloud of the target. Its drawback is that the bundle adjustment (BA) step is computationally heavy, making the method unsuitable for large-scale reconstruction;
The other is the factorization method OFM, which performs three-dimensional reconstruction from a sequence of two-dimensional ISAR images of the target, extracting image feature points with the corner tracking algorithm KLT and factorizing them to obtain the three-dimensional structure of the target. Its drawback is that solving the feature point coordinates requires complex and tedious iterative computation, so the method is inefficient and computationally heavy.
Disclosure of Invention
Aiming at the defects of the prior art, the invention improves the factorization method OFM by introducing a statistical filtering method to preprocess the two-dimensional ISAR (inverse synthetic aperture radar) images, and provides a target three-dimensional reconstruction method based on ISAR images, so as to reduce the computational load and improve efficiency.
To achieve the above object, the implementation scheme of the present invention comprises the following steps:
(1) Preprocessing the n ISAR images by the statistical filtering DespecKS method to obtain n preprocessed ISAR images, where n ≥ 3;
(1a) Setting an evaluation window with the size of L multiplied by L for each pixel point P of each ISAR image;
(1b) Performing KS detection on all pixel points in the evaluation window to obtain an evaluation value mu corresponding to each pixel point;
(1c) Setting an evaluation threshold μ₀ and selecting the pixel points P′ in the window whose evaluation value μ is greater than μ₀, i.e. the points having the same statistical distribution as the pixel point P;
(1d) Taking the arithmetic mean of the evaluation values μ of the two types of pixel points P and P′ to obtain the mean value μ̄, and setting the pixel values of P and P′ to μ̄;
Setting the pixel values of all the other pixel points in the ISAR image to be 0 to obtain a preprocessed ISAR image;
(2) Extracting feature points from the n preprocessed ISAR images with the corner tracking algorithm KLT and tracking them to obtain the feature point coordinate matrix W;
(3) Taking the feature point coordinate matrix W as input, obtaining the target three-dimensional reconstruction result with the factorization method OFM.
Compared with the prior art, the invention has the advantages that:
First, compared with the structure-from-motion method SFM, the invention does not require the computationally heavy bundle adjustment (BA) step; its computational load is small and it is suitable for large-scale reconstruction;
Second, compared with the conventional factorization method OFM, the statistical filtering DespecKS preprocessing of the ISAR images yields a sufficient number of feature point coordinates without complex and tedious iterative computation, further reducing the computational load and improving efficiency.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 shows the 3 ISAR images used by the present invention;
FIG. 3 shows the experimental results of preprocessing the 3 ISAR images by the statistical filtering DespecKS method of the present invention;
FIG. 4 shows the experimental results of extracting and tracking feature points from the 3 preprocessed ISAR images with the corner tracking algorithm KLT;
FIG. 5 shows the results of the three-dimensional reconstruction experiment of the target using the factorization method OFM of the present invention;
FIG. 6 shows an existing three-dimensional CAD model of the target;
FIG. 7 shows the results of further analysis of FIG. 5.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Referring to fig. 1, the implementation steps of the invention are as follows:
step 1, preprocessing n ISAR images.
In image analysis, image quality directly affects the design of the recognition algorithm and the precision of its results, so preprocessing is required before image analysis. The main purposes of image preprocessing are to eliminate irrelevant information, recover useful real information, enhance the detectability of relevant information, and simplify the data as much as possible, thereby improving the reliability of feature extraction, image segmentation, matching and recognition.
Existing image preprocessing methods mainly include graying, geometric transformation and image enhancement. This embodiment adopts the statistical filtering DespecKS method, an image enhancement method, for preprocessing, as follows:
(1.1) setting an evaluation window with the size of L multiplied by L for each pixel point P of each ISAR image;
(1.2) performing KS detection on all pixel points in the evaluation window to obtain an evaluation value mu corresponding to each pixel point;
(1.3) Setting an evaluation threshold μ₀ and selecting the pixel points P′ in the window whose evaluation value μ is greater than μ₀, i.e. the points having the same statistical distribution as the pixel point P;
(1.4) Taking the arithmetic mean of the evaluation values μ of the two types of pixel points P and P′ to obtain the mean value μ̄, and setting the pixel values of P and P′ to μ̄;
And setting the pixel values of all the other pixel points in the ISAR image to be 0 to obtain the preprocessed ISAR image.
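As an illustrative aside (not part of the patent text), the DespecKS-style preprocessing of steps (1.1)–(1.4) can be sketched in NumPy. The patent does not specify which samples the KS test compares, so this sketch adopts one plausible reading: the values of a small 3×3 patch around each pixel of the L×L evaluation window are compared against the patch around the center pixel P with a two-sample KS statistic, the evaluation value μ is taken as 1 minus that distance, and P is replaced by the arithmetic mean of the μ values of the statistically homogeneous pixels, following the literal rule of step (1.4). The function names and the 3×3 patch size are assumptions for illustration only.

```python
import numpy as np

def ks_stat(a, b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    # between the empirical CDFs of the samples a and b.
    a, b = np.sort(a.ravel()), np.sort(b.ravel())
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

def despeckle(img, L=5, mu0=0.5):
    # For each pixel P, run KS detection against every pixel of the LxL
    # evaluation window (step 1.2); pixels with evaluation value mu > mu0
    # are treated as statistically homogeneous with P (step 1.3) -- this
    # always includes P itself (mu = 1) -- and P is replaced by the
    # arithmetic mean of their mu values (the literal rule of step 1.4).
    img = np.asarray(img, dtype=float)
    H, W = img.shape
    out = np.empty_like(img)
    h = L // 2
    for y in range(H):
        for x in range(W):
            pa = img[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
            mus = []
            for yy in range(max(0, y - h), min(H, y + h + 1)):
                for xx in range(max(0, x - h), min(W, x + h + 1)):
                    pb = img[max(0, yy - 1):yy + 2, max(0, xx - 1):xx + 2]
                    mu = 1.0 - ks_stat(pa, pb)
                    if mu > mu0:
                        mus.append(mu)
            out[y, x] = np.mean(mus)
    return out
```

The patent's final step, zeroing all remaining pixels of the ISAR image, is omitted here because its exact scope is ambiguous in the translation; on a real magnitude image it would follow this filtering stage.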
Step 2: extract feature points from the preprocessed ISAR images and track them.
The methods for extracting and tracking feature points in image analysis mainly include the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), the histogram of oriented gradients (HOG), the difference of Gaussians (DoG) and the corner tracking algorithm KLT. This embodiment adopts the corner tracking algorithm KLT, which yields more robust feature points, as follows:
(2.1) Obtain a 2×2 matrix Z from the first ISAR image I(x) and the second ISAR image J(x):
Z = ∫∫_W g(x)·g^T(x)·ω(x)dx
where g(x) = [∂I/∂x  ∂I/∂y]^T is the gradient vector, x = [x y]^T is the pixel coordinate, W is a window of size a×a, ω(x) is a weight function, T denotes matrix transposition, ∂/∂x denotes the partial derivative with respect to x, and ∂/∂y the partial derivative with respect to y;
(2.2) Solve for the two eigenvalues λl (l = 1, 2) of the matrix Z;
(2.3) Set an eigenvalue threshold λ₀ and select as feature points the pixel coordinates x₁ whose two eigenvalues λl both exceed λ₀;
(2.4) From the selected feature points x₁, obtain the residual e:
e = ∫∫_W [I(x₁)−J(x₁)]·g(x₁)·ω(x₁)dx
(2.5) From the residual e, obtain the following equation:
Z·d = e
where d = [Δx Δy]^T denotes the offset, Δx the offset abscissa and Δy the offset ordinate;
(2.6) solving the equation in the step (2.5) by adopting a Newton-Raphson iteration method to obtain an offset d;
(2.7) Using the feature point coordinates x₁ and the offset d, track the feature points across the first two images to obtain the feature point coordinates W₁₂ of the first and second images, where W₁₂ is a 4×N₁₂ matrix and N₁₂ is the number of feature points tracked in the first two images;
(2.8) Starting from the selected feature point coordinates x₁, repeat (2.4)–(2.6) for the second and third ISAR images to track the feature points x₂ of the second and third images, obtaining the feature point coordinates W₂₃ of the second and third images, where W₂₃ is a 4×N₂₃ matrix and N₂₃ is the number of feature points tracked in the second and third images, N₂₃ ≤ N₁₂;
(2.9) Repeat (2.4)–(2.6) for the q-th and (q+1)-th ISAR images in turn to track the feature points x_q of the q-th and (q+1)-th images, obtaining the feature point coordinates Wq,q+1, a 4×Nq,q+1 matrix, where Nq,q+1 is the number of feature points tracked in the q-th and (q+1)-th images, q = 3, 4, ..., n−1;
(2.10) From the feature point coordinates W₁₂, W₂₃ and Wq,q+1, select the feature points x_{n−1} successfully tracked through all n images to obtain the final feature point coordinate matrix W, where W is a 2n×N matrix and N is the number of finally tracked feature points.
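As an illustrative aside (not part of the patent text), steps (2.1)–(2.6) can be sketched in NumPy for a single feature point: the 2×2 structure matrix Z is built from image gradients over an a×a window (a uniform weight ω = 1 is assumed), its eigenvalues give the corner test of steps (2.2)–(2.3), and the offset d is found by solving Z·d = e with Newton-Raphson iterations. The function names and the synthetic test image are illustrative assumptions.

```python
import numpy as np

def bilinear(img, ys, xs):
    # Sample img at fractional coordinates (ys, xs) by bilinear interpolation.
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    fy, fx = ys - y0, xs - x0
    return ((1 - fy) * (1 - fx) * img[y0, x0] +
            (1 - fy) * fx       * img[y0, x0 + 1] +
            fy       * (1 - fx) * img[y0 + 1, x0] +
            fy       * fx       * img[y0 + 1, x0 + 1])

def klt_track(I, J, x0, y0, a=15, iters=30, tol=1e-4):
    # Steps (2.1)-(2.6): build Z over an a-by-a window around (x0, y0),
    # check its eigenvalues, then iterate d <- d + Z^{-1} e (Newton-Raphson)
    # until the update is below tol.
    h = a // 2
    gIy, gIx = np.gradient(I.astype(float))
    sl = np.s_[y0 - h:y0 + h + 1, x0 - h:x0 + h + 1]
    gx, gy, Iw = gIx[sl], gIy[sl], I[sl].astype(float)
    Z = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    lam = np.linalg.eigvalsh(Z)          # step (2.2): eigenvalues of Z
    ys, xs = np.mgrid[y0 - h:y0 + h + 1, x0 - h:x0 + h + 1].astype(float)
    d = np.zeros(2)                      # offset (dx, dy)
    for _ in range(iters):
        Jw = bilinear(J.astype(float), ys + d[1], xs + d[0])
        e = np.array([np.sum((Iw - Jw) * gx),    # residual of step (2.4)
                      np.sum((Iw - Jw) * gy)])
        step = np.linalg.solve(Z, e)     # step (2.6): solve Z d = e
        d += step
        if np.hypot(*step) < tol:
            break
    return d, lam
```

A simple check is to track a smooth blob shifted by a known sub-pixel amount between two synthetic images and compare the estimated offset d against the true shift.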
Step 3: obtain the target three-dimensional reconstruction result from the result of Step 2.
Existing three-dimensional reconstruction methods for ISAR images mainly include two: one is the structure-from-motion method SFM, which performs three-dimensional reconstruction from unordered image collections; the other is the factorization method OFM, which performs three-dimensional reconstruction from a sequence of two-dimensional ISAR images of the target. This embodiment adopts the factorization method OFM to realize the three-dimensional reconstruction of the target, as follows:
(3.1) Calculate the arithmetic mean bi of the N elements in the i-th row of the final feature point coordinate matrix W, i = 1, 2, ..., 6;
(3.2) Define the registration matrix W̃ of the feature point coordinates, whose element in the i-th row and j-th column is
W̃ij = Wij − bi
where Wij is the element in the i-th row and j-th column of the final feature point coordinate matrix W, j = 1, 2, ..., N;
(3.3) Perform singular value decomposition on the registration matrix W̃ of the feature point coordinates:
W̃ = O₁·Σ·O₂
where the left singular matrix O₁ is a 2n×N matrix; the singular value matrix Σ is an N×N matrix that is zero everywhere except on the main diagonal, whose elements are called the singular values; the right singular matrix O₂ is an N×N matrix; and O₁ and O₂ are unitary, i.e. O₁^T·O₁ = O₂^T·O₂ = O₂·O₂^T = I, where I is the N×N identity matrix;
(3.4) Define the first matrix O₁′ formed from all rows and the first 3 columns of the left singular matrix O₁, the second matrix Σ′ formed from the first 3 rows and 3 columns of the singular value matrix Σ, and the third matrix O₂′ formed from the first 3 rows and all columns of the right singular matrix O₂;
(3.5) Define the rotation matrix R̂ = O₁′·(Σ′)^(1/2) and the shape matrix Ŝ = (Σ′)^(1/2)·O₂′;
(3.6) Set a transformation matrix Q to obtain the following system of equations:
um·Q·Q^T·um^T = 1
vk·Q·Q^T·vk^T = 1
um·Q·Q^T·vk^T = 0
where um is the row vector of the m-th row of the rotation matrix R̂ and vk is the row vector of the k-th row of R̂, m = 1, 2, ..., n, k = n+1, n+2, ..., 2n;
(3.7) Solve the system of equations in (3.6) to obtain the transformation matrix Q, and compute its inverse Q⁻¹;
(3.8) From the results of (3.5) and (3.7), obtain the three-dimensional coordinate matrix
S = Q⁻¹·Ŝ
The three-dimensional coordinate matrix S gives the three-dimensional coordinates of the feature points x_{n−1}, i.e. the three-dimensional reconstruction result of the target.
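As an illustrative aside (not part of the patent text), steps (3.1)–(3.8) follow the classical Tomasi-Kanade orthographic factorization, and a compact NumPy sketch is given below. The metric step solves the linear system of (3.6) for the symmetric 3×3 matrix L = Q·Q^T, pairing each um with its same-frame v(n+m) (the standard choice), and recovers Q by eigendecomposition; these implementation choices, and the function name, are assumptions rather than the patent's exact procedure.

```python
import numpy as np

def factorize(W):
    # W: 2n-by-N measurement matrix; rows 1..n hold the range (x)
    # coordinates of the N feature points in each image, rows n+1..2n
    # the cross-range (y) coordinates, following steps (3.1)-(3.8).
    n = W.shape[0] // 2
    b = W.mean(axis=1, keepdims=True)                   # (3.1) row means b_i
    Wt = W - b                                          # (3.2) registration matrix
    O1, s, O2 = np.linalg.svd(Wt, full_matrices=False)  # (3.3) SVD
    O1p = O1[:, :3]                                     # (3.4) rank-3 truncation
    Sp = np.diag(s[:3])
    O2p = O2[:3, :]
    Rhat = O1p @ np.sqrt(Sp)                            # (3.5) rotation estimate
    Shat = np.sqrt(Sp) @ O2p                            #       shape estimate
    # (3.6)-(3.7): metric constraints, solved linearly for L = Q Q^T.
    def coef(u, v):
        # Coefficients of the 6 free entries of symmetric L in u L v^T.
        return [u[0]*v[0], u[0]*v[1] + u[1]*v[0], u[0]*v[2] + u[2]*v[0],
                u[1]*v[1], u[1]*v[2] + u[2]*v[1], u[2]*v[2]]
    A, rhs = [], []
    for m in range(n):
        u, v = Rhat[m], Rhat[n + m]
        A.append(coef(u, u)); rhs.append(1.0)           # u_m Q Q^T u_m^T = 1
        A.append(coef(v, v)); rhs.append(1.0)           # v_k Q Q^T v_k^T = 1
        A.append(coef(u, v)); rhs.append(0.0)           # u_m Q Q^T v_k^T = 0
    l = np.linalg.lstsq(np.asarray(A), np.asarray(rhs), rcond=None)[0]
    L = np.array([[l[0], l[1], l[2]],
                  [l[1], l[3], l[4]],
                  [l[2], l[4], l[5]]])
    w, V = np.linalg.eigh(L)
    Q = V @ np.diag(np.sqrt(np.maximum(w, 1e-12)))      # Q with Q Q^T = L
    R = Rhat @ Q                                        # metric rotation matrix
    S = np.linalg.inv(Q) @ Shat                         # (3.8) 3-D coordinates
    return R, S
```

On noise-free orthographic projections the recovered R·S reproduces the registered measurements exactly, and the rows of R come out with unit norm, which is a convenient self-check of the metric step.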
The effects of the present invention can be further demonstrated by the following simulation experiments.
(I) Simulation conditions:
the parameters of the statistical filtering DespecKS method adopted by the simulation of the invention are shown in table 1:
TABLE 1 statistical Filter DespecKS method of Main parameters
Figure BDA0001895078520000062
The parameters of the corner point tracking algorithm KLT adopted by the simulation of the invention are shown in a table 2:
TABLE 2 corner tracking Algorithm KLT principal parameters
Figure BDA0001895078520000063
The parameters of the factorization OFM used in the simulation of the present invention are shown in table 3:
TABLE 3 factorization of OFM major parameters
Figure BDA0001895078520000064
The 3 original ISAR images used in the simulation are shown in FIG. 2; they are taken from a sequence of space-shuttle ISAR images of the TIRA radar published by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR), Germany.
(II) Simulation content and results:
simulation 1: the 3 original ISAR images shown in fig. 2 are preprocessed by the statistical filtering desspecks method according to the simulation parameters in table 1, and the preprocessed 3 ISAR images are obtained, respectively, as shown in fig. 3.
Simulation 2: feature points are extracted from the 3 preprocessed ISAR images and tracked with the corner tracking algorithm KLT using the simulation parameters in Table 2, yielding the feature point coordinate matrix W; the extracted and tracked feature points are marked with dots on the ISAR images, as shown in FIG. 4.
As can be seen from FIG. 4, 150 feature points are extracted from the first image, 46 of them are successfully tracked into the second image, and 22 are finally tracked into the third image; the number of successfully tracked feature points decreases gradually.
Simulation 3: using the feature point coordinate matrix W obtained in Simulation 2 as input, the target three-dimensional reconstruction result is obtained with the factorization method OFM and the simulation parameters in Table 3, as shown in FIG. 5, where FIG. 5(a) is an oblique view, FIG. 5(b) a top view and FIG. 5(c) a side view of the target three-dimensional reconstruction result.
Comparing FIG. 5 with FIG. 6 shows that the result obtained by this embodiment essentially preserves the length-width-height proportions of the space shuttle, demonstrating a certain accuracy and reliability.
In order to further verify the accuracy and reliability of the target three-dimensional reconstruction result of the embodiment, the specific analysis is as follows:
two feature points extracted from the tail structure of the space shuttle in the third ISAR diagram in FIG. 4 are marked by circles, two feature points extracted from the wing structure of the space shuttle are marked by square frames, and two feature points extracted from the solar sailboard structure of the space shuttle are marked by triangular frames, so as to obtain FIG. 7 (a). Considering the symmetry of the existing CAD model of the space shuttle, the connecting lines of the characteristic points marked by the circle, the square frame and the triangular frame are parallel to each other;
in fig. 5 (b), the feature points marked with a circle, a square frame and a triangular frame in fig. 7 (a) are found, the feature points of the two tail structures marked with the circle are connected by a dotted line, the feature points of the two wing structures marked with the square frame are connected by a solid line, and the feature points of the two solar sailboards marked with the triangular frame are connected by a dot-dotted line, so that fig. 7 (b) is obtained. As can be seen from fig. 7 (b), the dotted line, the solid line and the dotted line connecting the feature points are almost parallel;
the rotation operation in three-dimensional space is performed on fig. 7 (b) to obtain a front view of the three-dimensional reconstruction result, as shown in fig. 7 (c), and as can be seen from fig. 7 (c), the dashed line connecting the two feature points of the tail structure is almost completely parallel to the solid line connecting the two feature points of the wing structure, and the two lines are not completely parallel to the dotted line connecting the two feature points of the solar panel, and they form a small angle of only 7 °.
The above analysis shows that the target three-dimensional reconstruction result of this embodiment is consistent with the features of the existing CAD model of the target and that the error is extremely small and within a reasonable range, demonstrating the effectiveness of the scheme of the invention.

Claims (3)

1. A target three-dimensional reconstruction method based on ISAR images is characterized by comprising the following steps:
(1) Preprocessing the n ISAR images by the statistical filtering DespecKS method to obtain n preprocessed ISAR images, where n ≥ 3:
(1a) Setting an evaluation window with the size of L multiplied by L for each pixel point P of each ISAR image;
(1b) Performing KS detection on all pixel points in the evaluation window to obtain an evaluation value mu corresponding to each pixel point;
(1c) Setting an evaluation threshold μ₀ and selecting the pixel points P′ in the window whose evaluation value μ is greater than μ₀, i.e. the points having the same statistical distribution as the pixel point P;
(1d) Taking the arithmetic mean of the evaluation values μ of the two types of pixel points P and P′ to obtain the mean value μ̄, and setting the pixel values of P and P′ to μ̄;
Setting the pixel values of all the other pixel points in the ISAR image to be 0 to obtain a preprocessed ISAR image;
(2) Extracting feature points from the n preprocessed ISAR images with the corner tracking algorithm KLT and tracking them to obtain the feature point coordinate matrix W;
(3) Taking the feature point coordinate matrix W as input, obtaining the target three-dimensional reconstruction result with the factorization method OFM.
2. The method of claim 1, wherein in (2) the corner tracking algorithm KLT is applied to extract and track feature points of the n preprocessed ISAR images, implemented as follows:
(2a) Obtain a 2×2 matrix Z from the first ISAR image I(x) and the second ISAR image J(x):
Z = ∫∫_W g(x)·g^T(x)·ω(x)dx
where g(x) = [∂I/∂x  ∂I/∂y]^T is the gradient vector, x = [x y]^T is the pixel coordinate, W is a window of size a×a, ω(x) is a weight function, T denotes matrix transposition, ∂/∂x denotes the partial derivative with respect to x, and ∂/∂y the partial derivative with respect to y;
(2b) Solve for the two eigenvalues λl (l = 1, 2) of the matrix Z;
(2c) Set an eigenvalue threshold λ₀ and select as feature points the pixel coordinates x₁ whose two eigenvalues λl both exceed λ₀;
(2d) From the selected feature points x₁, obtain the equation:
Z·d = e
where e = ∫∫_W [I(x₁)−J(x₁)]·g(x₁)·ω(x₁)dx denotes the residual, d = [Δx Δy]^T denotes the offset, Δx the offset abscissa and Δy the offset ordinate;
(2e) Solving the equation by adopting a Newton-Raphson iterative method to obtain an offset d;
(2f) Using the feature point coordinates x₁ and the offset d, track the feature points across the first two images to obtain the feature point coordinates W₁₂ of the first and second images, where W₁₂ is a 4×N₁₂ matrix and N₁₂ is the number of feature points tracked in the first two images;
(2g) Starting from the selected feature point coordinates x₁, repeat (2d)–(2e) for the second and third ISAR images to track the feature points x₂ of the second and third images, obtaining the feature point coordinates W₂₃ of the second and third images, where W₂₃ is a 4×N₂₃ matrix and N₂₃ is the number of feature points tracked in the second and third images, N₂₃ ≤ N₁₂;
(2h) Repeat (2d)–(2e) for the q-th and (q+1)-th ISAR images in turn to track the feature points x_q of the q-th and (q+1)-th images, obtaining the feature point coordinates Wq,q+1, a 4×Nq,q+1 matrix, where Nq,q+1 is the number of feature points tracked in the q-th and (q+1)-th images, q = 3, 4, ..., n−1;
(2i) From the feature point coordinates W₁₂, W₂₃ and Wq,q+1, select the feature points x_{n−1} successfully tracked through all n images to obtain the final feature point coordinate matrix W, where W is a 2n×N matrix and N is the number of finally tracked feature points.
3. The method of claim 1, wherein the factorization method OFM in (3) is used to obtain the target three-dimensional reconstruction result as follows:
(3a) Calculate the arithmetic mean bi of the N elements in the i-th row of the final feature point coordinate matrix W, i = 1, 2, ..., 6;
(3b) Define the registration matrix W̃ of the feature point coordinates, whose element in the i-th row and j-th column is
W̃ij = Wij − bi
where Wij is the element in the i-th row and j-th column of the final feature point coordinate matrix W, j = 1, 2, ..., N;
(3c) Perform singular value decomposition on the registration matrix W̃ of the feature point coordinates:
W̃ = O₁·Σ·O₂
where the left singular matrix O₁ is a 2n×N matrix; the singular value matrix Σ is an N×N matrix that is zero everywhere except on the main diagonal, whose elements are called the singular values; the right singular matrix O₂ is an N×N matrix; and O₁ and O₂ are unitary, i.e. O₁^T·O₁ = O₂^T·O₂ = O₂·O₂^T = I, where I is the N×N identity matrix;
(3d) Define the first matrix O₁′ formed from all rows and the first 3 columns of the left singular matrix O₁, the second matrix Σ′ formed from the first 3 rows and 3 columns of the singular value matrix Σ, and the third matrix O₂′ formed from the first 3 rows and all columns of the right singular matrix O₂;
(3e) Define the rotation matrix R̂ = O₁′·(Σ′)^(1/2) and the shape matrix Ŝ = (Σ′)^(1/2)·O₂′;
(3f) Set a transformation matrix Q to obtain the system of equations:
um·Q·Q^T·um^T = 1
vk·Q·Q^T·vk^T = 1
um·Q·Q^T·vk^T = 0
where um is the row vector of the m-th row of the rotation matrix R̂ and vk is the row vector of the k-th row of R̂, m = 1, 2, ..., n, k = n+1, n+2, ..., 2n;
(3g) Solve the system of equations to obtain the transformation matrix Q, and compute its inverse Q⁻¹;
(3h) From the results of (3e) and (3g), obtain the three-dimensional coordinate matrix
S = Q⁻¹·Ŝ
The three-dimensional coordinate matrix S gives the three-dimensional coordinates of the feature points x_{n−1}, i.e. the three-dimensional reconstruction result of the target.
CN201811488292.2A 2018-12-06 2018-12-06 Target three-dimensional reconstruction method based on ISAR image Active CN109598787B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811488292.2A CN109598787B (en) 2018-12-06 2018-12-06 Target three-dimensional reconstruction method based on ISAR image


Publications (2)

Publication Number Publication Date
CN109598787A 2019-04-09
CN109598787B 2023-04-07

Family

ID=65962201


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1780672A1 (en) * 2005-10-25 2007-05-02 Bracco Imaging, S.P.A. Method of registering images, algorithm for carrying out the method of registering images, a program for registering images using the said algorithm and a method of treating biomedical images to reduce imaging artefacts caused by object movement
CN102353945B (en) * 2011-03-31 2013-06-05 北京航空航天大学 Three-dimensional position reconstructing method based on ISAR (inverse synthetic aperture radar) image sequence for scattering point
CN108647580B (en) * 2018-04-18 2020-07-14 中国人民解放军国防科技大学 Improved SIFT-based ISAR image feature point extraction and matching method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant