CN110992399B - High-precision target atmosphere disturbance detection method - Google Patents

High-precision target atmosphere disturbance detection method

Info

Publication number
CN110992399B
Authority
CN
China
Prior art keywords
image
target
disturbance
follows
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911096413.3A
Other languages
Chinese (zh)
Other versions
CN110992399A (en)
Inventor
张月
王旭
张学敏
邬志强
赵号
邓红艳
张琢
苏云
郑国宪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Space Research Mechanical and Electricity
Original Assignee
Beijing Institute of Space Research Mechanical and Electricity
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Space Research Mechanical and Electricity filed Critical Beijing Institute of Space Research Mechanical and Electricity
Priority to CN201911096413.3A
Publication of CN110992399A
Application granted
Publication of CN110992399B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

A high-precision target atmospheric disturbance detection method mainly comprises the following steps: inputting images, performing whole-pixel cross search, performing Newton-Raphson sub-pixel positioning, setting boundary conditions, performing grid division, solving a Poisson equation, judging the application requirement, detecting the disturbance zone position, detecting the target position and inverting the flow field density. The invention adopts, for the first time, whole-pixel search based on cross search together with sub-pixel positioning based on Newton-Raphson iteration, thereby realizing high-precision target atmospheric disturbance detection; the disturbance detection precision can theoretically reach 1/50 pixel. The detection precision is not influenced by the target's flight speed, flight altitude, stealth performance and the like, so the method can in theory detect the atmospheric disturbance of any high-speed moving target in the atmosphere and visualize the atmospheric disturbance field. The invention can provide a visual means for detecting the aerodynamic characteristics of high-speed moving targets and can be applied to fields such as aerodynamic shape optimization and dynamic unfolding process optimization of high-speed moving targets; it requires no wind tunnel, can be used in the natural environment, reduces the difficulty of atmospheric disturbance detection and greatly reduces the cost.

Description

High-precision target atmosphere disturbance detection method
Technical Field
The invention relates to a high-precision target atmospheric disturbance detection method, and in particular to a method for performing high-precision atmospheric disturbance detection on ground-object images containing invisible atmospheric disturbance information, which can be used in fields such as moving target detection and moving target detail flow field display.
Background
Stealth moving targets such as stealth fighters and stealth bombers are high-threat targets with radar and infrared stealth capability; they are difficult to find by means such as ground-based radar and space-based infrared detection, so a detection method that is not affected by their stealth performance is needed. Such targets interact with the atmosphere during motion and form atmospheric disturbance, which is an inevitable product of flight in the atmosphere, is an inherent characteristic of the target, and is not influenced by the target's stealth performance. Therefore, detecting such targets through their atmospheric disturbance is highly feasible.
Return-type spacecraft, parachutes, civil airliners and the like are all fast-moving objects flying in the atmosphere, and flow-field display equipment is needed for optimizing the target's aerodynamic shape, optimizing the dynamic unfolding process, monitoring the motion state in real time, and so on. At present, the ground wind tunnel is the only equipment for displaying the target flow field, but it is limited by volume, load capacity and the like and cannot support flow-field analysis of full-size targets under all working conditions; a method capable of flow-field analysis for full-size targets under all working conditions is therefore needed. If the flight state of such targets can be monitored in real time and the flow-field information of the interaction between the target and the atmosphere can be displayed in real time, flow parameters can be obtained effectively, providing data support for the stable landing of return spacecraft, for optimizing the dynamic unfolding process of parachutes, and for optimizing the aerodynamic shape of high-speed and even supersonic civil airliners.
The visual optical display methods for the atmospheric disturbance of a moving object mainly include the shadowgraph method, the traditional schlieren method, interferometry and the like, and the optical measurement methods mainly include laser Doppler velocimetry, PIV particle velocimetry and the like; however, all of these methods require relatively complex active illumination and optical systems and are generally used only in a laboratory or a wind tunnel. Background-oriented schlieren imaging is a novel flow-field display method developed on the basis of traditional schlieren imaging and PIV particle velocimetry; it uses a random-pattern background and an imaging camera to display flow-field changes in the region in front of the background, so no complex active light source or optical system is needed, and it has been widely applied. Existing background-oriented schlieren research mainly performs flow-field visual inversion based on a horizontal view of the background and short-distance high-resolution images; there has been no research on flow-field visual inversion based on a vertical view, ground features as the background, and long-distance low-resolution images, nor on moving-object detection.
Disclosure of Invention
The technical problem solved by the invention is: overcoming the defects of the prior art, and providing a high-precision target atmospheric disturbance detection method that offers a feasible technical means for detecting and identifying high-speed moving targets, in particular high-speed stealth targets.
The technical scheme of the invention is as follows: a high-precision target atmospheric disturbance detection method comprises the following steps:
1) Inputting two images;
2) Performing whole pixel cross search on an input image;
3) Performing Newton-Raphson sub-pixel positioning;
4) Setting a boundary condition of a solving area;
5) Performing grid division;
6) Solving the refractive index of the disturbance field according to a Poisson equation;
7) Judging the application requirement, and performing target detection or detail flow field display according to the requirement;
8) Detecting the position of the disturbance zone to obtain the position of the disturbance zone;
9) Detecting the target position to obtain the target position;
10) Inverting the flow field density to obtain a flow field density inversion result.
The specific method of the step 1) is as follows: a ground-object image 1 with texture features, shot by one camera and not containing target atmospheric disturbance information, is input, and a ground-object image 2 with texture features, shot by another camera or several cameras and containing target atmospheric disturbance information, is input; the image 1 and the image 2 are ground-object images with texture features, such as city images, sea-surface images and desert images; the image 1 and the image 2 have the same resolution and contain the same ground-object information; the image 1 may be an image that does not contain target atmospheric disturbance information or an image that contains target atmospheric disturbance information.
The specific method of the step 2) is as follows: a whole-pixel cross search algorithm is adopted; a search window of N×M pixels is set in the image 1, the search window and an N×M-pixel window in the ground-object image 2 are substituted into a cross-correlation function for iterative solution, and the position of the N×M-pixel window in the image 2 is continuously changed until the extreme point of the cross-correlation function is found; at this time, the position of the N×M pixels in the image 2 is the whole-pixel cross search result.
The specific method of the step 3) is as follows: a new correlation function formula is established by the Newton-Raphson iteration method, and the N×M-pixel window in the image 2 is moved by sub-pixel amounts on the basis of the step 2) until the correlation function converges to an extremum, so that a more accurate relative displacement value between the image 1 and the image 2 is obtained.
The specific method of the step 4) is as follows: setting boundary conditions as Dirichlet conditions and generalized Neumann conditions;
the Dirichlet condition is:

h·u = r on the boundary ∂Ω

the generalized Neumann condition is:

n·(c∇u) + q·u = g on the boundary ∂Ω

where h, r, q and g are functions defined on the boundary ∂Ω, u and c are complex functions defined on the region Ω, and n is the unit normal vector on the boundary ∂Ω.
The specific method of the step 5) is as follows: triangulation is adopted to divide the image 1 and the image 2, and the computational grid of the target atmospheric disturbance zone is constructed.
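The patent does not name a particular meshing tool for this step; the following is a minimal sketch, assuming a Python/NumPy environment, of how such a computational grid could be built by Delaunay triangulation of a regular grid of correlation-window centres (the grid spacing and the use of scipy.spatial.Delaunay are illustrative assumptions, not details from the patent):

```python
# Illustrative sketch only: Delaunay triangulation of the displacement-sample points,
# one possible way to build the computational grid for the disturbance zone.
import numpy as np
from scipy.spatial import Delaunay

def build_disturbance_grid(height, width, step=16):
    """Triangulate the regular grid of correlation-window centres covering the image pair."""
    ys, xs = np.mgrid[step // 2:height:step, step // 2:width:step]
    points = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    tri = Delaunay(points)               # Delaunay mesh over the node positions
    return points, tri.simplices         # nodes and triangle connectivity

# Example: a 512 x 512 image pair sampled every 16 pixels
nodes, elements = build_disturbance_grid(512, 512)
```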
The specific method of the step 6) is as follows: according to the previous two steps, the node function value vectors are solved to obtain the solution of the Poisson equation;
the Poisson equation is:

∂²n/∂x² + ∂²n/∂y² = C·(∂Δx/∂x + ∂Δy/∂y)

where C is a constant, n is the refractive index of the atmospheric disturbance field, and Δx, Δy are the displacements of the solving area in the two directions obtained in the step 3).
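A minimal sketch, assuming the reconstructed equation above, of how its right-hand side can be evaluated from the displacement fields Δx and Δy produced in step 3); the constant C, the grid spacing and the use of finite differences are user-side choices rather than details given by the patent:

```python
# Sketch: right-hand side of the Poisson equation, C * (d(dx)/dx + d(dy)/dy),
# from the sub-pixel displacement fields of step 3).  C and the spacing are assumed inputs.
import numpy as np

def poisson_rhs(dx, dy, C=1.0, spacing=1.0):
    """Source term of the Poisson equation: C times the divergence of the displacement field."""
    ddx_dx = np.gradient(dx, spacing, axis=1)   # derivative of the x-displacement along x
    ddy_dy = np.gradient(dy, spacing, axis=0)   # derivative of the y-displacement along y
    return C * (ddx_dx + ddy_dy)
```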
The specific method of the step 7) is as follows: the disturbance zone position detection or the detail flow field detection is selected and executed according to the application requirement.
The specific method of the step 8) is as follows: a detection threshold is set according to the solution of the Poisson equation obtained in the step 6), namely the refractive index field, and the region whose refractive index is larger than the threshold is detected; that region is the position of the disturbance zone.
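A short sketch of this thresholding step, assuming the refractive-index field returned by the Poisson solve; the connected-component labelling and bounding boxes are implementation choices added for illustration, not steps stated in the patent:

```python
# Sketch: detect the disturbance zone as the connected regions whose refractive
# index exceeds the chosen threshold.  scipy.ndimage is an assumed implementation choice.
import numpy as np
from scipy import ndimage

def detect_disturbance_zone(n_field, threshold):
    """Boolean mask and bounding boxes of the regions with refractive index above the threshold."""
    mask = n_field > threshold
    labels, count = ndimage.label(mask)      # label each connected disturbance region
    boxes = ndimage.find_objects(labels)     # slice objects bounding each region
    return mask, boxes
```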
The specific method of the step 9) is as follows: according to the disturbance zone position obtained in the step 8), and because the atmospheric disturbance is an inherent characteristic of the moving target so that a strong disturbance zone and the moving target necessarily exist at the same time, the detected disturbance zone position is taken as the position of the moving target.
The specific method of the step 10) is as follows: according to the Poisson equation solution of the step 6), the flow field density is inverted by combining the Gladstone-Dale formula, which relates refractive index and density, to obtain the flow field density inversion result;

the Gladstone-Dale formula is:

n - 1 = K_G-D · ρ

where n is the refractive index of the gas, ρ is the gas density, and K_G-D = f(λ, T, P) is the Gladstone-Dale constant, whose value is available in the literature.
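A one-line sketch of this inversion under the Gladstone-Dale relation above; the default K_G-D used here (about 2.3×10⁻⁴ m³/kg for air at visible wavelengths) is an assumed literature value, not a figure given in the patent:

```python
# Sketch: flow-field density from the refractive-index field via n - 1 = K_GD * rho.
# The default K_GD is an assumed literature value for air in the visible band.
def invert_density(n_field, K_GD=2.3e-4):
    """Return the gas density field (kg/m^3) corresponding to the refractive-index field."""
    return (n_field - 1.0) / K_GD
```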
Compared with the prior art, the invention has the advantages that:
(1) At present, no effective means and method for detecting stealth moving targets exist, and radar and infrared detection performances are limited by radar and infrared stealth capabilities of stealth targets. The method for detecting the target by detecting the target atmospheric disturbance information contained in the ground object image is free from the influence of stealth performance, can find the stealth target position, and provides the position information of the stealth moving target in real time.
(2) At present, there is no means or method for displaying the full-size, all-working-condition flow fields of return-type spacecraft, parachutes, civil airliners and the like. By performing atmospheric disturbance detection on images that use ground features as the background, the invention can realize high-precision visual display of the target flow information; it can provide data support for the stable landing of return spacecraft, for optimizing the dynamic unfolding process of parachutes, and for optimizing the aerodynamic shape of high-speed and even supersonic civil airliners. In addition, the difficulty of acquiring the flow information can be reduced, and the cost greatly reduced.
(3) The invention adopts the method of 'whole pixel cross search + Newton-Raphson sub-pixel positioning' for the first time, realizes 1/50 pixel registration precision, and can realize high-precision detection of target atmosphere disturbance from a long-distance low-resolution image. The invention has the advantages that the atmospheric disturbance detection precision of the target is not influenced by the target flying speed, flying height, stealth performance and the like, and can theoretically realize the atmospheric disturbance detection of any high-speed moving target in the atmosphere and realize the visualization of an atmospheric disturbance field.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a schematic diagram of a full pixel cross search of the present invention;
FIG. 3 is a simulation of the disturbance detection values of the present invention.
Detailed Description
The working principle of the high-precision target atmospheric disturbance detection method is shown in FIG. 1; the method comprises inputting images, whole-pixel cross search, Newton-Raphson sub-pixel positioning, setting boundary conditions, grid division, solving the Poisson equation, judging the application requirement, detecting the disturbance zone position, detecting the target position and inverting the flow field density. The invention adopts, for the first time, whole-pixel search based on cross search together with sub-pixel positioning based on Newton-Raphson iteration, thereby realizing high-precision target atmospheric disturbance detection; the disturbance detection precision can theoretically reach 1/50 pixel. The detection precision is not influenced by the target's flight speed, flight altitude, stealth performance and the like, so the method can in theory detect the atmospheric disturbance of any high-speed moving target in the atmosphere and visualize the atmospheric disturbance field. The invention can provide a visual means for detecting the aerodynamic characteristics of high-speed moving targets and can be applied to fields such as aerodynamic shape optimization and dynamic unfolding process optimization of high-speed moving targets; it requires no wind tunnel, can be used in the natural environment, reduces the difficulty of atmospheric disturbance detection and greatly reduces the cost.
The input images of the high-precision target atmospheric disturbance detection method can be two frames or multiple frames; the range of ground features contained in the image 1 and the image 2 can be completely the same or mostly the same, but it must be ensured that the image portions containing the target atmospheric disturbance information are the same.
The whole-pixel cross search principle of the high-precision target atmospheric disturbance detection method is shown in FIG. 2. Based on the unimodal property of the correlation coefficient surface, the cross search algorithm turns the distribution of the correlation coefficient from a surface into curves with a single peak, reducing the search path from two dimensions to one dimension: the search process is decomposed into multiple steps, and each step searches along only one straight line. This processing allows the whole-pixel search to greatly reduce the computation time without degrading the computation accuracy. Point c is the initial search point and u₀ is the coordinate of the starting point along the u direction. If the correlation coefficient surface is cut by a section parallel to the v direction through this point, the curve where the section intersects the surface is a unimodal curve with vertex e₀, which is the extreme point of the correlation coefficient computed along that line. A new section is then made through e₀, perpendicular to the previous one and crossing the v axis at v₁; the vertex of the curve where this section intersects the correlation coefficient surface is e₁. The search is repeated several times until the peak eₙ of the unimodal curve coincides with the previous peak; that point is taken as the extreme point of the correlation coefficient in the plane, and the whole-pixel displacement is obtained by combining it with point c.
The correlation coefficient is:

C = Σ f(x, y)·g(x′, y′) / √( Σ f(x, y)² · Σ g(x′, y′)² )

where the sums run over x = 1, …, M and y = 1, …, N, M and N are the numbers of pixels of the solving window in the x and y directions, (x, y) is a pixel position in the image 1, (x′, y′) is the corresponding position of an arbitrary iteration window in the image 2, f(x, y) is the diagnostic window taken in the image 1 for obtaining the displacement, and g(x′, y′) is an arbitrary iteration window in the image 2.
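A minimal Python sketch of the whole-pixel cross search using the correlation coefficient reconstructed above; the window size, search range and sweep limit are illustrative parameters rather than values from the patent, and the search window is assumed to remain inside the image 2:

```python
# Sketch: whole-pixel cross search.  Alternating one-dimensional searches along u and v
# replace the full two-dimensional search, as described above.  Parameters are illustrative.
import numpy as np

def ncc(f, g):
    """Normalized cross-correlation coefficient of two equal-sized windows."""
    return np.sum(f * g) / np.sqrt(np.sum(f * f) * np.sum(g * g) + 1e-12)

def cross_search(img1, img2, top, left, win=32, search=10, sweeps=20):
    f = img1[top:top + win, left:left + win].astype(float)
    u = v = 0
    for _ in range(sweeps):
        # search along the u direction with v fixed
        u_new = max(range(-search, search + 1),
                    key=lambda du: ncc(f, img2[top + v:top + v + win,
                                               left + du:left + du + win].astype(float)))
        # search along the v direction with the new u fixed
        v_new = max(range(-search, search + 1),
                    key=lambda dv: ncc(f, img2[top + dv:top + dv + win,
                                               left + u_new:left + u_new + win].astype(float)))
        if (u_new, v_new) == (u, v):          # the peak no longer moves: whole-pixel result
            break
        u, v = u_new, v_new
    return u, v
```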
The Newton-Raphson sub-pixel positioning principle of the high-precision target atmospheric disturbance detection method is as follows. The Newton-Raphson method iterates around the correlation coefficient extremum by Newton's method and finally converges to the extremum of the correlation function. The method iteratively solves, based on a Taylor series expansion, a correlation function that evaluates the relationship between the two images. The correlation coefficient C should take an extreme value, i.e. the gradient of C tends to 0; therefore the first-order and second-order partial derivatives of the correlation function are computed, the Newton step is applied N times, and the sub-pixel positioning value is obtained.

The correlation function is:

C(P) = Σ f(x, y)·g(x + u, y + v) / √( Σ f(x, y)² · Σ g(x + u, y + v)² )

where P = (u, v) is the vector to be solved, u is the distance between x and x′, and v is the distance between y and y′.
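A compact sketch of the Newton-Raphson sub-pixel refinement around the whole-pixel result, assuming the correlation function C(P) written above with P = (u, v); the numerical derivatives, the bilinear interpolation via scipy.ndimage.map_coordinates and the convergence tolerance are illustrative choices rather than the patent's exact formulation:

```python
# Sketch: Newton-Raphson iteration on the correlation coefficient C(u, v).
# The gradient and Hessian are estimated by central differences and the Newton step
# drives the gradient toward zero, refining the whole-pixel displacement to sub-pixel level.
import numpy as np
from scipy.ndimage import map_coordinates

def newton_raphson_subpixel(f, img2, top, left, u0, v0, iters=10, h=0.01, tol=1e-4):
    win = f.shape[0]
    yy, xx = np.mgrid[0:win, 0:win].astype(float)

    def corr(u, v):
        # bilinearly interpolated window of image 2 shifted by (u, v)
        g = map_coordinates(img2.astype(float), [yy + top + v, xx + left + u], order=1)
        return np.sum(f * g) / np.sqrt(np.sum(f * f) * np.sum(g * g) + 1e-12)

    p = np.array([float(u0), float(v0)])
    for _ in range(iters):
        c0 = corr(p[0], p[1])
        grad = np.array([(corr(p[0] + h, p[1]) - corr(p[0] - h, p[1])) / (2 * h),
                         (corr(p[0], p[1] + h) - corr(p[0], p[1] - h)) / (2 * h)])
        d2uu = (corr(p[0] + h, p[1]) - 2 * c0 + corr(p[0] - h, p[1])) / h ** 2
        d2vv = (corr(p[0], p[1] + h) - 2 * c0 + corr(p[0], p[1] - h)) / h ** 2
        d2uv = (corr(p[0] + h, p[1] + h) - corr(p[0] + h, p[1] - h)
                - corr(p[0] - h, p[1] + h) + corr(p[0] - h, p[1] - h)) / (4 * h ** 2)
        hess = np.array([[d2uu, d2uv], [d2uv, d2vv]])
        step = np.linalg.solve(hess, grad)    # Newton step toward grad C = 0
        p = p - step
        if np.linalg.norm(step) < tol:        # converged to the sub-pixel displacement
            break
    return p[0], p[1]
```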
The Poisson equation and the Poisson equation solving steps of the high-precision target atmospheric disturbance detection method are as follows.

The Poisson equation is:

∂²n/∂x² + ∂²n/∂y² = C·(∂Δx/∂x + ∂Δy/∂y)

where C is a constant, n is the refractive index of the atmospheric disturbance field, and Δx, Δy are the displacements of the solving area in the two directions obtained in the step 3).
Poisson's equation belongs to the class of elliptic equations, whose general form is:

−∇·(c∇u) + a·u = f

where c, a, f and the function u to be solved are complex functions defined on the region Ω.

The Dirichlet condition is:

h·u = r on the boundary ∂Ω

The generalized Neumann condition is:

n·(c∇u) + q·u = g on the boundary ∂Ω

where h, r, q and g are functions defined on the boundary ∂Ω and n is the unit normal vector on the boundary ∂Ω. The finite element method is effectively a projection of the solution of the weak form of the differential equation onto a finite-dimensional space. Taking any test function v ∈ V, multiplying both sides of equation (4) by it and integrating over the region Ω:

∫_Ω ( −∇·(c∇u) + a·u )·v dΩ = ∫_Ω f·v dΩ

The solution of the elliptic equation is further described here using the generalized Neumann boundary condition; applying Green's formula and substituting equation (6):

∫_Ω ( c∇u·∇v + a·u·v ) dΩ − ∫_∂Ω ( g − q·u )·v ds = ∫_Ω f·v dΩ

Therefore the virtual work equation can be written as:

∫_Ω ( c∇u·∇v + a·u·v ) dΩ + ∫_∂Ω q·u·v ds = ∫_Ω f·v dΩ + ∫_∂Ω g·v ds

The solution of this equation is the solution of the elliptic equation, that is, the solution of the Poisson equation.
A disturbance detection numerical simulation result of the high-precision target atmospheric disturbance detection method is shown in FIG. 3. To verify the atmospheric disturbance detection precision of the method, a mathematically simulated speckle pattern whose displacement can be controlled accurately was established, and the speckle images before and after displacement were simulated by superposing the intensities of many random Gaussian spots. Random displacements of different values were set in each direction of the speckle pattern, and disturbance detection was carried out with the method described above, giving the calculated pixel displacements shown in Table 1, where u₀ and v₀ are the given displacements in the X and Y directions. The distributions of the results were fitted with a normal distribution N(μ, σ²). According to the calculation results, a speckle pattern with a standard Gaussian distribution can be obtained, and the disturbance detection precision can reach ±0.01 pixel (i.e. 1/50 pixel).
Table 1 Calculation results for different pixel displacements (unit: pixel)

u₀      v₀      u₀ calculated value (μ±3σ)      v₀ calculated value (μ±3σ)
0.01    0.01    0.01±1.2×10⁻⁵                   0.01±1.23×10⁻⁵
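A short sketch of the verification speckle images described above: a pattern built by superposing many random Gaussian spots, plus a copy shifted by a known sub-pixel displacement via a Fourier-domain shift; the image size, spot count, spot radius and the 0.01-pixel shift are illustrative parameters chosen to match the table entry, not values specified by the patent:

```python
# Sketch: synthetic speckle pattern (sum of random Gaussian spots) and a copy shifted
# by a known sub-pixel displacement, for checking the detection precision.
import numpy as np
from scipy.ndimage import fourier_shift

def make_speckle(size=256, n_spots=500, radius=2.0, seed=0):
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[0:size, 0:size].astype(float)
    img = np.zeros((size, size))
    for cx, cy, a in zip(rng.uniform(0, size, n_spots),
                         rng.uniform(0, size, n_spots),
                         rng.uniform(0.5, 1.0, n_spots)):
        img += a * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * radius ** 2))
    return img

reference = make_speckle()
# copy of the pattern displaced by a known 0.01-pixel shift in each direction
shifted = np.fft.ifftn(fourier_shift(np.fft.fftn(reference), (0.01, 0.01))).real
```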
The invention can be used for atmospheric disturbance detection of a single target, and can also be used for simultaneous disturbance detection of multiple targets.
What is not described in detail in this specification is well-known technology to those skilled in the art.

Claims (1)

1. A high-precision target atmospheric disturbance detection method is characterized by comprising the following steps:
1) Inputting two images;
2) Performing whole pixel cross search on an input image;
3) Performing Newton-Raphson sub-pixel positioning;
4) Setting a boundary condition of a solving area;
5) Performing grid division;
6) Solving the refractive index of the disturbance field according to a Poisson equation;
7) Judging the application requirement, and performing target detection or detail flow field display according to the requirement;
8) Detecting the position of the disturbance zone to obtain the position of the disturbance zone;
9) Detecting the target position to obtain the target position;
10) Inverting the flow field density to obtain a flow field density inversion result;
the specific method of the step 1) is as follows: inputting a ground object image 1 with texture features, which is shot by one camera and does not contain target atmosphere disturbance information, and inputting a ground object image 2 with texture features, which is shot by another camera or a plurality of cameras and contains target atmosphere disturbance information; the images 1 and 2 are ground feature images with texture characteristics, and comprise city images, sea surface images and desert images; the image 1 and the image 2 are images with the same resolution and contain the same ground object information; the image 1 is an image which does not contain target atmospheric disturbance information or an image which contains target atmospheric disturbance information;
the specific method of the step 2) is as follows: a whole-pixel cross search algorithm is adopted; a search window of N×M pixels is set in the image 1, the search window and an N×M-pixel window in the ground-object image 2 are substituted into a cross-correlation function for iterative solution, and the position of the N×M-pixel window in the image 2 is continuously changed until the extreme point of the cross-correlation function is found; at this time, the position of the N×M pixels in the image 2 is the whole-pixel cross search result;
the specific method of the step 3) is as follows: a new correlation function formula is established by adopting a Newton-Raphson iteration method, and the N×M-pixel window in the image 2 is moved by sub-pixel amounts on the basis of the step 2) until the correlation function converges to an extremum, so that a more accurate relative displacement value of the image 1 and the image 2 is obtained;
the specific method of the step 4) is as follows: setting boundary conditions as Dirichlet conditions and generalized Neumann conditions;
the Dirichlet condition is:

h·u = r on the boundary ∂Ω

the generalized Neumann condition is:

n·(c∇u) + q·u = g on the boundary ∂Ω

wherein h, r, q and g are functions defined on the boundary ∂Ω, u and c are complex functions defined on the region Ω, and n is the unit normal vector on the boundary ∂Ω;
the specific method of the step 5) is as follows: dividing the image 1 and the image 2 by adopting triangulation, and constructing a target atmosphere disturbance zone calculation grid;
the specific method of the step 6) is as follows: according to the previous two steps, the function value vector of each node is solved to obtain the solution of the Poisson equation;

the Poisson equation is:

∂²n/∂x² + ∂²n/∂y² = C·(∂Δx/∂x + ∂Δy/∂y)

wherein C is a constant, n is the refractive index of the atmospheric disturbance field, and Δx, Δy are the displacements of the solving area in the two directions obtained in the step 3);
the specific method of the step 7) is as follows: the disturbance zone position detection or the detail flow field detection is selected and executed according to the application requirement;
the specific method of the step 8) is as follows: a detection threshold is set according to the solution of the Poisson equation obtained in the step 6), namely the refractive index field, and a region whose refractive index is larger than the threshold is detected, wherein this region is the position of the disturbance zone;
the specific method of the step 9) is as follows: according to the disturbance zone position obtained in the step 8), and because the atmospheric disturbance is an inherent characteristic of the moving target so that a strong disturbance zone and the moving target necessarily exist at the same time, the detected disturbance zone position is taken as the position of the moving target;
the specific method of the step 10) is as follows: according to the Poisson equation solving result in the step 6), the flow field density is inverted by combining the Gladstone-Dale formula, which relates refractive index and density, to obtain a flow field density inversion result.
CN201911096413.3A 2019-11-11 2019-11-11 High-precision target atmosphere disturbance detection method Active CN110992399B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911096413.3A CN110992399B (en) 2019-11-11 2019-11-11 High-precision target atmosphere disturbance detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911096413.3A CN110992399B (en) 2019-11-11 2019-11-11 High-precision target atmosphere disturbance detection method

Publications (2)

Publication Number Publication Date
CN110992399A CN110992399A (en) 2020-04-10
CN110992399B true CN110992399B (en) 2023-06-06

Family

ID=70083679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911096413.3A Active CN110992399B (en) 2019-11-11 2019-11-11 High-precision target atmosphere disturbance detection method

Country Status (1)

Country Link
CN (1) CN110992399B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111929702B (en) * 2020-09-23 2020-12-25 中国人民解放军国防科技大学 Aerial target atmospheric disturbance variable-resolution detection method, storage medium and system
CN112685852B (en) * 2020-12-22 2021-12-17 中国船舶重工集团公司第七0三研究所 Load customization pneumatic optimization method for axial flow compressor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793924A (en) * 2014-01-28 2014-05-14 河海大学 Flow field image self-adaption motion vector estimating method based on FHT-CC

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2870821B1 (en) * 2004-05-25 2007-08-17 Airbus France Sas AIRCRAFT COMPRISING A DEVICE FOR DETECTION AND / OR MEASUREMENT OF ATMOSPHERIC DISTURBANCES
CN102721967A (en) * 2012-06-21 2012-10-10 中国人民解放军电子工程学院 Method for discovering target in air based on disturbance type of wind field
CN109996080B (en) * 2017-12-31 2023-01-06 华为技术有限公司 Image prediction method and device and coder-decoder
DE102018202223A1 (en) * 2018-02-14 2019-08-14 Robert Bosch Gmbh A method and apparatus for providing integrity information for checking atmospheric correction parameters for correcting atmospheric disturbances in a satellite navigation for a vehicle

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793924A (en) * 2014-01-28 2014-05-14 河海大学 Flow field image self-adaption motion vector estimating method based on FHT-CC

Also Published As

Publication number Publication date
CN110992399A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
CN109272537B (en) Panoramic point cloud registration method based on structured light
Opromolla et al. Uncooperative pose estimation with a LIDAR-based system
CN113066162B (en) Urban environment rapid modeling method for electromagnetic calculation
Heineck et al. Background-oriented schlieren imaging of supersonic aircraft in flight
CN110992399B (en) High-precision target atmosphere disturbance detection method
CN108645416B (en) Non-cooperative target relative navigation simulation verification method based on vision measurement system
Li et al. Full-field wing deformation measurement scheme for in-flight cantilever monoplane based on 3D digital image correlation
CN107782288A (en) The method of atmospheric perturbation formula optical monitoring aircraft based on background schlieren imaging
CN113761646B (en) Method for determining dynamic response of aircraft in mobile wind field environment
CN111125869A (en) Moving target atmospheric disturbance characteristic simulation method
Liu et al. High-precision pose measurement method in wind tunnels based on laser-aided vision technology
Gong et al. Horn–Schunck optical flow applied to deformation measurement of a birdlike airfoil
Liu et al. Numerical investigation of the error caused by the aero-optical environment around an in-flight wing in optically measuring the wing deformation
Cai et al. Dynamic illumination optical flow computing for sensing multiple mobile robots from a drone
CN105631100B (en) The fluid simulation method of the infrared Characteristics of Wake of water scene objects
CN104931970A (en) Three-dimensional cloud field generating method based on airborne weather radar simulation
McQueen et al. The double backward-facing step: interaction of multiple separated flow regions
Zohdi A voxel-based machine-learning framework for thermo-fluidic identification of unknown objects
Rabinovitch et al. Full-scale supersonic parachute shape reconstruction using three-dimensional stereo imagery
Khorrami et al. A comparative study of simulated and measured gear-flap flow interaction
CN116202487A (en) Real-time target attitude measurement method based on three-dimensional modeling
Deng et al. Entropy flow-aided navigation
JP2799048B2 (en) Display method of object trajectory in three-dimensional space
Yu et al. 3D imaging application in the studies of micro air vehicles
CN112507282A (en) Flow display method based on velocity gradient tensor characteristics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant