CN111553876B - Pneumatic optical sight error image processing method and system - Google Patents

Pneumatic optical sight error image processing method and system

Info

Publication number
CN111553876B
Authority
CN
China
Prior art keywords
image
light beam
flow field
coordinate
information
Prior art date
Legal status
Active
Application number
CN202010198002.1A
Other languages
Chinese (zh)
Other versions
CN111553876A (en)
Inventor
石伟龙
吴宇阳
甘才俊
Current Assignee
China Academy of Aerospace Aerodynamics CAAA
Original Assignee
China Academy of Aerospace Aerodynamics CAAA
Priority date
Filing date
Publication date
Application filed by China Academy of Aerospace Aerodynamics CAAA
Priority to CN202010198002.1A
Publication of CN111553876A
Application granted
Publication of CN111553876B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/0004 — Image analysis; inspection of images; industrial image inspection
    • G06T 5/30 — Image enhancement or restoration using local operators; erosion or dilatation, e.g. thinning
    • G06T 5/70 — Image enhancement or restoration; denoising; smoothing
    • G06T 7/13 — Image analysis; segmentation; edge detection
    • G06T 7/155 — Segmentation; edge detection involving morphological operators
    • G06T 7/66 — Analysis of geometric attributes of image moments or centre of gravity
    • Y02T 90/00 — Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a pneumatic optical line-of-sight error image processing method and system. Information extraction and statistical analysis are carried out on the original test image to obtain the key information for line-of-sight error evaluation.

Description

Pneumatic optical sight error image processing method and system
Technical Field
The invention belongs to the technical field of aerospace experiments, and particularly relates to a pneumatic optical sight error image processing method and system.
Background
With the development of the national defense industry in China, missile flight speeds keep increasing, and so do the requirements on guidance precision. When a high-speed missile flies in the atmosphere, however, a severe aerodynamic-heating, high-temperature environment forms around it. The guidance system is generally fitted with optical glass at the missile head so that the image of a target can be received; at the same time, jet flow is used for cooling, so a mixing-layer flow field forms between the cooling air flow and the external air flow. After passing through the aerothermal field and the mixing-layer flow field, the target light undergoes a severe aerodynamic optical effect and its propagation direction is deflected, so the detection system receives a deflected target image; the deviation between the target position detected by the detector and the actual target position is the line-of-sight error. The line-of-sight error can cause the missile to aim at the wrong target, degrades its detection precision, and may even cause the missile to miss the target.
To solve the problem of missile line-of-sight errors, a large number of ground tests are required; wind tunnel tests are generally used to simulate the flight state of the missile in the atmosphere and the process by which the detector receives the target light. However, the distortion of the light after passing through the test flow field is very complex, including direction deflection, energy attenuation, jitter, blurring, deformation, phase distortion and so on. How to select effective technical indexes to evaluate the degree of line-of-sight error in the test and to reflect the influence of the flow-field state on light transmission is a key problem of aerodynamic optical testing. Moreover, an effective method for processing line-of-sight error images is currently lacking: information extraction and statistical analysis cannot be carried out on the original test images, and the key information for evaluating the line-of-sight error cannot be obtained.
Disclosure of Invention
The technical problem solved by the invention is as follows: to overcome the defects of the prior art, a pneumatic optical line-of-sight error image processing method and system are provided, in which information extraction and statistical analysis are carried out on the original test image to obtain the key information for line-of-sight error evaluation.
The aim of the invention is realized by the following technical scheme. A method of aerodynamic optical line-of-sight error image processing comprises the following steps. Step one: an original image containing N light beam spots is obtained in a wind tunnel test, and the original image is divided into N sub-images, each containing one light beam spot, according to the positions of the beam spots in the original image. Step two: the sub-images containing the beam spots from step one are transformed to the frequency domain, the frequency-domain information is low-pass filtered to remove background noise, and filtered beam images are obtained. Step three: a morphological operation is performed on the filtered beam image of step two; a closing operation with a circular structural element of radius 2 pixels yields a beam image with smooth edges and a complete interior. Step four: information is extracted from the edge-smoothed, internally complete beam image of step three to obtain spot deformation information, direction deflection information and energy attenuation information.
In the above-mentioned aerodynamic optical line-of-sight error image processing method, in step three, performing the closing operation on the filtered beam image with a circular structural element of radius 2 pixels comprises the following steps:
(31) Dilate the filtered beam image f with the structural element b to obtain the dilated image f′, written f′ = f ⊕ b and defined as
(f ⊕ b)(x, y) = max{ f(x − x′, y − y′) + b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f(x, y) is taken to be −∞ outside the domain of f, (x, y) are the coordinates of the beam image f, and (x′, y′) are the coordinates of the structural element b;
(32) Erode the dilated image f′ with the structural element b to obtain the eroded image f″, written f″ = f′ ⊖ b and defined as
(f′ ⊖ b)(x, y) = min{ f′(x + x′, y + y′) − b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f′(x, y) is taken to be +∞ outside the domain of f′, (x, y) are the coordinates of the image f′, and (x′, y′) are the coordinates of the structural element b.
In the above pneumatic optical line-of-sight error image processing method, in step four, obtaining the spot deformation information includes the following steps:
First, edge detection is carried out on the edge-smoothed, internally complete beam image: the Canny edge detection operator is used for edge recognition, and local maxima of the beam-image gradient are found and defined as the edge of the beam image. The beam area is then calculated; it is defined as the sum of all pixels in the beam image, so accumulating all pixels inside the identified beam-image edge gives the beam area A, and the ratio AR of the image area to the reference image area A_ref without flow field is calculated as
AR = A / A_ref
where A is the beam area at a given moment and A_ref is the reference beam area without flow field; AR reflects the spot deformation information of the beam after passing through the test flow field.
In the above pneumatic optical line-of-sight error image processing method, in the fourth step, obtaining direction deviation information includes the following steps:
first, the centroid position of the beam needs to be determined, and the beam centroid is defined as follows:
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel point;
and then calculating centroid offset of the light beam after passing through the test flow field, and defining line-of-sight error BSE of the light beam:
BSE = (X − X_ref) / L
where X is the beam centroid position at a given moment, X_ref is the centroid position of the reference beam without flow field, and L is the distance from the test flow field to the receiving camera;
the line-of-sight error BSE reflects the direction deflection information of the light beam after passing through the test flow field.
In the above pneumatic optical line-of-sight error image processing method, in step four, obtaining the energy attenuation information includes the following steps:
First, gray-level statistics are carried out on the beam image and the average gray level B_m of all pixels inside the edge curve of the beam image is calculated; the average Strehl ratio SR_m is then defined as
SR_m = B_m / B_ref
where B_m is the average gray level of the beam image at a given moment and B_ref is the average gray level of the reference beam without flow field; SR_m reflects the energy attenuation of the beam after passing through the test flow field.
A pneumatic optical line-of-sight error image processing system, comprising: the first module is used for dividing an original image containing N light beam spots obtained in the wind tunnel test into N sub-images containing one light beam spot according to the positions of the light beam spots in the original image; the second module is used for carrying out frequency domain transformation on the sub-image containing the light beam spots in the first module, obtaining frequency domain information from the time domain to the frequency domain of the sub-image, carrying out low-pass filtering on the frequency domain information, filtering noise in the background, and obtaining a filtered light beam image; a third module, configured to perform morphological operation on the filtered beam image in the second module, and perform a closing operation on the filtered beam image by using a circular structural element with a radius of 2 pixels, so as to obtain a beam image with smooth edge and complete interior; and the fourth module is used for extracting information of the edge smooth and internal complete beam image in the third module to obtain spot deformation information, direction deflection information and energy attenuation information.
In the above pneumatic optical line-of-sight error image processing system, the closing operation of the filtered beam image by using a circular structural element with a radius of 2 pixels comprises the following steps:
(31) The filtered beam image f is dilated with the structural element b to obtain the dilated image f′, written f′ = f ⊕ b and defined as
(f ⊕ b)(x, y) = max{ f(x − x′, y − y′) + b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f(x, y) is taken to be −∞ outside the domain of f, (x, y) are the coordinates of the beam image f, and (x′, y′) are the coordinates of the structural element b;
(32) The dilated image f′ is eroded with the structural element b to obtain the eroded image f″, written f″ = f′ ⊖ b and defined as
(f′ ⊖ b)(x, y) = min{ f′(x + x′, y + y′) − b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f′(x, y) is taken to be +∞ outside the domain of f′, (x, y) are the coordinates of the image f′, and (x′, y′) are the coordinates of the structural element b.
In the pneumatic optical line-of-sight error image processing system, the method for obtaining the light spot deformation information comprises the following steps:
First, edge detection is carried out on the edge-smoothed, internally complete beam image: the Canny edge detection operator is used for edge recognition, and local maxima of the beam-image gradient are found and defined as the edge of the beam image. The beam area is then calculated; it is defined as the sum of all pixels in the beam image, so accumulating all pixels inside the identified beam-image edge gives the beam area A, and the ratio AR of the image area to the reference image area A_ref without flow field is calculated as
AR = A / A_ref
where A is the beam area at a given moment and A_ref is the reference beam area without flow field; AR reflects the spot deformation information of the beam after passing through the test flow field.
In the pneumatic optical sight line error image processing system, the method for obtaining the direction deflection information comprises the following steps:
first, the centroid position of the beam needs to be determined, and the beam centroid is defined as follows:
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel point;
and then calculating centroid offset of the light beam after passing through the test flow field, and defining line-of-sight error BSE of the light beam:
BSE = (X − X_ref) / L
where X is the beam centroid position at a given moment, X_ref is the centroid position of the reference beam without flow field, and L is the distance from the test flow field to the receiving camera;
the line-of-sight error BSE reflects the direction deflection information of the light beam after passing through the test flow field.
In the pneumatic optical line-of-sight error image processing system, the energy attenuation information is obtained as follows: first, gray-level statistics are carried out on the beam image and the average gray level B_m of all pixels inside the edge curve of the beam image is calculated; the average Strehl ratio SR_m is then defined as
SR_m = B_m / B_ref
where B_m is the average gray level of the beam image at a given moment and B_ref is the average gray level of the reference beam without flow field; SR_m reflects the energy attenuation of the beam after passing through the test flow field.
Compared with the prior art, the invention has the following beneficial effects:
according to the invention, a series of image processing and information extraction are carried out on a light beam image obtained by a pneumatic optical sight error test, three statistical parameters such as an image area ratio, a centroid deviation and an average Style ratio of the light beam are obtained, deformation, deflection and energy attenuation of the light beam after passing through a test flow field are reflected, the pneumatic optical sight error of the light beam after passing through the test flow field is evaluated, and the pneumatic optical effect of the test flow field is verified. The invention provides an effective evaluation method of aerodynamic optical line-of-sight errors, which overcomes the defects of unclear and incomplete evaluation indexes in the past aerodynamic optical research and is more beneficial to the research of aerodynamic optical line-of-sight error tests.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of a method for processing an aerodynamic optical line-of-sight error image provided by the invention;
FIG. 2 is an original image, provided by the invention, of the light beams after passing through the test flow field;
FIG. 3 is a sub-image of a single beam of light after image segmentation provided by the present invention;
FIG. 4 is a filtered beam image provided by the present invention;
FIG. 5 is a representation of a beam image after morphological operations according to the present invention;
FIG. 6 is an image of a light beam after edge detection according to the present invention;
FIG. 7 is a graph showing the variation of the line of sight error BSE during the test provided by the present invention;
FIG. 8 is a plot of area ratio AR over the course of a test provided by the present invention;
FIG. 9 is the variation curve of the average Strehl ratio SR_m during the test provided by the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 is a flowchart of a pneumatic optical line-of-sight error image processing method provided by the invention. As shown in fig. 1, the method comprises the steps of:
step one: the method comprises the steps that an original image (N is the number of beam spots set in a test) containing N beam spots is obtained in a wind tunnel test, the original image is divided into N sub-images containing one beam spot according to the positions of the beam spots in the original image, and the sub-images are processed respectively;
step two: carrying out frequency domain transformation on the sub-image containing the light beam spots, changing the time domain of the sub-image into the frequency domain, carrying out low-pass filtering on the frequency domain information, filtering noise in the background, and obtaining a filtered light beam image, wherein the image contour is clear at the moment, but the inside has break points caused by some noise;
step three: and performing morphological operation on the filtered beam image, performing closing operation on the image by adopting a circular structural element with the radius of 2, filling up break points in the image, smoothing discontinuous points of edges, and obtaining the beam image with smooth edges and complete interior.
Step four: the preprocessed image is obtained after the first step, the second step and the third step, information extraction is carried out on the image, and the extracted information mainly comprises three parts, namely light spot deformation, direction deflection and energy attenuation. The method comprises the steps of firstly carrying out edge detection on a light beam image, carrying out edge recognition on the image by adopting a Canny edge detection operator, searching for a local maximum value of the gradient of the image to be defined as the edge of the image, then calculating the area of the light beam, wherein the area of the light beam is defined as the sum of all pixels in the light beam image, accumulating all pixels in the recognized image edge to obtain the area A of the light beam, and calculating the reference image area A at the moment of image and no flow field ref Is represented by the ratio AR of (2),
AR=A/A ref
wherein A is the area of the light beam at a certain moment, A ref The AR is the reference beam area without the flow field, and reflects the spot deformation of the beam after passing through the test flow field.
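A possible implementation of the area-ratio computation is sketched here (illustrative only); the Canny thresholds are assumed values, and the beam region is taken as the filled interior of the largest detected contour.

```python
# Illustrative sketch: Canny edge detection, beam area A as the number of pixels
# inside the detected edge, and area ratio AR = A / A_ref. Thresholds are assumed.
import cv2
import numpy as np

def beam_area(closed_image, canny_lo=50, canny_hi=150):
    """Return the pixel count enclosed by the beam edge and the filled mask."""
    edges = cv2.Canny(closed_image, canny_lo, canny_hi)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    beam = max(contours, key=cv2.contourArea)            # largest contour = beam spot
    mask = np.zeros_like(closed_image)
    cv2.drawContours(mask, [beam], -1, 255, thickness=cv2.FILLED)
    return int(np.count_nonzero(mask)), mask

def area_ratio(closed_image, ref_area):
    area, _ = beam_area(closed_image)
    return area / ref_area                               # AR = A / A_ref
```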
For the direction deflection information, the centroid position of the light beam must first be determined; the beam centroid is defined as
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel. The centroid offset of the beam after passing through the test flow field is then calculated, and the line-of-sight error BSE of the beam is defined as
BSE = (X − X_ref) / L
where X is the beam centroid position at a given moment, X_ref is the centroid position of the reference beam without flow field, and L is the distance from the test flow field to the receiving camera. The line-of-sight error BSE reflects the directional deflection of the beam after passing through the test flow field; a sketch of this computation follows.
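A sketch of the gray-weighted centroid and the line-of-sight error BSE (illustrative only). It treats the centroid offset as a Euclidean distance in pixels; in practice the offset would be converted to a physical length with the detector pixel pitch before dividing by L.

```python
# Illustrative sketch: gray-weighted beam centroid X = (sum X_i*G_i)/(sum G_i) and
# line-of-sight error BSE = |X - X_ref| / L (small-angle approximation).
import numpy as np

def beam_centroid(image, mask=None):
    """Gray-weighted centroid of the beam, optionally restricted to a mask."""
    g = image.astype(np.float64)
    if mask is not None:
        g = g * (mask > 0)
    ys, xs = np.indices(g.shape)
    total = g.sum()
    return np.array([(xs * g).sum() / total, (ys * g).sum() / total])

def line_of_sight_error(image, ref_centroid, L, mask=None):
    """Centroid offset relative to the no-flow reference, divided by the
    flow-field-to-camera distance L."""
    offset = beam_centroid(image, mask) - np.asarray(ref_centroid, dtype=np.float64)
    return float(np.linalg.norm(offset)) / L
```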
For the energy attenuation information, gray-level statistics are first carried out on the beam image and the average gray level B_m of all pixels inside the edge curve of the beam image is calculated; the average Strehl ratio SR_m is defined as
SR_m = B_m / B_ref
where B_m is the average gray level of the beam image at a given moment and B_ref is the average gray level of the reference beam without flow field; SR_m reflects the energy attenuation of the beam after passing through the test flow field.
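In image terms the average Strehl ratio reduces to a ratio of mean gray levels inside the beam edge, as in this minimal sketch (illustrative; the beam mask is assumed to come from the edge-detection step).

```python
# Illustrative sketch: average Strehl ratio SR_m = B_m / B_ref, with B_m the mean
# gray level inside the beam edge and B_ref its no-flow reference value.
import numpy as np

def mean_gray(image, mask):
    """Average gray level of the pixels inside the beam edge (mask > 0)."""
    return float(np.mean(image[mask > 0]))

def strehl_ratio(image, mask, ref_mean_gray):
    return mean_gray(image, mask) / ref_mean_gray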
The method further comprises an evaluation step. Steps two to four describe the processing of a sub-image containing a single light beam; applying them to all N sub-images separated from the original image yields the statistical distribution of the aerodynamic optical line-of-sight error over different spatial positions at a given moment. During the test, original images at many different moments can be acquired (more than 100 images), and processing the original images at all moments gives the variation of the aerodynamic optical line-of-sight error of the flow field with time over the course of the test. Finally, comparing and analysing the aerodynamic optical line-of-sight error statistics of the test flow field under different test states reveals the mechanisms and laws of the aerodynamic optical line-of-sight error as a function of test state, providing data support and a theoretical basis for the design optimization of high-speed missile guidance and imaging systems. One way to organise these statistics is sketched below.
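The sketch below assumes the per-spot metrics of each frame have already been computed with routines like those above and stored as arrays of shape (frames, spots); it is illustrative only.

```python
# Illustrative sketch of the evaluation step: aggregate per-spot metrics over all
# recorded frames into time histories and summary statistics. `metrics` is assumed
# to map metric names ("BSE", "AR", "SR_m") to arrays of shape (T frames, N spots).
import numpy as np

def summarize(metrics):
    """Mean, standard deviation and RMS over time per spot, plus a spatial-mean history."""
    summary = {}
    for name, series in metrics.items():
        series = np.asarray(series, dtype=np.float64)        # shape (T, N)
        summary[name] = {
            "mean_over_time": series.mean(axis=0),           # one value per spot
            "std_over_time": series.std(axis=0),
            "rms_over_time": np.sqrt((series ** 2).mean(axis=0)),
            "spatial_mean_history": series.mean(axis=1),     # one value per frame
        }
    return summary

# Example with synthetic data: 120 frames, 20 beam spots
demo = {"BSE": 1e-4 * np.random.rand(120, 20), "AR": 1 + 0.05 * np.random.randn(120, 20)}
print(summarize(demo)["BSE"]["mean_over_time"].shape)        # (20,)
```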
Referring to fig. 1, the embodiment of the invention provides a pneumatic optical line-of-sight error image processing flow consisting of three main parts: image preprocessing, information extraction and data evaluation. First, an original beam image is obtained from the aerodynamic optical line-of-sight error test; it is an image of multiple parallel light beams that have passed through different regions of the test flow field, and each beam must be processed separately. The original image is segmented to obtain an image of each beam; after passing through the test flow field the beams exhibit deflection, deformation, energy attenuation and other phenomena, and each image is an approximately circular spot with irregular edges. The image is then transformed to the frequency domain and filtered to remove background noise. The resulting image has a clear contour but still contains some noise-induced break points, so a morphological operation is applied and the break points are filled by a closing operation. The result is a circular light spot with a clear contour and a continuous interior, which completes the image preprocessing flow. Information is then extracted from the preprocessed image: an edge-recognition algorithm extracts the edge contour, the number of pixels inside the contour, the centroid position and the average gray level are calculated, and comparison with the corresponding reference image gives the spot deformation, centroid offset and energy attenuation of the beam image after passing through the test flow field. Finally, statistical analysis of the information extracted from the beam images at different spatial positions and different moments of the test evaluates the aerodynamic optical line-of-sight error of the beam after passing through the test flow field and the degree to which the test flow field affects light transmission, providing a basis for missile detection and aiming under such conditions.
Fig. 2 is an original image obtained through the test flow field; it simultaneously captures 20 parallel light beams at different spatial positions. Because the deflection and distortion of the beams differ between regions of the test flow field, each beam must be processed separately.
Fig. 3 is a sub-image of a single light beam after image segmentation. After passing through the test flow field the beam exhibits deflection, deformation, energy attenuation and other phenomena, and the resulting beam image is an approximately circular spot with deformed, blurred edges and a deflected position.
Fig. 4 is the above beam image after filtering. The filtering removes stray light in the background and weak diffracted light around the spot, so the image contour becomes clear; however, the edge is still discontinuous and break points remain inside.
Fig. 5 is the above beam image after the morphological operation. The discontinuity of the image edge and the break points inside it would affect the accuracy of the subsequent information extraction, so the break points are filled by a closing operation, which can be understood as dilation of the image followed by erosion. The gray image f is dilated by the structural element b to obtain the dilated image f′, written f′ = f ⊕ b and defined as
(f ⊕ b)(x, y) = max{ f(x − x′, y − y′) + b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f(x, y) is taken to be −∞ outside the domain of f, (x, y) are the coordinates of the beam image f, and (x′, y′) are the coordinates of the structural element b.
The dilated image f′ is then eroded with the structural element b to obtain the eroded image f″, written f″ = f′ ⊖ b and defined as
(f′ ⊖ b)(x, y) = min{ f′(x + x′, y + y′) − b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f′(x, y) is taken to be +∞ outside the domain of f′, (x, y) are the coordinates of the image f′, and (x′, y′) are the coordinates of the structural element b.
Morphological closing operations tend to smooth the contours of objects, removing dark details smaller than structural elements, connecting narrow breaks and filling fine defects. The beam image was closed using a circular structuring element with a radius of 2 pixels, and the dark spots inside the image and the discontinuous spots at the edges were smoothed out.
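The dilation-then-erosion composition described above can be reproduced with grayscale morphology routines, as in the sketch below (illustrative only; `scipy.ndimage` handles the boundary conventions internally).

```python
# Illustrative sketch: grayscale closing as dilation followed by erosion with a
# flat, disk-shaped structuring element of radius 2 pixels.
import numpy as np
from scipy import ndimage

def disk_footprint(radius=2):
    """Boolean disk-shaped domain D_b of the structuring element b."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return (x ** 2 + y ** 2) <= radius ** 2

def grayscale_closing(f, radius=2):
    footprint = disk_footprint(radius)
    dilated = ndimage.grey_dilation(f, footprint=footprint)    # f' = f (+) b
    return ndimage.grey_erosion(dilated, footprint=footprint)  # f'' = f' (-) b
    # equivalently: ndimage.grey_closing(f, footprint=footprint)
```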
Fig. 6 is the above beam image after edge detection; edge recognition is performed with the Canny edge detection operator, which finds edges by looking for local maxima of the image gradient. Treating the image as a two-dimensional function f(x, y), its gradient is defined as the vector
∇f = [g_x, g_y]^T = [∂f/∂x, ∂f/∂y]^T
The magnitude of this vector is
mag(∇f) = (g_x^2 + g_y^2)^(1/2)
In actual computation this is usually simplified by omitting the square root,
mag(∇f) ≈ g_x^2 + g_y^2
or by taking absolute values,
mag(∇f) ≈ |g_x| + |g_y|
A basic property of the gradient vector is that it points in the direction of the maximum rate of change of f at the coordinates (x, y); the angle of that direction is
α(x, y) = arctan(g_y / g_x)
The image is first smoothed with a Gaussian filter of standard deviation σ to reduce noise interference, then the gradient magnitude and angle are calculated at each point; edge points are defined as points whose gradient magnitude is a local maximum along the gradient direction, and the edge of the beam image is marked with a line one pixel wide.
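The gradient quantities used by the Canny operator can be computed as in the following sketch (illustrative): Gaussian smoothing with standard deviation σ, Sobel-approximated partial derivatives, then the magnitude and angle defined above. The value of sigma is an assumption.

```python
# Illustrative sketch: Gaussian smoothing, Sobel gradient components g_x and g_y,
# and the gradient magnitude and angle used inside the Canny operator.
import cv2
import numpy as np

def gradient_magnitude_angle(image, sigma=1.4):
    smoothed = cv2.GaussianBlur(image.astype(np.float64), (0, 0), sigma)
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)   # df/dx
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)   # df/dy
    magnitude = np.hypot(gx, gy)                          # sqrt(gx^2 + gy^2)
    angle = np.arctan2(gy, gx)                            # direction of maximum change
    return magnitude, angle
```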
The area of the beam image is defined as the sum of all pixels of the image; accumulating all pixels inside the identified edge curve gives the image area A. The change of the beam area reflects the jitter of the image, and the ratio AR of the image area at a given moment to the reference image area A_ref without flow field is defined as
AR = A / A_ref
the energy of the light beam passing through the test flow field is attenuated, which is expressed as the attenuation of the image gray scale, and the average gray scale B of all pixels in the image edge curve is adopted m Average gray level B of reference image without flow field ref The comparison is defined as the average Style ratio SR m
SR m =B m /B ref
The beam is deformed and deflected after passing through the test flow field, and the centroid point of the beam is used to represent the position of the beam.
Defining the beam centroid as:
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel.
The line-of-sight error of the beam is defined as the difference between the image centroid position X and the centroid position X_ref of the reference image without flow field, divided by the distance L from the test flow field to the receiving camera:
BSE = (X − X_ref) / L
the aerodynamic optical line-of-sight error of the light beam after passing through the test flow field can be evaluated by carrying out statistical analysis on information extracted from the light beam images at different spatial positions and at different moments in the test, and the influence degree of the test flow field on light transmission can be evaluated. FIG. 7 shows the variation curve of the line of sight error BSE of a light beam passing through a test flow field during a test, FIG. 8 shows the variation curve of the area ratio AR of a light beam passing through a test flow field during a test, and FIG. 9 shows the average Style ratio SR of a light beam passing through a test flow field during a test m Is a change curve of (a). It can be seen that the three statistics are all obviously changed in the operation process of the test flow field, especially in the unsteady state of the starting process of the test flow field, the light beam change amplitude is very large, and the aerodynamic optical distortion is the most serious. The aerodynamic optical line-of-sight error statistics of the test flow fields in different states can be compared and analyzed, and a theoretical basis can be provided for design optimization of a high-speed missile guidance and imaging system.
The embodiment also provides a pneumatic optical line-of-sight error image processing system, which comprises: the first module is used for dividing an original image containing N light beam spots obtained in the wind tunnel test into N sub-images containing one light beam spot according to the positions of the light beam spots in the original image; the second module is used for carrying out frequency domain transformation on the sub-image containing the light beam spots in the first module, obtaining frequency domain information from the time domain to the frequency domain of the sub-image, carrying out low-pass filtering on the frequency domain information, filtering noise in the background, and obtaining a filtered light beam image; a third module, configured to perform morphological operation on the filtered beam image in the second module, and perform a closing operation on the filtered beam image by using a circular structural element with a radius of 2 pixels, so as to obtain a beam image with smooth edge and complete interior; and the fourth module is used for extracting information of the edge smooth and internal complete beam image in the third module to obtain spot deformation information, direction deflection information and energy attenuation information.
According to the invention, a series of image processing and information extraction steps are applied to the beam images obtained in an aerodynamic optical line-of-sight error test, yielding three statistical parameters of the light beam, namely the image area ratio, the centroid offset and the average Strehl ratio. These parameters reflect the deformation, deflection and energy attenuation of the beam after passing through the test flow field, allow the aerodynamic optical line-of-sight error of the beam to be evaluated, and verify the aerodynamic optical effect of the test flow field. The invention thus provides an effective evaluation method for aerodynamic optical line-of-sight errors, overcomes the unclear and incomplete evaluation indexes of past aerodynamic optical research, and benefits the study of aerodynamic optical line-of-sight error tests.
Although the present invention has been described in terms of the preferred embodiments, it is not intended to be limited to the embodiments, and any person skilled in the art can make any possible variations and modifications to the technical solution of the present invention by using the methods and technical matters disclosed above without departing from the spirit and scope of the present invention, so any simple modifications, equivalent variations and modifications to the embodiments described above according to the technical matters of the present invention are within the scope of the technical matters of the present invention.

Claims (2)

1. A method of processing an aerodynamic optical line-of-sight error image, the method comprising the steps of:
step one: the method comprises the steps that an original image containing N light beam spots is obtained in a wind tunnel test, and the original image is divided into N sub-images containing one light beam spot according to the positions of the light beam spots in the original image;
step two: carrying out frequency domain transformation on the sub-image containing the light beam spots in the first step, obtaining frequency domain information from time domain to frequency domain, carrying out low-pass filtering on the frequency domain information, filtering noise in the background, and obtaining a filtered light beam image;
step three: performing morphological operation on the light beam image filtered in the second step, and performing closing operation on the light beam image filtered by adopting a circular structural element with the radius of 2 pixels to obtain a light beam image with smooth edge and complete interior;
step four: carrying out information extraction on the smooth edge and the complete internal beam image in the third step to obtain spot deformation information, direction deflection information and energy attenuation information;
in the third step, the method for performing the closing operation on the filtered beam image by adopting a circular structural element with the radius of 2 pixels comprises the following steps:
(31) The filtered beam image f is dilated with the structural element b to obtain the dilated image f′, written f′ = f ⊕ b and defined as
(f ⊕ b)(x, y) = max{ f(x − x′, y − y′) + b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f(x, y) is taken to be −∞ outside the domain of f, (x, y) are the coordinates of the beam image f, and (x′, y′) are the coordinates of the structural element b;
(32) The dilated image f′ is eroded with the structural element b to obtain the eroded image f″, written f″ = f′ ⊖ b and defined as
(f′ ⊖ b)(x, y) = min{ f′(x + x′, y + y′) − b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f′(x, y) is taken to be +∞ outside the domain of f′, (x, y) are the coordinates of the image f′, and (x′, y′) are the coordinates of the structural element b;
in the fourth step, obtaining the spot deformation information includes the following steps:
firstly, edge detection is carried out on an edge-smoothed and internal complete beam image, an image is subjected to edge recognition by adopting a Canny edge detection operator, the local maximum value of the gradient of the beam image is searched and defined as the edge of the beam image, then the area of the beam is calculated, the area of the beam is defined as the sum of all pixels in the beam image, all pixels in the edge of the recognized beam image are accumulated to obtain the area A of the beam, and the reference image area A at the moment of image and no flow field is calculated ref Is represented by the ratio AR of (2),
AR=A/A ref
wherein A is the area of the beam at a certain time, A ref When the reference beam area is the reference beam area without the flow field, the AR reflects the spot deformation information of the beam after passing through the test flow field;
in the fourth step, obtaining the direction deviation information includes the steps of:
first, the centroid position of the beam needs to be determined, and the beam centroid is defined as follows:
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel point;
and then calculating centroid offset of the light beam after passing through the test flow field, and defining line-of-sight error BSE of the light beam:
BSE = (X − X_ref) / L
where X is the beam centroid position at a given moment, X_ref is the centroid position of the reference beam without flow field, and L is the distance from the test flow field to the receiving camera;
the line-of-sight error BSE reflects the direction deflection information of the light beam after passing through the test flow field;
in step four, obtaining energy attenuation information includes the steps of:
first, go throughPerforming gray level statistics on the light beam image, and calculating average gray level B of all pixels in the edge curve of the light beam image m Definition of the average Style ratio SR m
SR m =B m /B ref
In B of m B is the average gray level of the beam image at a certain moment ref Is the average gray scale of the reference beam in no flow field, SR m I.e. reflects the energy decay of the beam after passing through the test flow field.
2. A pneumatic optical line-of-sight error image processing system, comprising:
the first module is used for dividing an original image containing N light beam spots obtained in the wind tunnel test into N sub-images containing one light beam spot according to the positions of the light beam spots in the original image;
the second module is used for carrying out frequency domain transformation on the sub-image containing the light beam spots in the first module, obtaining frequency domain information from the time domain to the frequency domain of the sub-image, carrying out low-pass filtering on the frequency domain information, filtering noise in the background, and obtaining a filtered light beam image;
a third module, configured to perform morphological operation on the filtered beam image in the second module, and perform a closing operation on the filtered beam image by using a circular structural element with a radius of 2 pixels, so as to obtain a beam image with smooth edge and complete interior;
the fourth module is used for extracting information of the edge smooth and internal complete beam images in the third module to obtain spot deformation information, direction deflection information and energy attenuation information;
the closing operation of the filtered beam image with a circular structuring element of radius 2 pixels comprises the steps of:
(31) The filtered beam image f is dilated with the structural element b to obtain the dilated image f′, written f′ = f ⊕ b and defined as
(f ⊕ b)(x, y) = max{ f(x − x′, y − y′) + b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f(x, y) is taken to be −∞ outside the domain of f, (x, y) are the coordinates of the beam image f, and (x′, y′) are the coordinates of the structural element b;
(32) The dilated image f′ is eroded with the structural element b to obtain the eroded image f″, written f″ = f′ ⊖ b and defined as
(f′ ⊖ b)(x, y) = min{ f′(x + x′, y + y′) − b(x′, y′) | (x′, y′) ∈ D_b }
where D_b is the domain of b, f′(x, y) is taken to be +∞ outside the domain of f′, (x, y) are the coordinates of the image f′, and (x′, y′) are the coordinates of the structural element b;
the method for obtaining the light spot deformation information comprises the following steps:
firstly, edge detection is carried out on an edge-smoothed and internal complete beam image, an image is subjected to edge recognition by adopting a Canny edge detection operator, the local maximum value of the gradient of the beam image is searched and defined as the edge of the beam image, then the area of the beam is calculated, the area of the beam is defined as the sum of all pixels in the beam image, all pixels in the edge of the recognized beam image are accumulated to obtain the area A of the beam, and the reference image area A at the moment of image and no flow field is calculated ref Is represented by the ratio AR of (2),
AR=A/A ref
wherein A is the area of the beam at a certain time, A ref When the reference beam area is the reference beam area without the flow field, the AR reflects the spot deformation information of the beam after passing through the test flow field;
the method for obtaining the direction deviation information comprises the following steps:
first, the centroid position of the beam needs to be determined, and the beam centroid is defined as follows:
X = (∑ X_i G_i) / (∑ G_i)
where X_i are the coordinates of the image pixel points and G_i is the gray value of the pixel point;
and then calculating centroid offset of the light beam after passing through the test flow field, and defining line-of-sight error BSE of the light beam:
BSE = (X − X_ref) / L
where X is the beam centroid position at a given moment, X_ref is the centroid position of the reference beam without flow field, and L is the distance from the test flow field to the receiving camera;
the line-of-sight error BSE reflects the direction deflection information of the light beam after passing through the test flow field;
the energy attenuation information is obtained by the following steps:
firstly, calculating the average gray level B of all pixels in the edge curve of a light beam image by carrying out gray level statistics on the light beam image m Definition of the average Style ratio SR m
SR m =B m /B ref
In B of m B is the average gray level of the beam image at a certain moment ref Is the average gray scale of the reference beam in no flow field, SR m I.e. reflects the energy decay of the beam after passing through the test flow field.
CN202010198002.1A 2020-03-19 2020-03-19 Pneumatic optical sight error image processing method and system Active CN111553876B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010198002.1A CN111553876B (en) 2020-03-19 2020-03-19 Pneumatic optical sight error image processing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010198002.1A CN111553876B (en) 2020-03-19 2020-03-19 Pneumatic optical sight error image processing method and system

Publications (2)

Publication Number Publication Date
CN111553876A CN111553876A (en) 2020-08-18
CN111553876B true CN111553876B (en) 2023-11-10

Family

ID=72007240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010198002.1A Active CN111553876B (en) 2020-03-19 2020-03-19 Pneumatic optical sight error image processing method and system

Country Status (1)

Country Link
CN (1) CN111553876B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113034415B (en) * 2021-03-23 2021-09-14 哈尔滨市科佳通用机电股份有限公司 Method for amplifying small parts of railway locomotive image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101886972A (en) * 2010-04-09 2010-11-17 中国科学院上海技术物理研究所 System and method for testing influence of vacuum plume on infrared laser characteristics
CN108692820A (en) * 2018-05-23 2018-10-23 马晓燠 A kind of Wavefront measuring apparatus and method
CN110246115A (en) * 2019-04-23 2019-09-17 西安理工大学 A kind of detection method of far-field laser light spot image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL150243A (en) * 2002-06-16 2010-05-31 Rafael Advanced Defense Sys Method and system for evaluating optical disturbances occuring in a supersonic flow field

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101886972A (en) * 2010-04-09 2010-11-17 中国科学院上海技术物理研究所 System and method for testing influence of vacuum plume on infrared laser characteristics
CN108692820A (en) * 2018-05-23 2018-10-23 马晓燠 A kind of Wavefront measuring apparatus and method
CN110246115A (en) * 2019-04-23 2019-09-17 西安理工大学 A kind of detection method of far-field laser light spot image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
孙喜万 (Sun Xiwan) et al., "Review of research progress on aero-optical effects" (气动光学效应研究进展综述), Chinese Congress of Theoretical and Applied Mechanics (CCTAM 2019), 2019, sections 1.2.2–1.2.3. *

Also Published As

Publication number Publication date
CN111553876A (en) 2020-08-18

Similar Documents

Publication Publication Date Title
CN111626290B (en) Infrared ship target detection and identification method under complex sea surface environment
CN108510467B (en) SAR image target identification method based on depth deformable convolution neural network
CN104981105B (en) A kind of quickly accurate detection and method for correcting error for obtaining element central and deflection angle
CN106530347B (en) Stable high-performance circle feature detection method
CN110728668B (en) Airspace high-pass filter for maintaining small target form
CN107504966A (en) There is the method that nautical star asterism extracts under cloud environment in a kind of daytime
CN112683228A (en) Monocular camera ranging method and device
CN115655263A (en) Star extraction method based on attitude information
CN114972083A (en) Image restoration method based on measured data under complex optical imaging condition
CN111553876B (en) Pneumatic optical sight error image processing method and system
CN107479037B (en) PD radar clutter area distinguishing method
Zhang et al. Nearshore vessel detection based on Scene-mask R-CNN in remote sensing image
CN115100616A (en) Point cloud target detection method and device, electronic equipment and storage medium
CN110705553A (en) Scratch detection method suitable for vehicle distant view image
CN113223074A (en) Underwater laser stripe center extraction method
CN109325958A (en) A kind of offshore ship detection method for refining and improve generalised Hough transform based on profile
CN116740572A (en) Marine vessel target detection method and system based on improved YOLOX
CN112330669B (en) Star point position positioning method of star sensor based on point light source diffraction starburst phenomenon
Yang et al. Method for building recognition from FLIR images
CN115641300A (en) Data processing method for measuring flow field speed based on mark line characteristics
CN111932635B (en) Image calibration method adopting combination of two-dimensional and three-dimensional vision processing
CN111473944B (en) PIV data correction method and device for observing complex wall surface in flow field
CN114596271A (en) Method for extracting corrosion characteristics of high-light-reflection surface
Yao et al. Real-time multiple moving targets detection from airborne IR imagery by dynamic Gabor filter and dynamic Gaussian detector
CN115096196B (en) Visual height and speed measuring method and system for rocket recovery and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant