CN111951375A - Method for extracting flow particle image in curved surface visualization model


Info

Publication number: CN111951375A (granted as CN111951375B)
Authority: CN (China)
Application number: CN202010688286.2A
Other languages: Chinese (zh)
Legal status: Granted; currently Active
Inventors: 王宏伟, 袁明磊, 李晓辉, 于靖波, 王旭东, 黄湛, 秦永明
Original and current assignee: China Academy of Aerospace Aerodynamics CAAA

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F30/00 Computer-aided design [CAD] > G06F30/20 Design optimisation, verification or simulation > G06F30/25 Design optimisation, verification or simulation using particle-based methods
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T15/00 3D [Three Dimensional] image rendering
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS > Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE > Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION > Y02T90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

A method for extracting a flow particle image in a curved surface visualization model obtains clear particle images with controllable distortion through a three-dimensional curved-surface visualization window, enabling subsequent velocity field measurement; it belongs to the technical field of aerospace particle image velocimetry. The invention comprises the following steps: determining the optical layout for flow field shooting inside the visualization model and simulating the optical path difference of each shooting point; arranging a plano-convex compensating mirror outside the visualization model to form a compensating optical system that compensates the optical path difference of each shooting point, so that the optical paths from every point illuminated by the laser sheet light to its image point on the camera shooting chip are consistent; photographing a standard calibration plate image in the plane with the compensating optical system to establish the mapping relation between the distorted image and the theoretical undistorted image; and photographing a particle image in the test state and correcting it with the mapping relation obtained by calibration.

Description

Method for extracting flow particle image in curved surface visualization model
Technical Field
The invention relates to a method for extracting a flow particle image in a curved surface visualization model, which can obtain clear particle images with controllable distortion through a three-dimensional curved-surface visualization window for velocity field measurement, and belongs to the technical field of aerospace particle image velocimetry.
Background
In the design of a new generation of aircraft, to improve stealth performance the engine must be arranged inside and embedded into the fuselage, which requires the engine air inlet to be designed in an S shape. The S-shaped inlet duct not only reduces the radar cross section of the propulsion system; because the whole propulsion system is housed in the fuselage, it also greatly reduces the drag of the whole aircraft and helps improve its flight performance. However, owing to the shape constraints of the S-shaped inlet and its intake characteristics, the airflow separates easily inside it, which increases the circumferential distortion index at the inlet outlet and reduces the total pressure recovery coefficient.
The traditional measuring method generally uses embedded sensors to measure the internal pressure characteristics of the engine to predict flow separation; because the number of measurement points cannot be large, the accuracy of locating the separation is low, spatial flow field characteristics are not captured, the result is not intuitive, and the amount of information obtained is small. With the development of non-contact surface measurement techniques and the establishment of model visualization technology, non-contact, global, dynamic display and measurement of the flow field inside the annular inlet duct has become possible. Through developments in recent years, particle image velocimetry (PIV) can obtain a two-dimensional velocity field on a two-dimensional section, the three-component velocity field of a two-dimensional section (stereo PIV, SPIV), the three-dimensional velocity field in a three-dimensional volume (tomographic PIV, TomoPIV), and even time-resolved velocity fields (TRPIV). It yields more intuitive and richer flow information, captures the dynamic characteristics of the flow field, and provides a powerful tool for predicting the unsteady flow field characteristics of an inlet duct and for implementing and comparatively analyzing flow control measures.
In wind tunnel tests of practical interest, to better reproduce the real aircraft inlet profile, the visualization window is an irregular three-dimensional curved surface. Observing the internal flow field structure from outside then introduces optical path differences and image distortion: the resulting images are difficult to identify and ill-suited to cross-correlation computation of particle images. A suitable optical design method must therefore be introduced to make the imaging clear, and a calibration and correction method must be adopted to restore the true flow field image.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the deficiencies of the prior art, the invention provides a method for extracting flow particle images in a curved surface visualization model, solving the problem that clear, undistorted particle images cannot be obtained through existing curved surface visualization models because of inadequate optical conditions.
The technical solution of the invention is as follows: a method for extracting a flow particle image in a curved surface visualization model comprises the following steps:
determining the optical layout of the flow field shooting in the visual model, and simulating the optical path difference of each shooting point;
arranging a plano-convex compensation mirror outside the visual model to form a compensation optical system for compensating the optical path difference of each shooting point so that the optical paths from each point illuminated by the laser sheet light to an image point on a camera shooting chip are consistent;
shooting a standard calibration plate image in a plane by using a compensating optical system to establish a mapping relation between a distorted image and a theoretical undistorted image;
and shooting a particle image in a test state, and correcting the particle image by using a mapping relation obtained by calibration.
Furthermore, the circle center of the plano-convex compensating mirror is located at a distance d2 from the visualization window of the visualization model and at a distance b from the virtual image convergence point B, and the deflection angle between the compensated ray and the optical axis through B is β, where b = d - d2 + (l'·d + l'·r2)/(l - l') and β = arctan((l - l')/(d + r2)); d is the distance along the optical axis from the outer profile to the camera chip, l' is the distance on the camera imaging chip from the image of the point to be compensated to the image of the origin, r2 is the outer-surface radius of the visualization window, and l is the distance from the point to be compensated to the origin in the plane being photographed.
Further, the method for establishing the mapping relationship between the distorted image and the theoretical undistorted image comprises the following steps:
solving the image XY coordinate correction weighting matrices Wx and Wy;
computing, through

x' = Σj Wxj·G(x - Xj, y - Yj), y' = Σj Wyj·G(x - Xj, y - Yj), j summed from 1 to N,

the coordinate values on the pre-correction image of the point corresponding to each pixel (x, y) on the corrected image, to complete the mapping between the distorted image and the theoretical undistorted image; wherein (x', y') are the coordinates of the point on the pre-correction calibration plate image corresponding to the point on the standard calibration plate image, Xj, Yj are the center coordinate values of the circular-spot array used for calibration on the standard calibration plate image, N is the number of circular-spot calibration marks, and G is the mapping function.
Further, solving the image XY coordinate correction weighting matrices Wx and Wy comprises the following steps:
respectively using the identified circular-spot array center values (X'i, Y'i) of the photographed calibration plate image and the circular-spot array center values (Xj, Yj) of the standard calibration plate to establish the relations:

X'i = Σj Wxj·G(Xi - Xj, Yi - Yj)
Y'i = Σj Wyj·G(Xi - Xj, Yi - Yj)

wherein Wxj is the weighting coefficient of the standard-plate spot center (Xj, Yj) for the X'i component of the pre-correction image spot center (X'i, Y'i), and Wyj is the corresponding weighting coefficient for the Y'i component;
letting Gi,j = G(Xi - Xj, Yi - Yj) and substituting all the calibration point values yields the N×N equation systems:

X'I = GI·Wx, Y'I = GI·Wy

wherein X'I = (X'1, X'2, …, X'N)^T, Y'I = (Y'1, Y'2, …, Y'N)^T, GI = [Gi,j] is the N×N matrix, Wx = (Wx1, Wx2, …, WxN)^T, Wy = (Wy1, Wy2, …, WyN)^T;
solving Wx = GI^(-1)·X'I and Wy = GI^(-1)·Y'I gives the image XY coordinate correction weighting matrices Wx and Wy.
Further, the mapping function G is G2(x) = |x|^2 (ln|x| - 1); wherein x is the parameter vector.
A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, performs the steps of the method for extracting an image of flow particles in a curved surface visualization model.
An apparatus for flow particle image extraction in a surface visualization model, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that: and when the processor executes the computer program, the steps of the method for extracting the flow particle image in the curved surface visualization model are realized.
Compared with the prior art, the invention has the advantages that:
(1) The PIV system uses a laser light source with excellent monochromaticity and almost no chromatic dispersion, so the optical path compensation design balances the optical paths of the rays entering the camera from every position and makes them as consistent as possible; all tracer particles in the light-sheet shooting plane can then be imaged clearly, and chromatic aberration correction need not be considered. Based on this principle, where the outer shape of the model visualization window may be suitably modified, the curved glass of the visualization model can be designed with a self-compensating thickness variation, which increases the margin of the shooting layout;
(2) The distorted-image correction part uses a two-dimensional biharmonic spline interpolation function based on the Green function as the mapping function to correct the nonlinear distorted images produced by the three-dimensional curved-surface visualization model. It has the minimum-curvature property, so the correction is smooth, not prone to oscillation, and preserves both local and overall correction accuracy. When the image gray levels are sampled, bicubic interpolation is used for the gray-level interpolation, ensuring interpolation accuracy and the quality of the image gray-level resampling reconstruction;
drawings
Fig. 1 is a schematic diagram of optical path calculation of an approximate spherical visible window, where 1 is a laser sheet, 2 is a sheet light incident window, 3 is a metal model, 4 is a visible optical window, 5 is a camera chip, 6 is a lens depth of field, 7 is an uncompensated focus plane, 8 is an optical path compensating mirror, and 9 is a compensated focus plane;
FIG. 2 is a schematic diagram of optical path compensation of an approximate spherical viewing window;
FIG. 3 is an image of a calibration plate prior to correction under typical distortion conditions;
FIG. 4 is a calibrated plate image after correction under typical distortion conditions.
Detailed Description
The method for extracting a flow particle image in a curved surface visualization model provided by the embodiments of the present application is described in further detail below with reference to the drawings in the specification; a specific implementation may comprise the following steps (the method flow is shown in fig. 1):
step 1: the basic optical layout of the internal flow field shooting of the visual model is determined, and the optical path difference of each shooting point is simulated.
Specifically, in the solution provided in the embodiment of the present application, in the unmodified condition the visualization window is generally designed with equal thickness and can be machined to approximate a spherical surface, as shown in fig. 1. Let point O be the sphere center, r1 the radius of the model's inner profile, r2 the radius of the outer profile, d the distance along the optical axis from the outer profile to the camera chip, h the depth of field of the camera shooting system, and n the refractive index of the window material. When the incident laser sheet light 1 passes through point O, the optical path from point O to its image point O' is:
s_OO' = d + r1 + n·(r2 - r1)
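As a quick numerical check, the axial optical path above can be evaluated directly. The sketch below uses illustrative values; the stand-off distance, window radii, and refractive index are assumptions, not figures from the patent.

```python
def axial_optical_path(d, r1, r2, n):
    """Optical path from the sheet-light point O to its image O' along the
    optical axis: outside air gap d, inside air path r1, and the glass wall
    of thickness (r2 - r1) weighted by the refractive index n."""
    return d + r1 + n * (r2 - r1)

# Illustrative values: 500 mm stand-off, 100 mm inner radius,
# 110 mm outer radius, BK7-like glass with n = 1.52.
print(axial_optical_path(500.0, 100.0, 110.0, 1.52))  # 615.2
```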
Point A is a point in the light-sheet plane at distance l from point O, point A' is the image of A on the camera chip at distance l' from O', points A1 and A2 are respectively the entry and exit points at the visualization window of the ray from A to A', and θ_Ain and θ_Aout are respectively the angles of A1 and A2, seen from the circle center, relative to the optical axis. The optical path from A to A' can then be expressed as:
s_AA' = |AA1| + n·|A1A2| + |A2A'|

where the three segment lengths, and the formulas for the incidence and exit angles θ_Ain and θ_Aout, follow from the spherical window geometry (the explicit component expressions and angle formulas appear as equation images in the original).
It can be seen that s_OO' is the limit of s_AA' as θ_Ain = θ_Aout → 0; in this case the focal plane of the image on the camera chip lies in an approximately arc-shaped region that deviates from the light-sheet plane (uncompensated focus plane 7 in fig. 1).
Step 2: and (5) introducing an optical path compensation design and determining an optical path compensation amount.
Specifically, in the solution provided in this embodiment of the present application, take a typical optical path compensation system as an example, shown in fig. 2: a plano-convex compensating mirror with fan angle α and radius r_b is added, with its circle center located at a distance d2 from the visualization window and a distance b from the virtual image convergence point B; the deflection angle between the compensated ray and the optical axis through B is β. The compensation system parameters can be approximately determined from the uncompensated parameters as
b≈d-d2+(l'd+l'r2)/(l-l')
β≈arctan((l-l')/(d+r2))
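These two approximations can be evaluated directly. A minimal sketch, assuming the symbol meanings given above; all numerical inputs below are illustrative, not values from the patent.

```python
import math

def compensator_geometry(d, d2, l, l_prime, r2):
    """Approximate placement of the plano-convex compensating mirror:
    b is the distance from the mirror's circle center to the virtual image
    convergence point B, and beta is the deflection angle of the compensated
    ray relative to the optical axis through B.
    d: outer profile to camera chip distance on the optical axis,
    d2: mirror circle center to visualization window distance,
    l: distance of the point to be compensated from the origin in the
       shooting plane, l_prime (l'): its image-side counterpart on the chip,
    r2: outer-surface radius of the window."""
    b = d - d2 + (l_prime * d + l_prime * r2) / (l - l_prime)
    beta = math.atan((l - l_prime) / (d + r2))
    return b, beta

# Illustrative inputs (mm): d = 500, d2 = 50, l = 40, l' = 4, r2 = 110.
b, beta = compensator_geometry(500.0, 50.0, 40.0, 4.0, 110.0)
```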
The optical path from point A3 to point A4 in the compensation system, s_A3A4, can then be estimated, and from it the compensated total optical path (both expressions appear as equation images in the original). As the photographed point A moves toward point O, s_A3A4 increases continuously and reaches its maximum at O. The optical path compensation design makes the optical paths from every point illuminated by the laser sheet light to its image point on the camera shooting chip consistent, and controls the optical path difference within the depth-of-field range of the camera; the quantitative condition also appears as an equation image in the original.
Since the light scattered by the particles from the laser sheet is weak, in most cases the aperture must be opened up to increase the amount of light during actual shooting, so the depth of field of the camera shooting system cannot be large. A suitable profile radius and half-fan angle of the compensating mirror must therefore be selected to satisfy the compensation condition, so that the resulting focus plane lies within the camera's depth of field and close to the laser-sheet illumination region (compensated focus plane 9 in fig. 2). In particular, if the profile of the model visualization window can be suitably modified, the curved glass of the visualization model can be designed with a self-compensating thickness variation; the basic compensation principle is the same as adding the compensating mirror in this embodiment, and optical design software can assist the calculation for more complex profile designs.
Step 3: using the compensated optical system, photograph a standard calibration plate placed in the plane illuminated by the sheet light, and establish the mapping relation between the distorted image and the theoretical undistorted image.
Specifically, in the solution provided in the embodiment of the present application, correction of the particle image relies on calibration plate data: a correction relationship is established between the circular-spot center points of the calibration plate image photographed by the camera and those of the standard calibration plate, and this relationship converts the distorted image into an undistorted image with a front-view effect, as shown in fig. 3 and fig. 4.
Define the coordinates of a point on the image taken by the camera before correction as (x', y') (unit: pixel), the coordinates of a point on the corrected image as (x, y) (unit: pixel), and the corresponding real-space coordinates as (X, Y). The following mapping relations exist between them:

(x', y') = F(X, Y) (Eq-1)
(x, y) = G(x', y') (Eq-2)

By definition, the mapping F represents the process of the camera capturing an image, while the mapping G represents the geometric correction of the captured distorted image. Theoretically, the mapping function G is assumed to exist such that the target image after distortion correction is identical to a front-view image at some magnification M, the coordinates of a point on the target image being (M·X, M·Y); then

G(F(X, Y)) = (M·X, M·Y) (Eq-3)

from which the accuracy of the correction mapping G can be defined:

Δx = |G(x', y') - (M·X, M·Y)| (Eq-4)

The higher the precision of G, the smaller Δx, and the closer the corrected coordinates G(x', y') are to the ideal front-view coordinates (M·X, M·Y).
In a possible implementation manner, the image correction part adopts a two-dimensional double harmonic spline interpolation function w based on a green function as a mapping function G, and for a two-dimensional space, a specific expression of a double harmonic green function Gm is
G2=|x|2(ln|x|-1) (Eq-5)
The spline interpolation function w has the characteristic of minimum curvature, so that a curved surface generated by the spline interpolation function w is smooth and is not easy to generate oscillation, is not easy to be influenced by local errors, and is beneficial to interpolating irregular space data points.
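A minimal sketch of this Green function; taking G2(0) = 0 as the limiting value at zero offset is an assumption, since the patent does not state the zero-distance convention.

```python
import numpy as np

def g2(dx, dy):
    """Two-dimensional biharmonic Green function G2 = |x|^2 (ln|x| - 1),
    evaluated on the offset vector x = (dx, dy); G2 -> 0 as |x| -> 0."""
    r = np.hypot(dx, dy)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0, r**2 * (np.log(r) - 1.0), 0.0)
```

For example, g2(1.0, 0.0) evaluates to -1, since ln(1) - 1 = -1.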
For correcting the two-dimensional image in this experiment, the correction formula is:

x' = Σj Wxj·G(x - Xj, y - Yj), y' = Σj Wyj·G(x - Xj, y - Yj), j summed from 1 to N (Eq-6)

where (x', y') are the coordinates of the point on the pre-correction image corresponding to the point (x, y) on the corrected image, (Xj, Yj) are the circular-spot array center values on the standard calibration plate, and Wxj, Wyj are the correction weights. A relationship between the calibration plate image shot by the camera and the standard calibration plate image can be established to solve for the weights:
Respectively using the identified calibration plate image circle spot array center value (X'i,Y′i) And the central value (X) of the circular spot array on the standard calibration platej,Yj) Establishing a relation:
Figure BDA0002588411420000083
Figure BDA0002588411420000084
Let

Gi,j = G(Xi - Xj, Yi - Yj) (Eq-9)
Inputting all the calibration point values to obtain an NxN equation set,
X′I=GIWx (Eq-10)
Y′I=GIWy (Eq-11)
where

X'I = (X'1, X'2, …, X'N)^T (Eq-12)
Y'I = (Y'1, Y'2, …, Y'N)^T (Eq-13)
GI = [Gi,j], the N×N matrix of values from (Eq-9) (Eq-14)
Wx = (Wx1, Wx2, …, WxN)^T (Eq-15)
Wy = (Wy1, Wy2, …, WyN)^T (Eq-16)
Solving these systems (Wx = GI^(-1)·X'I, Wy = GI^(-1)·Y'I) yields the image XY coordinate correction weighting matrices Wx and Wy; the coordinate values on the pre-correction image of the point corresponding to each pixel of the corrected image are then computed through (Eq-6).
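The weight solve and the back-mapping of (Eq-6) can be sketched as follows, using the biharmonic Green function as the mapping function G. The four-dot calibration layout below is a toy example, not calibration data from the patent.

```python
import numpy as np

def _g(dx, dy):
    """Biharmonic Green function G = r^2 (ln r - 1), with G(0) = 0."""
    r = np.hypot(dx, dy)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0, r**2 * (np.log(r) - 1.0), 0.0)

def solve_correction_weights(pts_std, pts_img):
    """Build GI[i, j] = G(Xi - Xj, Yi - Yj) from the N standard-plate spot
    centers and solve GI Wx = X', GI Wy = Y' for the weight vectors."""
    X, Y = pts_std[:, 0], pts_std[:, 1]
    GI = _g(X[:, None] - X[None, :], Y[:, None] - Y[None, :])
    Wx = np.linalg.solve(GI, pts_img[:, 0])
    Wy = np.linalg.solve(GI, pts_img[:, 1])
    return Wx, Wy

def map_point(x, y, pts_std, Wx, Wy):
    """Eq-6: map a corrected-image point (x, y) back to its pre-correction
    coordinates (x', y') = sum_j Wj G((x, y) - (Xj, Yj))."""
    basis = _g(x - pts_std[:, 0], y - pts_std[:, 1])
    return float(basis @ Wx), float(basis @ Wy)

# Toy example: four standard-plate dots and a uniformly stretched
# "distorted" image of them.
pts_std = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
pts_img = 1.1 * pts_std
Wx, Wy = solve_correction_weights(pts_std, pts_img)
```

By construction the spline interpolant passes through the calibration points, so map_point(1.0, 1.0, ...) returns the measured center (1.1, 1.1) exactly.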
Step 4: photograph the particle image under the test condition and correct it using the correction coefficients obtained from calibration.
According to (Eq-6), the correction algorithm uses reverse gray sampling: for each integer pixel of the corrected image, the corresponding point on the pre-correction image is found through Green-function interpolation. The advantage is that, since the corrected image generally magnifies the pre-correction image, a corrected-image pixel mapped back onto the pre-correction image cannot fall outside the pre-correction image boundary, which guarantees effective use of its range. The mapped points are generally non-integer, so gray sampling at them requires image gray-level interpolation.
In a possible implementation manner, a bicubic interpolation method is adopted to perform gray level interpolation, the coordinates of a point on an image before correction are obtained by mapping and are (i + u, j + v), the bicubic interpolation utilizes u, v values and adjacent 16 point gray level values to interpolate to obtain the gray level value of the point, and the interpolation basis function is as follows:
S(x) (Eq-17; the basis-function expression appears as an equation image in the original; the commonly used cubic-convolution form is S(x) = 1 - 2|x|^2 + |x|^3 for |x| ≤ 1, S(x) = 4 - 8|x| + 5|x|^2 - |x|^3 for 1 < |x| ≤ 2, and S(x) = 0 otherwise)
the gray interpolation formula at (i + u, j + v) is calculated as follows:
I'(i+u,j+v)=ABC (Eq-18)
where the matrices A, B, and C are:

A = [S(1+u) S(u) S(1-u) S(2-u)] (Eq-19-a)
B = the 4×4 matrix of gray values I(i+m, j+n), m, n = -1, 0, 1, 2, at the 16 neighboring points (Eq-19-b)
C = [S(1+v) S(v) S(1-v) S(2-v)]^T (Eq-19-c)
Although bicubic interpolation is more complex and time-consuming than commonly used methods such as nearest-neighbor and bilinear interpolation, it has higher interpolation accuracy and an edge-enhancing effect, and it guarantees the quality of the image gray-level resampling reconstruction.
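A sketch of the gray-sampling step; the a = -1 cubic-convolution kernel is assumed for S(x), since the patent's exact basis function is given only as an equation image.

```python
import numpy as np

def S(x):
    """Bicubic interpolation basis function (cubic convolution, a = -1)."""
    ax = abs(x)
    if ax <= 1:
        return 1 - 2 * ax**2 + ax**3
    if ax <= 2:
        return 4 - 8 * ax + 5 * ax**2 - ax**3
    return 0.0

def bicubic_sample(img, i, j, u, v):
    """Gray value at the non-integer point (i + u, j + v), Eq-18:
    I' = A B C, where A and C weight the rows and columns and B is the
    4 x 4 neighborhood of gray values around (i, j)."""
    A = np.array([S(1 + u), S(u), S(1 - u), S(2 - u)])
    C = np.array([S(1 + v), S(v), S(1 - v), S(2 - v)])
    B = img[i - 1:i + 3, j - 1:j + 3].astype(float)
    return float(A @ B @ C)
```

The weights in A (and C) sum to 1 for any u in [0, 1), so a constant image is reproduced exactly, and at u = v = 0 the sample reduces to img[i, j].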
The invention balances, through the optical path compensation design, the optical paths of the rays entering the camera from all positions so that they are as consistent as possible, and the particles in the light-sheet shooting plane can be imaged clearly. The external compensator is introduced for visualization models whose window inner and outer surfaces cannot be adjusted; without a compensator, two rays passing through the curved-surface model may reach the shooting plane with a large optical path deviation, so the whole shooting plane cannot be imaged sharply at the same time. If the outer surface of the model visualization window can be suitably modified, designing the curved glass of the visualization model with a self-compensating thickness variation gives a large shooting margin. The optical path difference of the imaging system is calculated from basic optical system parameters, including the refractive index of the window material and the shooting distance, and the compensation scheme is determined accordingly.
Although the shooting optical path difference problem is thereby solved, the visualization model has curvature and may be an irregular three-dimensional curved surface, so the particle image inside it is strongly distorted; the velocity field computed from it would deviate considerably and fail to reflect the real situation of the inlet flow field. The invention calibrates the shooting plane with a multi-point calibration plate, corrects the distortion of the captured image through a correction algorithm, and restores the real flow field image, mainly in two steps: a. establishing the mapping relation between the distorted image and the undistorted image before and after correction, and b. gray-sampling the pre-correction image for the corrected image.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (7)

1. A method for extracting a flow particle image in a curved surface visualization model, characterized by comprising the following steps:
determining the optical layout of the flow field shooting in the visual model, and simulating the optical path difference of each shooting point;
arranging a plano-convex compensation mirror outside the visual model to form a compensation optical system for compensating the optical path difference of each shooting point so that the optical paths from each point illuminated by the laser sheet light to an image point on a camera shooting chip are consistent;
shooting a standard calibration plate image in a plane by using a compensating optical system to establish a mapping relation between a distorted image and a theoretical undistorted image;
and shooting a particle image in a test state, and correcting the particle image by using a mapping relation obtained by calibration.
2. The method for extracting flow particle images in the curved surface visualization model according to claim 1, wherein: the circle center of the plano-convex compensating mirror is located at a distance d2 from the visualization window of the visualization model and at a distance b from the virtual image convergence point B, and the deflection angle between the compensated ray and the optical axis through B is β, where b = d - d2 + (l'·d + l'·r2)/(l - l') and β = arctan((l - l')/(d + r2)); d is the distance along the optical axis from the outer profile to the camera chip, l' is the distance on the camera imaging chip from the image of the point to be compensated to the image of the origin, r2 is the outer-surface radius of the visualization window, and l is the distance from the point to be compensated to the origin in the plane being photographed.
3. The method for extracting flow particle images in the curved surface visualization model according to claim 1, wherein the method for establishing the mapping relationship between the distorted image and the theoretical undistorted image comprises the following steps:
solving the image XY coordinate correction weighting matrices Wx and Wy;
computing, through

x' = Σj Wxj·G(x - Xj, y - Yj), y' = Σj Wyj·G(x - Xj, y - Yj), j summed from 1 to N,

the coordinate values on the pre-correction image of the point corresponding to each pixel (x, y) on the corrected image, to complete the mapping between the distorted image and the theoretical undistorted image; wherein (x', y') are the coordinates of the point on the pre-correction calibration plate image corresponding to the point on the standard calibration plate image, Xj, Yj are the center coordinate values of the circular-spot array used for calibration on the standard calibration plate image, N is the number of circular-spot calibration marks, and G is the mapping function.
4. The method for extracting flow particle images in curved surface visualization models according to claim 3, wherein the solved image XY coordinate correction weighting matrix WxAnd WyThe method comprises the following steps:
respectively using the identified center values (X'_i, Y'_i) of the circular-spot array on the calibration plate image and the center values (X_j, Y_j) of the circular-spot array on the standard calibration plate to establish the relations:
X'_i = Σ_{j=1}^{N} W_xj · G(X_i − X_j, Y_i − Y_j),  Y'_i = Σ_{j=1}^{N} W_yj · G(X_i − X_j, Y_i − Y_j);
wherein W_xj is the weighting coefficient of the circular-spot center coordinates (X_j, Y_j) on the standard calibration plate for the abscissa X'_i of the pre-correction circular-spot center (X'_i, Y'_i), and W_yj is the weighting coefficient of (X_j, Y_j) for the ordinate Y'_i;
letting G_{i,j} = G(X_i − X_j, Y_i − Y_j) and inputting all the calibration point values to obtain an N×N equation set:
X'_I = G · W_x^T,  Y'_I = G · W_y^T;
wherein X'_I = (X'_1, X'_2, …, X'_N)^T, Y'_I = (Y'_1, Y'_2, …, Y'_N)^T, G is the N×N matrix with entries G_{i,j},
W_x = (W_x1, W_x2, …, W_xN), W_y = (W_y1, W_y2, …, W_yN);
solving
W_x^T = G^{−1} · X'_I,  W_y^T = G^{−1} · Y'_I
yields the image X and Y coordinate correction weighting matrices W_x and W_y.
5. The method for extracting flow particle images in the curved surface visualization model according to claim 3, wherein: the mapping function G is G_2(x) = |x|^2 (ln|x| − 1), where x is the parameter vector.
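Claims 3–5 together describe a radial-basis (thin-plate-spline-style) correction: build the N×N matrix G_{i,j} from the basis G_2, solve G·W_x^T = X'_I and G·W_y^T = Y'_I at the calibration spots, then map each corrected pixel back through the weighted sums. The sketch below, with hypothetical calibration coordinates, follows that structure; note that the pure radial system of claim 4 (no affine terms) can become singular for degenerate spot layouts, which a production implementation would have to guard against:

```python
import numpy as np

def g2(dx, dy):
    """Mapping function of claim 5: G2(x) = |x|^2 (ln|x| - 1), with G2(0) := 0."""
    r2 = dx * dx + dy * dy
    with np.errstate(divide="ignore", invalid="ignore"):
        val = r2 * (0.5 * np.log(r2) - 1.0)  # |x|^2 ln|x| = 0.5 * r^2 * ln(r^2)
    return np.where(r2 == 0.0, 0.0, val)

def solve_weights(std_pts, img_pts):
    """Solve the N x N systems G . Wx^T = X'_I and G . Wy^T = Y'_I of claim 4."""
    X, Y = std_pts[:, 0], std_pts[:, 1]
    G = g2(X[:, None] - X[None, :], Y[:, None] - Y[None, :])  # entries G_{i,j}
    Wx = np.linalg.solve(G, img_pts[:, 0])
    Wy = np.linalg.solve(G, img_pts[:, 1])
    return Wx, Wy

def map_point(x, y, std_pts, Wx, Wy):
    """Claim 3 mapping: pre-correction coordinates (X', Y') of corrected pixel (x, y)."""
    g = g2(x - std_pts[:, 0], y - std_pts[:, 1])
    return float(g @ Wx), float(g @ Wy)

# hypothetical calibration data: 3 standard spot centers and their distorted positions
std = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
img = np.array([[0.10, 0.20], [1.05, 0.10], [0.00, 1.10]])
Wx, Wy = solve_weights(std, img)
x_pre, y_pre = map_point(1.0, 0.0, std, Wx, Wy)  # reproduces the second distorted spot
```

Because the solve enforces the relations exactly at every calibration spot, evaluating the map at a standard spot center returns its measured distorted position; between spots, the basis interpolates smoothly.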
6. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
7. An apparatus for flow particle image extraction in a surface visualization model, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that: the processor, when executing the computer program, performs the steps of the method according to any one of claims 1 to 5.
CN202010688286.2A 2020-07-16 2020-07-16 Method for extracting particle images in curved surface visual model Active CN111951375B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010688286.2A CN111951375B (en) 2020-07-16 2020-07-16 Method for extracting particle images in curved surface visual model

Publications (2)

Publication Number Publication Date
CN111951375A true CN111951375A (en) 2020-11-17
CN111951375B CN111951375B (en) 2023-06-30

Family

ID=73340064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010688286.2A Active CN111951375B (en) 2020-07-16 2020-07-16 Method for extracting particle images in curved surface visual model

Country Status (1)

Country Link
CN (1) CN111951375B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114022398A (en) * 2022-01-07 2022-02-08 武汉工程大学 Green function based dual harmonic spline interpolation heat radiation effect correction method and device
CN117805434A (en) * 2024-03-01 2024-04-02 中国空气动力研究与发展中心低速空气动力研究所 SPIV measurement and calibration device and method for space-time evolution wall turbulence boundary layer

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104958061A (en) * 2015-07-28 2015-10-07 北京信息科技大学 Fundus OCT imaging method utilizing three-dimensional imaging of binocular stereo vision and system thereof
CN109493418A (en) * 2018-11-02 2019-03-19 宁夏巨能机器人股份有限公司 A kind of three-dimensional point cloud acquisition methods based on LabVIEW
CN110458901A (en) * 2019-06-26 2019-11-15 西安电子科技大学 A kind of optimum design method of overall importance based on the photo electric imaging system for calculating imaging





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant