CN107102165B - Surface flow field measuring method based on particle image velocimetry - Google Patents
- Publication number
- CN107102165B CN107102165B CN201710242846.XA CN201710242846A CN107102165B CN 107102165 B CN107102165 B CN 107102165B CN 201710242846 A CN201710242846 A CN 201710242846A CN 107102165 B CN107102165 B CN 107102165B
- Authority
- CN
- China
- Prior art keywords
- flow
- image
- field
- flow field
- coordinate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P5/00—Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft
- G01P5/18—Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft by measuring the time taken to traverse a fixed distance
- G01P5/20—Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft by measuring the time taken to traverse a fixed distance using particles entrained by a fluid stream
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Indicating Or Recording The Presence, Absence, Or Direction Of Movement (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a surface flow field measurement method based on particle image velocimetry (PIV). Tracer particles of any material and size are scattered on the surface of the flow field to be measured; at least one field of view is set in the flow field, video images of each field of view are sampled, and a corresponding bird's-eye view is established after field-of-view correction; each bird's-eye view is divided into sub-grids, and the flow velocities of the corresponding sub-grids across all bird's-eye views are averaged to obtain the flow velocity of that field of view; the flow velocities of the fields of view are then stitched and visualized in turn to produce a visual flow velocity image of the surface of the flow field, realizing surface flow field velocity measurement with the PIV technique. Remarkable effects: the video acquisition equipment has a free viewing angle, is robust to complex terrain and external illumination, and can meet different test environment requirements; any type of tracer particle is supported; analysis is fast and measurement accuracy is high.
Description
Technical Field
The invention relates to the technical field of water conservancy measurement, in particular to a surface flow field measurement method based on particle image velocimetry.
Background
Surface flow field velocimetry is an important means of river model analysis and a basic technology in the field of water conservancy measurement. The single-point flow velocity measuring devices commonly adopted in the industry not only disturb the flow field during measurement, causing measurement errors, but are also too inefficient to be applied to the flow velocity field analysis and unsteady flow field analysis of large river engineering models.
Moreover, the existing surface flow velocity measurement methods mainly have the following disadvantages: (1) the video acquisition equipment must work at an angle perpendicular to the flow surface, the requirements on external illumination are high, and the operation process is complex; (2) multi-channel data are analyzed slowly, and the measurement precision is not high enough; (3) most flow field velocimetry systems use relatively costly tracer particles.
Disclosure of Invention
To address the defects of the prior art, the invention provides a surface flow field measurement method based on particle image velocimetry in which the viewing angle of the video acquisition equipment is free, robustness and convenience are good, and different test environment requirements can be met; any type of tracer particle is supported; analysis is fast and measurement accuracy is high.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a surface flow field measurement method based on particle image velocimetry is characterized by comprising the following steps:
step 1: distributing tracer particles with unlimited material and size on the surface of a flow field to be detected;
step 2: at least one view field is arranged in the flow field to be measured, and the flow velocity of each view field is obtained by adopting the following method:
step 2-1: carrying out video image sampling on the view field for multiple times, then carrying out view field correction, and establishing a corresponding aerial view;
step 2-2: dividing each aerial view into a plurality of sub-grids, calculating the velocity vector of the tracer particles in each sub-grid through a pyramid Lucas-Kanade optical flow algorithm, and taking the velocity vector as the flow velocity of the center position of the sub-grid;
step 2-3: averaging the flow velocities of the corresponding sub-grids in all the aerial views, and taking the obtained flow velocity matrix as the flow velocity of the view field;
and step 3: and sequentially carrying out data splicing and visualization processing on the flow velocity of the plurality of fields to obtain a visualized flow velocity image of the surface of the flow field to be measured.
Further, the video images in the step 2-1 are collected by video collecting equipment, and the collecting angle and the erecting position of the video collecting equipment are arbitrary.
Further, the specific steps of correcting the field of view and establishing the bird's-eye view in the step 2-1 are as follows:
step 2-1-1: carrying out distortion correction processing on an original video image;
step 2-1-2: selecting four pixel points on the distortion-corrected image and forming a rectangular measurement region R with the four points as vertices; then acquiring the actual flow field coordinate points corresponding to the four pixel points, the four actual coordinate points forming an actual flow field measurement region R0;
step 2-1-3: calculating the ratio T of the width to the length of the rectangular measurement region R; constructing a bird's-eye view plane R1 of arbitrary size whose width-to-length ratio is T;
step 2-1-4: from the correspondence between the rectangular measurement region R and the bird's-eye view plane R1, calculating the perspective transformation matrix H relating the pixel coordinates of the distortion-corrected image to the bird's-eye view coordinates;
step 2-1-5: and establishing a bird's-eye view according to the perspective transformation matrix H.
Further, in step 2-1-2, the specific steps for obtaining the actual coordinates of the points of the actual flow field measurement region R0 corresponding to the pixel coordinates of the rectangular measurement region R are as follows:
step s 1: marking four points with known actual coordinates in an actual flow field;
step s 2: acquiring pixel coordinates corresponding to the four points in the image after the distortion correction;
step s 3: and determining the coordinate of the corresponding position of the actual flow field according to the proportional relation of the pixel coordinates of the image after the distortion correction.
Further, the steps for data splicing of the flow velocity in step 3 are as follows:
a1: matching the transverse coordinates and the longitudinal coordinates of the flow speed data;
a2: when multiple groups of flow velocity data exist at the same coordinate point, their average value is taken as the flow velocity data of that coordinate point; if only one group exists at a coordinate point, the calculated flow velocity data is taken directly as the flow velocity data of that point.
Further, the step of visualizing the surface flow velocity of the flow field in the step 3 is as follows:
b1: creating a gridding virtual flow field which is in the same proportion with the flow field to be detected;
b2: mapping the flow velocity data to the virtual flow field according to the coordinate corresponding relation;
b3: and drawing a visual flow velocity image of the surface of the flow field to be measured through the virtual flow field according to the obtained flow velocity data.
According to the scheme, firstly, distortion correction is carried out on an obtained video image, and a corresponding aerial view is established; secondly, dividing each aerial view into a plurality of sub-grids, calculating the velocity vector of the tracer particles in each sub-grid through a pyramid Lucas-Kanade optical flow algorithm, and taking the velocity vector as the flow velocity of the center position of the sub-grid; thirdly, averaging the flow velocities of the corresponding sub-grids in all the aerial views, and taking the obtained flow velocity matrix as the flow velocity of the view field; and finally, sequentially carrying out data splicing and visualization processing on the flow velocity of the plurality of fields of view to obtain a visualized flow velocity image of the surface of the flow field to be measured.
The invention has the following remarkable effects:
(1) the flow velocity vector is obtained by utilizing the PIV particle image velocity measurement technology, and the flow velocity measurement can be realized more quickly and accurately.
(2) The collected video is field-corrected and reconstructed into a bird's-eye view by the algorithm, so the main optical axis of the video acquisition device may form any angle with the flow field surface: the viewing angle is free, manual installation and debugging are avoided, and the method is robust to complex terrain and external illumination.
(3) Any type of trace particles are supported, and the convenience and the application range of the system are increased while the measurement cost of the system is reduced.
(4) The device supports any type of lighting source and any number of video acquisition devices, and can meet the requirements of different test environments.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is an original video image;
FIG. 3 is an image after distortion correction;
FIG. 4 is a schematic view of a selected rectangular measurement region R;
FIG. 5 is an aerial view of the build;
FIG. 6 is a flow velocity visualization image;
fig. 7 is a measurement accuracy comparison chart.
Detailed Description
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
As shown in fig. 1, a surface flow field measurement method based on particle image velocimetry includes the following steps:
step 1: distributing tracer particles with unlimited material and size on the surface of a flow field to be detected;
In practical application, shredded paper scraps were tested as the tracer particles under natural-light illumination, so the cost is almost zero and operation is easy.
Step 2: at least one view field is arranged in the flow field to be measured, and the flow velocity of each view field is obtained by adopting the following method:
step 2-1: carrying out video image sampling on the view field for multiple times, then carrying out view field correction, and establishing a corresponding aerial view;
in this example, the video images are captured by a video capture device that uses a Haekwondo (HIKVISION) DS-2CD3410FD-IW camera. The resolution of the camera is 1920 multiplied by 1080, and the frame rate is 25-30 fps. Surface flow velocity measurements were performed on flow fields ranging from 1.7m x 2.5m at a temperature of 17 ℃ and a relative humidity of 66%.
The specific steps of correcting the visual field and establishing the aerial view are as follows:
step 2-1-1: as shown in fig. 2, the original video image is unsuitable for image recognition, analysis and judgment because of distortion. The original video image therefore needs distortion correction; the corrected image is shown in fig. 3;
step 2-1-2: four pixel points are selected on the distortion-corrected image, and the rectangular measurement region with the four points as vertices is defined as R, e.g. the rectangular frame shown in fig. 4. The actual flow field coordinate points corresponding to the four points are then acquired; the flow field measurement region formed by these four actual coordinate points is defined as R0.
In this example, the four vertices of the rectangular measurement region R are defined as A, B, C, D; their pixel coordinates are obtained, and the actual flow field coordinates A0, B0, C0, D0 corresponding to A, B, C, D are determined. The specific method comprises the following steps:
step s 1: marking four points with known actual coordinates in an actual flow field;
step s 2: acquiring pixel coordinates corresponding to the four points in the image after the distortion correction;
step s 3: and determining the coordinate of the corresponding position of the actual flow field according to the proportional relation of the pixel coordinates of the image after the distortion correction.
Step 2-1-3: calculate the ratio T of the width to the length of the rectangular measurement region R, and construct a bird's-eye view plane R1 of arbitrary size whose width-to-length ratio is T; the four vertices of R1 are defined as A1, B1, C1, D1. Knowing A0, B0, C0, D0 and the ratio T, A1, B1, C1, D1 can be obtained;
Step 2-1-4: from the correspondence between the rectangular measurement region R and the bird's-eye view plane R1, i.e. among the 8 coordinate points A, B, C, D and A1, B1, C1, D1, calculate the perspective transformation matrix H relating the pixel coordinates of the distortion-corrected image to the bird's-eye view coordinates;
in the present embodiment, the perspective transformation matrix H is calculated using a homogeneous coordinate system.
The perspective projective transformation can be described by a 3 × 3 matrix, namely:

[x', y', w']^T = H · [x, y, w]^T

The perspective transformation matrix H is then expressed as:

H = [[h11, h12, h13], [h21, h22, h23], [h31, h32, h33]]

where (x, y, w) are the homogeneous vertex pixel coordinates of the rectangular measurement region R and (x', y', w') are the homogeneous vertex coordinates of the bird's-eye view. In the same way, each of the four vertex correspondences yields such a relation; stacking them gives a linear system from which the perspective transformation matrix H can be obtained through SVD decomposition.
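The SVD route described above can be sketched as a direct linear transform (DLT) over the four vertex correspondences. A minimal NumPy version, with an illustrative function name (the patent does not prescribe this exact formulation):

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 perspective transformation H mapping each src
    point (x, y) to the corresponding dst point (x', y'), via SVD of
    the stacked linear constraints (direct linear transform)."""
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        # Each correspondence gives two linear equations in the nine
        # entries of H (H is homogeneous, i.e. defined up to scale).
        rows.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp, -xp])
        rows.append([0, 0, 0, x, y, 1, -x * yp, -y * yp, -yp])
    A = np.array(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)   # null-space vector = solution up to scale
    return H / H[2, 2]         # normalise so that H[2,2] == 1
```

With exactly four non-degenerate correspondences (the vertices of R and of R1) the system has rank 8 and the SVD null space gives H exactly.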
Step 2-1-5: and establishing a bird's-eye view according to the perspective transformation matrix H.
The process of creating the bird's-eye view converts the distortion-corrected image (fig. 3) into the bird's-eye view (fig. 5) by means of the perspective transformation matrix H. Knowing H and each pixel coordinate, the corresponding bird's-eye view coordinate is calculated as described in step 2-1-4, completing the conversion from the forward-viewing-angle image to the bird's-eye view image.
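Applying H pixel by pixel amounts to an inverse-mapping warp. A minimal nearest-neighbour sketch in plain NumPy (names illustrative; a production system would typically use an optimised routine such as OpenCV's cv2.warpPerspective with bilinear interpolation instead):

```python
import numpy as np

def warp_perspective(img, H, out_shape):
    """Resample img into a bird's-eye view of out_shape = (h, w): each
    output pixel looks up its source pixel through the inverse of H."""
    Hinv = np.linalg.inv(H)
    out = np.zeros(out_shape, dtype=img.dtype)
    h, w = out_shape
    for yo in range(h):
        for xo in range(w):
            x, y, s = Hinv @ np.array([xo, yo, 1.0])
            xs, ys = int(round(x / s)), int(round(y / s))  # nearest neighbour
            if 0 <= ys < img.shape[0] and 0 <= xs < img.shape[1]:
                out[yo, xo] = img[ys, xs]
    return out
```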
Through the above steps the video frame images are field-corrected and the corresponding bird's-eye view is established. The angle and mounting position of the video acquisition equipment can therefore be chosen freely, i.e. the viewing angle is free, manual installation and debugging are avoided, and the selection of the measurement area is more flexible. The orientation of the video acquisition equipment can also be changed to reduce reflections caused by sunlight and artificial light, giving better robustness to complex terrain and external illumination and improving the system's adaptability to illumination changes.
Step 2-2: dividing each aerial view into a plurality of sub-grids, calculating the velocity vector of the tracer particles in each sub-grid through a pyramid Lucas-Kanade optical flow algorithm, and taking the velocity vector as the flow velocity of the center position of the sub-grid;
the specific calculation steps of the tracer particle velocity vector are as follows:
I. Extract the coordinate u = [u_x, u_y]^T of the center point of the tracer particle.
II. Using the pyramid Lucas-Kanade (LK) optical flow algorithm, find the point z = [z_x, z_y]^T in the second frame image J such that z = u + d = [u_x + d_x, u_y + d_y]^T. The vector d = [d_x, d_y]^T is the optical flow at point u.
The pyramid LK optical flow algorithm has three main steps: building the pyramid, tracking features through the pyramid, and iterating.
S1: in this embodiment, a recursive method is used to build the pyramid: I^1 is computed from I^0 (the original video image), I^2 is then computed from I^1, and so on, where L is the pyramid level index, I^L denotes the picture at level L of image I, J^L the picture at level L of image J, and L_m the height of the pyramid. By the definition of an image pyramid, if image I has size n_x × n_y, the level-L picture has size (n_x / 2^L) × (n_y / 2^L).
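The recursive halving can be sketched with plain 2 × 2 block averaging (NumPy; real implementations usually low-pass filter before subsampling, e.g. OpenCV's cv2.pyrDown, so this is a simplified sketch):

```python
import numpy as np

def build_pyramid(img, levels):
    """Build an image pyramid: each level halves both dimensions by
    averaging 2x2 blocks, so level L has size (n_y/2^L) x (n_x/2^L)."""
    pyr = [img.astype(float)]
    for _ in range(levels):
        h, w = pyr[-1].shape
        pyr.append(pyr[-1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyr
```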
S2: tracking pyramid features;
the specific description is as follows:
1) First, at the highest level L_m = 3, the pyramid optical flow estimate is initialized as g^(L_m) = [0, 0]^T, and the residual optical flow d^(L_m) is obtained by minimizing the mismatch function:

ε^L(d^L) = Σ_{x = u_x − w_x}^{u_x + w_x} Σ_{y = u_y − w_y}^{u_y + w_y} ( I^L(x, y) − J^L(x + g_x^L + d_x^L, y + g_y^L + d_y^L) )²

where u_x and u_y are the coordinates of the grid center point u, and w_x and w_y are two integers, generally 2, 3, 4, 5, 6 or 7 pixels;
2) Then, the result of the highest level L_m = 3 is passed as an initial value to the next level, L_m − 1 = 2, through the formula g^(L−1) = 2(g^L + d^L); on the basis of this initial value, the transferred initial optical flow estimate g^2 of this level is computed on the level-2 image, and the residual optical flow d^2 is obtained by minimizing the mismatch function ε^2(d^2);
3) The transferred initial estimate and residual optical flow of level L_m − 1 = 2 are then passed as the initial value to the next level, L_m − 2 = 1: the level-1 optical flow estimate g^1 = 2(g^2 + d^2) is computed, and the residual optical flow d^1 is obtained by minimizing the mismatch function ε^1(d^1); this continues until the last level, i.e. the original video image layer, is reached;
4) The level L_m − 2 = 1 results are passed to the last level, L_m − 3 = 0: the same procedure yields the level-0 optical flow estimate g^0 = 2(g^1 + d^1), and minimizing the mismatch function ε^0(d^0) gives d^0. The final optical flow result is:

d = g^0 + d^0
s3: iterative process
At each level of the pyramid, the goal is to compute the optical flow d^L that minimizes the mismatch function ε^L. Since the iterative process is the same for every level, it is described here only for a single level.

Let k be the iteration index, initialized to 1 at the beginning. The minimizing solution at each iteration is found by calculating the LK optical flow step:

η^k = G^(−1) b_k

where b_k is the image mismatch vector and G is the spatial gradient matrix. The final optical flow vector is:

d^L = Σ_{k=1}^{K} η^k

where K is the number of iterations performed to reach convergence.
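A minimal single-level illustration of these normal equations — spatial gradient matrix G and mismatch vector b — on a synthetic pair of frames. NumPy only, with illustrative names; a real system would use the full pyramid and typically an existing routine such as OpenCV's cv2.calcOpticalFlowPyrLK:

```python
import numpy as np

def lk_step(I, J):
    """One Lucas-Kanade step over the whole window: build the spatial
    gradient matrix G and image mismatch vector b, then solve G d = b
    for the displacement d of J relative to I."""
    Iy, Ix = np.gradient(I)          # np.gradient returns axis-0 (y) first
    dI = I - J                       # image mismatch
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = np.array([np.sum(dI * Ix), np.sum(dI * Iy)])
    return np.linalg.solve(G, b)

# Synthetic pair: a Gaussian blob, and the same blob shifted +1 px in x.
yy, xx = np.mgrid[0:32, 0:32]
blob = lambda cx: np.exp(-((xx - cx) ** 2 + (yy - 16) ** 2) / 18.0)
d = lk_step(blob(15.0), blob(16.0))  # displacement close to (1, 0)
```

A single step recovers the shift only approximately (first-order assumption); the iteration and pyramid in the patent refine exactly this estimate.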
Step 2-3: averaging the flow velocities of the corresponding sub-grids in all the aerial views, and taking the obtained flow velocity matrix as the flow velocity of the view field;
and step 3: sequentially carrying out data splicing and visualization processing on the flow velocity of a plurality of fields of view to obtain a visual flow velocity image of the surface of the flow field to be detected:
(1) The data splicing of the flow velocity comprises the following steps:
a1: matching the transverse coordinates and the longitudinal coordinates of the flow speed data;
a2: because the video images acquired by the video acquisition equipment may have overlapping conditions, for the overlapped flow field measurement areas, a plurality of groups of flow velocity data exist at the same coordinate position. Therefore, in this case, the average value of the plurality of sets of flow velocity data at the same coordinate point is determined as the flow velocity data at the coordinate position. If the coordinate point does not have the view field overlapping phenomenon, the flow speed data obtained by calculation is directly used as the flow speed data of the coordinate position without processing.
(2) The visualization processing method comprises the following steps:
b1: creating a gridding virtual flow field which is in the same proportion with the flow field to be detected;
b2: mapping the flow velocity data to the virtual flow field according to the coordinate corresponding relation;
b3: and drawing a visual flow velocity image of the surface of the flow field to be measured through the virtual flow field, wherein the obtained visual flow velocity image is shown in fig. 6.
With this method, the system under test measured 425 test points in the measurement flow field simultaneously, obtaining 425 velocity measurements (unit: m/s). Taking the values measured by an acoustic Doppler velocimeter as reference values, the comparison is shown in fig. 7: the ordinate of curve L-AOM is the absolute value of the difference (absolute error) between the mean of the 425 measured points and the reference value, and the slope represents the relative error. The ordinates of L-PA and L-PB are the absolute errors of two randomly selected single test points.
According to statistics over the whole flow velocity measurement result and the results at single measurement points, with the acoustic Doppler velocimeter as reference, the relative error is less than 10% in the range [0.01, 0.05] m/s and less than 5% in the range (0.05, 1.5] m/s, and the angular deviation of flow velocity measurement over the whole test range is less than 0.5°.
Claims (5)
1. A surface flow field measurement method based on particle image velocimetry is characterized by comprising the following steps:
step 1: distributing tracer particles with unlimited material and size on the surface of a flow field to be detected;
step 2: at least one view field is arranged in the flow field to be measured, and the flow velocity of each view field is obtained by adopting the following method:
step 2-1: carrying out video image sampling on the view field for multiple times, then carrying out view field correction, and establishing a corresponding aerial view;
the specific steps of correcting the visual field and establishing the aerial view are as follows:
step 2-1-1: carrying out distortion correction processing on an original video image;
step 2-1-2: selecting four pixel points on the distortion-corrected image and forming a rectangular measurement region R with the four points as vertices; then acquiring the actual flow field coordinate points corresponding to the four pixel points, the four actual coordinate points forming an actual flow field measurement region R0;
step 2-1-3: calculating the ratio T of the width to the length of the actual flow field measurement region R0, and constructing a bird's-eye view plane R1 of arbitrary size whose width-to-length ratio is T;
step 2-1-4: from the correspondence between the rectangular measurement region R and the bird's-eye view plane R1, calculating the perspective transformation matrix H relating the pixel coordinates of the distortion-corrected image to the bird's-eye view coordinates;
step 2-1-5: establishing a bird's-eye view according to the perspective transformation matrix H;
step 2-2: dividing each aerial view into a plurality of sub-grids, calculating the velocity vector of the tracer particles in each sub-grid through a pyramid Lucas-Kanade optical flow algorithm, and taking the velocity vector as the flow velocity of the center position of the sub-grid;
the specific calculation steps of the tracer particle velocity vector are as follows:
I. extracting the coordinate u = [u_x, u_y]^T of the center point of the tracer particle;
II. using the pyramid Lucas-Kanade (LK) optical flow algorithm, finding the point z = [z_x, z_y]^T in the second frame image J such that z = u + d = [u_x + d_x, u_y + d_y]^T; the vector d = [d_x, d_y]^T is the optical flow at point u, and the specific steps are as follows:
S1: building the pyramids {I^L} and {J^L} of image I and image J, where L is the pyramid level index, I^L denotes the picture at level L of image I, J^L the picture at level L of image J, and L_m the height of the pyramid;
s2: tracking pyramid features;
1) at the highest level L_m = 3, the pyramid optical flow estimate is initialized as g^(L_m) = [0, 0]^T, and the residual optical flow d^(L_m) is obtained by minimizing the mismatch function ε^(L_m)(d^(L_m)); the mismatch function is calculated as follows:

ε^L(d^L) = Σ_{x = u_x − w_x}^{u_x + w_x} Σ_{y = u_y − w_y}^{u_y + w_y} ( I^L(x, y) − J^L(x + g_x^L + d_x^L, y + g_y^L + d_y^L) )²

where u_x and u_y are the coordinates of the grid center point u, and w_x and w_y are two integers;
2) the result of the highest level L_m = 3 is passed as an initial value to the next level, L_m − 1 = 2, through the formula g^(L−1) = 2(g^L + d^L); on the basis of this initial value, the transferred initial optical flow estimate g^2 of this level is computed on the level-2 image, and the residual optical flow d^2 is obtained by minimizing the mismatch function ε^2(d^2);
3) the transferred initial estimate and residual optical flow of level L_m − 1 = 2 are then passed as the initial value to the next level, L_m − 2 = 1: the level-1 optical flow estimate g^1 = 2(g^2 + d^2) is computed, and the residual optical flow d^1 is obtained by minimizing the mismatch function ε^1(d^1); this continues until the last level, i.e. the original video image layer, is reached;
4) the level L_m − 2 = 1 results are passed to the last level, L_m − 3 = 0: the same procedure yields the level-0 optical flow estimate g^0 = 2(g^1 + d^1), and minimizing the mismatch function ε^0(d^0) gives d^0; the final optical flow result is:

d = g^0 + d^0,
s3: and (3) an iterative process: at each level of the pyramid, the goal is to calculate the optical flow d such that the mismatching function is minimizedMinimum; assuming k as an iterative index, initialized to 1 at the beginning, the solution for minimization can be obtained by computing the LK optical flow:
wherein the content of the first and second substances,is the image mismatch vector, G is the spatial gradient matrix, and the final optical flow vector dLComprises the following steps:
wherein K is the number of iterations performed when convergence is reached;
step 2-3: averaging the flow velocities of the corresponding sub-grids in all the aerial views, and taking the obtained flow velocity matrix as the flow velocity of the view field;
and step 3: and sequentially carrying out data splicing and visualization processing on the flow velocity of the plurality of fields to obtain a visualized flow velocity image of the surface of the flow field to be measured.
2. The method for measuring the surface flow field based on particle image velocimetry of claim 1, characterized in that: and 2-1, acquiring the video image by video acquisition equipment, wherein the acquisition angle and the erection position of the video acquisition equipment are arbitrary.
3. The method for measuring the surface flow field based on particle image velocimetry of claim 1, characterized in that: in step 2-1-2, the specific steps for obtaining the actual coordinates of the points of the actual flow field measurement region R0 corresponding to the pixel coordinates of the rectangular measurement region R are as follows:
step s 1: marking four points with known actual coordinates in an actual flow field;
step s 2: acquiring pixel coordinates corresponding to the four points in the image after the distortion correction;
step s 3: and determining the coordinate of the corresponding position of the actual flow field according to the proportional relation of the pixel coordinates of the image after the distortion correction.
4. The method for measuring the surface flow field based on particle image velocimetry of claim 1, characterized in that: the steps for data splicing of the flow velocity in step 3 are as follows:
a1: matching the transverse coordinates and the longitudinal coordinates of the flow speed data;
a2: when multiple groups of flow velocity data exist at the same coordinate point, their average value is taken as the flow velocity data of that coordinate point; if only one group exists at a coordinate point, the calculated flow velocity data is taken directly as the flow velocity data of that point.
5. The method for measuring surface flow field based on particle image velocimetry of claim 1 or 4, characterized in that: the step 3 of visualizing the surface flow velocity of the flow field comprises the following steps:
b1: creating a gridding virtual flow field which is in the same proportion with the flow field to be detected;
b2: mapping the flow velocity data to the virtual flow field according to the coordinate corresponding relation;
b3: and drawing a visual flow velocity image of the surface of the flow field to be measured through the virtual flow field according to the obtained flow velocity data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710242846.XA CN107102165B (en) | 2017-04-14 | 2017-04-14 | Surface flow field measuring method based on particle image velocimetry |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107102165A CN107102165A (en) | 2017-08-29 |
CN107102165B true CN107102165B (en) | 2020-03-20 |
Family
ID=59676008
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710242846.XA Active CN107102165B (en) | 2017-04-14 | 2017-04-14 | Surface flow field measuring method based on particle image velocimetry |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107102165B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109932281B (en) * | 2017-12-19 | 2021-08-17 | 中国科学院沈阳自动化研究所 | Vision-based liquid viscosity on-line measuring method |
CN109584314B (en) * | 2018-12-27 | 2020-07-10 | 重庆交通大学 | Method and device for measuring water surface flow field and electronic equipment |
CN111257588B (en) * | 2020-01-17 | 2020-11-17 | 东北石油大学 | ORB and RANSAC-based oil phase flow velocity measurement method |
CN112147365B (en) * | 2020-09-30 | 2021-06-04 | 中国水利水电科学研究院 | River flow rate video monitoring device and method based on deep learning |
CN112884806B (en) * | 2021-01-12 | 2022-09-02 | 昆明理工大学 | Video stream measuring method and device based on combination of block matching and intensive reverse search |
CN117952173B (en) * | 2024-03-26 | 2024-06-11 | 浙江大学 | PIV and velocity field data set construction method and device for deep learning model training |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6879708B2 (en) * | 2001-05-24 | 2005-04-12 | Case Western Reserve University | Planar particle/droplet size measurement technique using digital particle image velocimetry image data |
CN101629965B (en) * | 2009-08-18 | 2011-05-11 | 清华大学深圳研究生院 | Multi-grid processing method in particle image velocimetry (PIV) |
CN103604947B (en) * | 2013-11-28 | 2015-06-17 | 华中科技大学 | Flow field state measuring method with adaptive adjusted time resolution |
CN104297516B (en) * | 2014-11-06 | 2017-07-18 | 中国科学院、水利部成都山地灾害与环境研究所 | A kind of two-dimentional velocity field mearsurement method of flow surface |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107102165B (en) | Surface flow field measuring method based on particle image velocimetry | |
CN106871787B (en) | Large space line scanning imagery method for three-dimensional measurement | |
CN109559355B (en) | Multi-camera global calibration device and method without public view field based on camera set | |
Jordt-Sedlazeck et al. | Refractive structure-from-motion on underwater images | |
CN109919911B (en) | Mobile three-dimensional reconstruction method based on multi-view photometric stereo | |
Wieneke | Improvements for volume self-calibration | |
CN108038902A (en) | A kind of high-precision three-dimensional method for reconstructing and system towards depth camera | |
Liu et al. | Novel calibration method for non-overlapping multiple vision sensors based on 1D target | |
CN110378969B (en) | Convergent binocular camera calibration method based on 3D geometric constraint | |
KR20130138247A (en) | Rapid 3d modeling | |
CN106971408B (en) | A kind of camera marking method based on space-time conversion thought | |
CN103903263B (en) | A kind of 360 degrees omnidirection distance-finding method based on Ladybug panorama camera image | |
CN114283203B (en) | Calibration method and system of multi-camera system | |
CN110782498B (en) | Rapid universal calibration method for visual sensing network | |
CN106489062B (en) | System and method for measuring the displacement of mobile platform | |
CN108663043A (en) | Distributed boss's POS node relative pose measurement method based on single camera auxiliary | |
CN108154535B (en) | Camera calibration method based on collimator | |
CN107845096B (en) | Image-based planet three-dimensional information measuring method | |
CN113920201A (en) | Polar line geometric constraint fisheye camera calibration method | |
CN117092621A (en) | Hyperspectral image-point cloud three-dimensional registration method based on ray tracing correction | |
CN110672024A (en) | Method for measuring object distance by using object projection in video | |
CN114663520A (en) | Double-camera combined calibration method and system for ultra-large range vision measurement | |
Kim et al. | Environment modelling using spherical stereo imaging | |
CN114972536B (en) | Positioning and calibrating method for aviation area array swing scanning type camera | |
CN117058194A (en) | Real-time object motion track and gesture detection method based on circular non-coding points |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
2021-05-13 | TR01 | Transfer of patent right | Address after: Room 5081, 5th Floor, No. 5, Lane 600, Yunling West Road, Putuo District, Shanghai 200333; Patentee after: Shanghai Lisha Technology Co., Ltd. Address before: No. 174 Zhengjie, Shapingba District, Chongqing 400000; Patentee before: Chongqing University |