CN117576168A - Water flow speed estimation method and system based on traditional optical flow calculation - Google Patents


Info

Publication number
CN117576168A
Authority
CN
China
Prior art keywords
point
module
calculating
camera
water flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311525100.1A
Other languages
Chinese (zh)
Inventor
王晓龙
张晏玮
朱望
王根一
安国成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eccom Network System Co ltd
Original Assignee
Eccom Network System Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eccom Network System Co ltd filed Critical Eccom Network System Co ltd
Priority to CN202311525100.1A priority Critical patent/CN117576168A/en
Publication of CN117576168A publication Critical patent/CN117576168A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P5/00Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20164Salient point detection; Corner detection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a water flow velocity estimation method and system based on traditional optical flow calculation, comprising the following steps: step S1: acquiring continuous video water flow images {Img0, Img1, …, Imgn}; step S2: preprocessing the acquired images to obtain preprocessed images; step S3: drawing a velocimetry line whose direction is consistent with the water flow direction; step S4: calculating the motion vector optical flow field of the images based on the preprocessed images, and screening the motion vectors by the velocimetry-line direction to obtain effective motion vectors; step S5: calibrating the camera to obtain the camera focal length, the camera parameters being kept unchanged after calibration; step S6: measuring the scale at any point of the image from the camera calibration result; step S7: estimating the actual motion distance from the optical-flow-tracked motion vector coordinates and the scale; step S8: obtaining the actual distance of the water flow motion from the estimated distances, and calculating the final water flow velocity based on this distance and the motion time of the water flow.

Description

Water flow speed estimation method and system based on traditional optical flow calculation
Technical Field
The invention relates to the technical field of hydrologic water flow rate detection, in particular to a water flow rate estimation method and system based on traditional optical flow calculation.
Background
Making effective use of the water resources of China's many rivers and mountain streams is of great significance to economic development. Abundant water resources, however, are often accompanied by the threat of flood disasters. The natural environment and topography cannot be changed, but if effective supervision and control measures are adopted while water resources are developed, so that disasters are prevented in advance or pre-disaster early warning is carried out, the losses and dangers caused by floods can be greatly reduced.
Water flow velocity is important hydrological information and the basis of river discharge estimation research; the estimation and measurement of river flow velocity has long been a hot topic in water-conservancy research.
Traditional water flow velocity measurement methods include the float method, the current-meter method, the specific drop area method, and others, but each can be used only in specific scenes and requires manual on-site measurement, which is neither safe nor convenient. Nowadays, with the development of wireless network cameras and video monitoring technology, intelligent monitoring plays a role in more and more fields, and in many situations can even replace manual labor, greatly reducing human effort while improving the efficiency and safety of production and daily life. Against this background, flow-rate measurement techniques based on video monitoring are widely studied at home and abroad.
Among image-based velocimetry methods, motion estimation based on optical flow calculation can reduce measurement cost and achieve long-term real-time measurement. Inspired by traditional rigid-body motion tracking and detection, the invention provides a water flow velocity estimation algorithm based on traditional optical flow calculation, which applies a traditional sparse optical flow algorithm to water flow velocity calculation without auxiliary conditions.
Patent document CN115690154A (application number: 202211324411.7) discloses a low-water-flow-rate detection method and electronic equipment in the technical field of water flow speed measurement, solving the prior-art problem of inaccurate flow-rate detection in scenes with a low flow rate or few water-surface features. The low-water-flow-rate detection method comprises: calibrating the camera's internal parameters at different zoom values to obtain a relation function between zoom value and internal parameters; performing inter-frame dense optical flow tracking on every two adjacent frames of a multi-frame continuous image sequence to obtain pairs of matching feature points, and generating the motion trail of each pixel from these pairs; based on the motion trail of each pixel, taking the matching feature points of the first and last frames and applying slope screening and clustering screening in turn to obtain screened matching feature points; and converting the image coordinates of the screened matching feature points into world coordinates using the zoom-internal-parameter relation function. That patent uses dense optical flow to track the motion trails of pixels and screens high-quality feature trails by trail slope and a clustering method. For camera calibration, it calibrates the camera at different zoom values to obtain the zoom-internal-parameter relation function and then maps image coordinates to world coordinates.
Compared with a dense optical flow tracking algorithm, the sparse optical flow tracking method of the present invention is faster. For feature-point screening, it not only adopts backward tracking to reject unstable tracked feature points, but also uses a manually demarcated velocimetry line to indicate the water flow direction and screen out interfering motion vectors, improving tracking accuracy and reducing the influence of external factors on water flow speed measurement. For camera calibration, the invention obtains the camera's intrinsic focal length with a single calibration that can be reused across multiple scenes, so per-scene calibration is unnecessary and labor cost is reduced.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a water flow speed estimation method and system based on traditional optical flow calculation.
The invention provides a water flow velocity estimation method based on traditional optical flow calculation, which comprises the following steps:
step S1: acquiring continuous video water flow images {Img0, Img1, …, Imgn};
step S2: preprocessing the acquired image to obtain a preprocessed image;
step S3: drawing a speed measuring line, wherein the direction of the speed measuring line is consistent with the direction of water flow;
step S4: calculating a motion vector optical flow field of the image based on the preprocessed image, and screening the motion vectors by the velocimetry-line direction to obtain effective motion vectors;
step S5: calibrating the camera to obtain a camera focal length, and keeping various parameters of the camera unchanged after calibration is completed;
step S6: measuring and calculating the scale of any point on the image according to the camera calibration result;
step S7: estimating the actual distance of the motion according to the motion vector coordinates tracked by the optical flow and the scale;
step S8: obtaining the actual distance of the water flow motion from the estimated distances, and calculating the final water flow velocity based on this distance and the motion time of the water flow.
Preferably, the step S2 employs: the image is noise-reduced and enhanced using techniques including image graying, median filtering, and histogram equalization.
Preferably, the step S3 employs:
defining a velocimetry line L(P0, P1) along the water flow direction, and calibrating the vector direction of the optical flow field with the velocimetry line; wherein P0 (x0, y0) represents the start-point coordinates of the velocimetry line and P1 (x1, y1) the end-point coordinates; the rotation angle of the velocimetry line is θ = arctan((y1 - y0) / (x1 - x0)).
preferably, the step S4 employs:
step S4.1: performing corner detection on the previous frame image Img0 to obtain a corner set FP0 {p00, p01, …, p0n};
step S4.2: tracking the corner set FP0 detected on image Img0 point by point on the second frame image Img1 to obtain a tracking corner set FP1 {p10, p11, …, p1n};
step S4.3: performing sparse optical flow tracking in the reverse direction on image Img0 with the corner set FP1 tracked on image Img1, obtaining a verification corner set FP0r {pr0, pr1, …, prn};
step S4.4: calculating the Euclidean distances D {d0, d1, …, dn} between corresponding corners of the detection corner set FP0 and the verification corner set FP0r on image Img0; if a distance exceeds a preset threshold, the corner is judged an inferior tracking point and the corners at the corresponding positions in FP0 and FP1 are deleted, yielding the screened point pairs (FP0', FP1');
step S4.5: the screened point pairs (FP0', FP1') form the motion vector set {(p00', p10'), (p01', p11'), …, (p0n', p1n')}; the Euclidean distance of each point pair is the pixel distance of that point's motion;
step S4.6: calculating the rotation angle p_theta of each motion vector (p0n', p1n'); if the difference between p_theta and the velocimetry-line rotation angle θ is larger than a preset threshold, the motion vector is judged an interference vector and deleted;
step S4.7: the remaining motion vectors are effective motion vectors, and the Euclidean distance of each is taken as the pixel distance of that point's motion.
Preferably, the step S5 employs:
step S5.1: drawing four equal squares on a piece of plain white paper, and recording the actual length and width (Wr, Hr) of one square;
step S5.2: laying the paper flat on a table top, shooting it vertically with the camera to obtain a calibration image, and recording the vertical distance d between the camera and the paper;
step S5.3: performing contour detection on the calibration image to obtain the pixel length and width of each square {(w0, h0), (w1, h1), (w2, h2), (w3, h3)}; the average pixel length and width (w, h) of one square in the pixel coordinate system is obtained by averaging the per-square values;
step S5.4: obtaining the camera focal length F according to the similar-triangle principle and the focal length calculation formula:
F = w·d / Wr, or F = h·d / Hr.
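The focal-length formula of step S5.4 is rendered as an image in the original; a minimal sketch under the similar-triangle reading F = w·d/Wr (or h·d/Hr) follows, where averaging the two estimates is an illustrative choice, not something the patent prescribes:

```python
def focal_length_px(w_px, h_px, d, Wr, Hr):
    """Pinhole similar triangles: F / d = w / Wr, so F = w*d/Wr
    (equivalently F = h*d/Hr). Returns the mean of the two estimates."""
    return 0.5 * (w_px * d / Wr + h_px * d / Hr)
```

For example, a square of real size 0.1 m that appears 100 px wide when shot from 2 m away gives a focal length of 2000 px.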
Preferably, the step S6 employs:
step S6.1: calculating the offset angle angle_y of any point P on the video relative to the field-of-view center in the y direction;
wherein F represents the calibrated focal length of the camera, and y represents the y coordinate of point P;
step S6.2: calculating the offset angle angle_x of point P relative to the field-of-view center in the x direction; the offset angle is signed: in the camera coordinate system, an offset toward the positive x direction is positive and toward the negative x direction is negative;
wherein θ_x represents the horizontal offset angle of the camera;
step S6.3: calculating the focal depth focal_length_y of point P in the y direction;
step S6.4: calculating the focal depth focal_length of point P;
step S6.5: calculating the offset angles angle_x_real and angle_y_real of point P relative to the camera in the x and y directions respectively;
angle_x_real = θ_x + angle_x
angle_y_real = θ_y + angle_y
wherein θ_y represents the vertical offset angle of the camera;
step S6.6: calculating the actual depth real_distance_y of point P in the y direction;
step S6.7: calculating the actual depth real_distance of point P;
step S6.8: finally, the scale at the position of point P is focal_length : real_distance.
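The formulas of steps S6.1-S6.8 are rendered as images in the original and do not survive in the text. The sketch below is a hedged geometric reconstruction under the stated pinhole setup (1920×1080 image, camera at height d with tilt angles θ_x, θ_y); every formula here is an assumption chosen to be consistent with the step names, not the patent's own equations:

```python
import math

def scale_at_point(x, y, F, d, theta_x, theta_y, W=1920, H=1080):
    """Hypothetical reconstruction of steps S6.1-S6.8.
    Returns (focal_length, real_distance); the scale at P is their ratio."""
    angle_y = math.atan((y - H / 2) / F)               # S6.1: y-offset from field center
    angle_x = math.atan((x - W / 2) / F)               # S6.2: x-offset from field center
    focal_length_y = F / math.cos(angle_y)             # S6.3: focal depth in y
    focal_length = focal_length_y / math.cos(angle_x)  # S6.4: focal depth of P
    angle_y_real = theta_y + angle_y                   # S6.5: add camera tilt
    angle_x_real = theta_x + angle_x
    real_distance_y = d / math.cos(angle_y_real)       # S6.6: actual depth in y
    real_distance = real_distance_y / math.cos(angle_x_real)  # S6.7: actual depth of P
    return focal_length, real_distance                 # S6.8: scale focal_length : real_distance
```

At the exact field-of-view center with no tilt, this degenerates to (F, d), i.e. the pixel-to-meter ratio of the calibration setup itself.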
Preferably, the step S7 employs: the scale of the center point of the motion vector is regarded as the whole vector scale to estimate the actual distance of the motion vector.
Preferably, the step S8 employs:
step S8.1: estimating the actual distance of each effective motion vector to obtain the actual motion distances D {d0, d1, …, dn} of the feature points, and averaging all of them to obtain the actual distance D of the water flow motion over this period;
step S8.2: calculating the actual motion time of the two frames of images, and dividing the actual distance by this time to obtain the final water flow velocity.
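Step S8 reduces to dividing the mean actual distance by the elapsed time T = (Fn - F0) / fps; a minimal sketch:

```python
def flow_speed(distances_m, f0, fn, fps):
    """Step S8: mean actual motion distance over elapsed time.
    distances_m: per-vector actual distances in meters;
    f0, fn: frame numbers of the two images; fps: video frame rate."""
    D = sum(distances_m) / len(distances_m)  # S8.1: average actual distance
    T = (fn - f0) / fps                      # S8.2: actual motion time
    return D / T                             # final water flow velocity (m/s)
```

For instance, distances [1.0, 3.0] m over 50 frames at 25 fps give 2.0 m over 2.0 s, i.e. 1.0 m/s.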
according to the present invention, there is provided a water flow velocity estimation system based on conventional optical flow calculation, comprising:
module M1: acquiring continuous video water flow images {Img0, Img1, …, Imgn};
module M2: preprocessing the acquired image to obtain a preprocessed image;
module M3: drawing a speed measuring line, wherein the direction of the speed measuring line is consistent with the direction of water flow;
module M4: calculating a motion vector optical flow field of the image based on the preprocessed image, and screening the motion vectors by the velocimetry-line direction to obtain effective motion vectors;
module M5: calibrating the camera to obtain a camera focal length, and keeping various parameters of the camera unchanged after calibration is completed;
module M6: measuring and calculating the scale of any point on the image according to the camera calibration result;
module M7: estimating the actual distance of the motion according to the motion vector coordinates tracked by the optical flow and the scale;
module M8: obtaining the actual distance of the water flow motion from the estimated distances, and calculating the final water flow velocity based on this distance and the motion time of the water flow.
Preferably, the module M2 employs: noise reduction and enhancement of the image using techniques including image graying, median filtering, and histogram equalization;
the module M3 employs:
defining a velocimetry line L(P0, P1) along the water flow direction, and calibrating the vector direction of the optical flow field with the velocimetry line; wherein P0 (x0, y0) represents the start-point coordinates of the velocimetry line and P1 (x1, y1) the end-point coordinates; the rotation angle of the velocimetry line is θ = arctan((y1 - y0) / (x1 - x0));
the module M4 employs:
module M4.1: performing corner detection on the previous frame image Img0 to obtain a corner set FP0 {p00, p01, …, p0n};
module M4.2: tracking the corner set FP0 detected on image Img0 point by point on the second frame image Img1 to obtain a tracking corner set FP1 {p10, p11, …, p1n};
module M4.3: performing sparse optical flow tracking in the reverse direction on image Img0 with the corner set FP1 tracked on image Img1, obtaining a verification corner set FP0r {pr0, pr1, …, prn};
module M4.4: calculating the Euclidean distances D {d0, d1, …, dn} between corresponding corners of the detection corner set FP0 and the verification corner set FP0r on image Img0; if a distance exceeds a preset threshold, the corner is judged an inferior tracking point and the corners at the corresponding positions in FP0 and FP1 are deleted, yielding the screened point pairs (FP0', FP1');
module M4.5: the screened point pairs (FP0', FP1') form the motion vector set {(p00', p10'), (p01', p11'), …, (p0n', p1n')}; the Euclidean distance of each point pair is the pixel distance of that point's motion;
module M4.6: calculating the rotation angle p_theta of each motion vector (p0n', p1n'); if the difference between p_theta and the velocimetry-line rotation angle θ is larger than a preset threshold, the motion vector is judged an interference vector and deleted;
module M4.7: the remaining motion vectors are effective motion vectors, and the Euclidean distance of each is taken as the pixel distance of that point's motion;
the module M5 employs:
module M5.1: drawing four equal squares on a piece of plain white paper, and recording the actual length and width (Wr, Hr) of one square;
module M5.2: laying the paper flat on a table top, shooting it vertically with the camera to obtain a calibration image, and recording the vertical distance d between the camera and the paper;
module M5.3: performing contour detection on the calibration image to obtain the pixel length and width of each square {(w0, h0), (w1, h1), (w2, h2), (w3, h3)}; the average pixel length and width (w, h) of one square in the pixel coordinate system is obtained by averaging the per-square values;
module M5.4: obtaining the camera focal length F according to the similar-triangle principle and the focal length calculation formula:
F = w·d / Wr, or F = h·d / Hr;
The module M6 employs:
module M6.1: calculating the offset angle angle_y of any point P on the video relative to the field-of-view center in the y direction;
wherein F represents the calibrated focal length of the camera, and y represents the y coordinate of point P;
module M6.2: calculating the offset angle angle_x of point P relative to the field-of-view center in the x direction; the offset angle is signed: in the camera coordinate system, an offset toward the positive x direction is positive and toward the negative x direction is negative;
wherein θ_x represents the horizontal offset angle of the camera;
module M6.3: calculating the focal depth focal_length_y of point P in the y direction;
module M6.4: calculating the focal depth focal_length of point P;
module M6.5: calculating the offset angles angle_x_real and angle_y_real of point P relative to the camera in the x and y directions respectively;
angle_x_real = θ_x + angle_x
angle_y_real = θ_y + angle_y
wherein θ_y represents the vertical offset angle of the camera;
module M6.6: calculating the actual depth real_distance_y of point P in the y direction;
module M6.7: calculating the actual depth real_distance of point P;
module M6.8: finally, the scale at the position of point P is focal_length : real_distance;
the module M7 employs: taking the scale of the motion vector center point as the whole vector scale to estimate the actual distance of the motion vector;
the module M8 employs:
module M8.1: estimating the actual distance of each effective motion vector to obtain the actual motion distances D {d0, d1, …, dn} of the feature points, and averaging all of them to obtain the actual distance D of the water flow motion over this period;
module M8.2: calculating the actual motion time of the two frames of images, T(0, n) = (F_n - F_0) / fps;
where fps represents the video frame rate, F_n denotes the frame number of the n-th frame, and F_0 denotes the frame number of the 0th frame.
Compared with the prior art, the invention has the following beneficial effects:
1. the invention provides a complete implementation flow for a water flow speed estimation algorithm and, through an improved LK optical flow tracking algorithm, achieves efficient tracking of water flow feature points and standardized rejection of inferior feature points;
2. for abnormal weather conditions such as rainstorms and strong wind, the invention constructively sets manually demarcated velocimetry lines to accurately remove motion vectors in abnormal directions, addressing in a targeted manner the influence of external factors such as surface waves, ripples and rapid-flow waves on water flow estimation;
3. at the application level, to reduce calibration cost, the invention improves the camera calibration flow based on the similar-triangle principle and the camera focal length calculation formula; the improved flow places higher accuracy requirements on the input parameters but has a lower calibration cost.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
fig. 1 is a flow chart of a water flow velocity estimation method based on conventional optical flow calculation.
Fig. 2 is a schematic diagram of scale measurement.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
Example 1
In order to reduce the influence of external factors on a water flow velocity measurement technology based on a video analysis method and improve the accuracy of a water flow velocity estimation detection method, a water flow velocity estimation method and a water flow velocity estimation system based on traditional optical flow calculation are provided.
The invention realizes the estimation of the water flow speed based on a sparse optical flow tracking algorithm, a camera calibration algorithm, an actual distance measuring and calculating algorithm and the like, and the water flow speed estimation method based on the traditional optical flow calculation is shown in fig. 1 and comprises the following steps:
an image input step: taking any two adjacent frames of water flow images Img0 and Img1 from the video, and recording the corresponding frame numbers F0 and F1 and the video frame rate fps;
an image preprocessing step: carrying out noise reduction and enhancement treatment on the image by using image graying, median filtering and histogram equalization;
a step of demarcating a speed measurement line: manually drawing a velocimetry line, wherein the velocimetry line direction is required to be consistent with the water flow direction.
An LK sparse optical flow tracking step: calculating the motion vector optical flow field of the two frames with a sparse optical flow tracking algorithm, and screening the motion vectors by the velocimetry-line direction to obtain effective motion vectors.
The camera calibration step: and calibrating the camera to obtain a camera focal length, and keeping various parameters of the camera unchanged after calibration is completed.
Measuring and calculating a proportion scale: and measuring and calculating the scale (pixel distance: actual distance) of any point on the image according to the camera calibration result.
The actual distance measuring and calculating step: and estimating the actual distance of the motion according to the motion vector coordinates tracked by the optical flow and the scale measuring and calculating method.
A water flow speed estimation step: and calculating according to the actual distance to obtain the actual distance of the water flow movement, and dividing the actual distance by the movement time to obtain the final water flow velocity in the period.
The image input step includes:
the optical flow tracking algorithm calculates the moving speed and direction of each object on the image between successive frames, so at least two successive or adjacent water flow images must be prepared as input {Img0, Img1, …, Imgn}, recording the frame number {F0, F1, …, Fn} of each input image and the video frame rate fps; the actual motion time between two frames is T(0, n) = (Fn - F0) / fps.
The step of demarcating the velocimetry line comprises: demarcating a velocimetry line L(P0, P1) along the water flow direction to calibrate the vector direction of the optical flow field, so as to reduce the influence of external factors such as wind and ripples on the speed measurement. P0 (x0, y0) represents the start-point coordinates of the velocimetry line, and P1 (x1, y1) the end-point coordinates. The rotation angle of the velocimetry line is θ = arctan((y1 - y0) / (x1 - x0)).
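The rotation angle of the velocimetry line (rendered as an image in the original) can be sketched in atan2 form, which preserves the flow direction rather than only the slope:

```python
import math

def line_angle(p0, p1):
    """Rotation angle of velocimetry line L(P0, P1) in radians,
    measured from the positive x axis; atan2 keeps the quadrant."""
    return math.atan2(p1[1] - p0[1], p1[0] - p0[0])
```

A line from (0, 0) to (1, 1) has angle π/4; reversing the endpoints gives -3π/4, so opposing flow directions are distinguished.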
the Lk sparse optical flow tracking step comprises the following steps:
in video moving object tracking, the sparse optical flow tracking algorithm is a classical moving object tracking algorithm, and can draw a tracking track and a moving direction of a moving object. The invention uses an improved sparse optical flow tracking algorithm to complete the tracking of water flow characteristics, and the specific implementation steps are as follows:
step 4.1: taking the associated front and rear two frames of water flow images in the input image, and marking the water flow images as Img0 and Img1;
step 4.2: carrying out preprocessing such as graying processing and necessary image noise reduction and enhancement on the image; specifically, image noise reduction is carried out by adopting median filtering, and image enhancement is carried out by adopting histogram equalization;
step 4.3: performing corner detection on the previous frame image Img0 with the Shi-Tomasi corner detection algorithm to obtain the corner set FP0 {p00, p01, …, p0n};
step 4.4: tracking the corner set FP0 detected on Img0 point by point on the second frame image Img1 with the sparse optical flow tracking algorithm to obtain the tracking corner set FP1 {p10, p11, …, p1n};
step 4.5: to verify the accuracy of corner tracking, performing sparse optical flow tracking in the reverse direction on Img0 with the corner set FP1 tracked on Img1, obtaining the verification corner set FP0r {pr0, pr1, …, prn};
step 4.6: calculating the Euclidean distances D {d0, d1, …, dn} between corresponding corners of the detection corner set FP0 and the verification corner set FP0r on Img0, and judging whether each distance exceeds a specified threshold T, which may be set freely; if so, the corner is judged an inferior tracking point, and the corners at the corresponding positions in FP0 and FP1 are deleted.
Step 4.7: the point pairs (FP0', FP1') screened in step 4.6 constitute the motion vector set of the two frames {(p00', p10'), (p01', p11'), …, (p0n', p1n')}; the Euclidean distance of each point pair is the pixel distance of that point's motion;
step 4.8: calculating the rotation angle p_theta of each motion vector (p0n', p1n'); if the difference between p_theta and the velocimetry-line rotation angle θ exceeds a specified threshold (which may also be set freely), the direction of the motion vector differs greatly from that of the velocimetry line, so the vector is judged an interference vector and eliminated;
step 4.9: the motion vectors remaining after step 4.8 are effective motion vectors, and the Euclidean distance of each can be taken as the pixel distance of that point's motion.
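The direction screening of steps 4.8-4.9 can be sketched as follows; the 30-degree threshold is illustrative, since the patent leaves it user-specified:

```python
import math

def filter_by_direction(pairs, theta, max_diff=math.radians(30)):
    """Keep motion vectors whose angle is within max_diff of the
    velocimetry-line angle theta (steps 4.8-4.9)."""
    valid = []
    for (x0, y0), (x1, y1) in pairs:
        p_theta = math.atan2(y1 - y0, x1 - x0)
        # wrap the angular difference into [-pi, pi] before comparing
        diff = (p_theta - theta + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= max_diff:
            valid.append(((x0, y0), (x1, y1)))
    return valid
```

With theta = 0 (flow along +x), a vector pointing along +x is kept while one pointing along +y is rejected as interference.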
The camera calibration step comprises the following steps:
the motion tracked by lk sparse optical flow is the motion of the pixel layer, and the length of the optical flow vector only represents the pixel distance of the motion and does not represent the real motion distance. Therefore, the pixel coordinates need to be mapped to the real world coordinates to obtain the actual motion distance. The invention provides an improved camera calibration algorithm based on a well-known similar triangle principle and a focal length calculation formula of a camera to realize the mapping from pixel coordinates to real world coordinates, and the specific realization steps are as follows:
Step 5.1: four equal square grids are drawn on plain white paper (a calibration board can be used directly here), and the actual length and width (Wr, Hr) of one grid are recorded.
Step 5.2: the paper is laid flat on a table top, and the camera shoots the paper vertically to obtain a calibration image; the vertical distance d between the camera and the paper is recorded.
Step 5.3: contour detection is performed on the calibration image to obtain the pixel length and width of each grid { (w0, h0), (w1, h1), (w2, h2), (w3, h3) }. The average length and width (w, h) of a single grid in the pixel coordinate system are obtained by averaging the pixel length and width of each grid.
Step 5.4: obtaining a camera focal length F according to a similar triangle principle and a focal length calculation formula;
or->
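Under the similar-triangle reading of step 5.4, the calibration can be sketched as below. The contour detection that yields the per-grid pixel sizes is assumed to have been done already, and averaging the two per-axis focal length estimates is an illustrative choice, not something stated in the patent.

```python
def calibrate_focal_length(pixel_sizes, Wr, Hr, d):
    """Sketch of steps 5.1-5.4. pixel_sizes: [(w0, h0), ...], the pixel
    length/width of each detected grid; (Wr, Hr): actual grid size;
    d: vertical camera-to-paper distance (same unit as Wr, Hr)."""
    # Step 5.3: average pixel length and width of a single grid.
    w = sum(p[0] for p in pixel_sizes) / len(pixel_sizes)
    h = sum(p[1] for p in pixel_sizes) / len(pixel_sizes)
    # Step 5.4: similar triangles, F = w*d/Wr or F = h*d/Hr.
    f_w = w * d / Wr
    f_h = h * d / Hr
    return (f_w + f_h) / 2  # average the two estimates (illustrative)
```

For example, 100-pixel squares of 5 cm side length shot from 1 m away give a focal length of 2000 pixels.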
The step of measuring and calculating the scale comprises the following steps:
As shown in fig. 2, the camera coordinate system takes as its origin the point on the ground directly below the camera, and the pixel coordinate system takes the upper left corner of the picture as its origin. Wherein:
d: vertical distance of camera from ground
θ_x: horizontal offset angle of camera
θ_y: vertical offset angle of camera
O: center of field of view of camera shooting
Ox: horizontal component of O-point
Oy: vertical component of O-point
P: any point on video
The scale (pixel distance : actual distance) at the position of any point P(x, y) on the image is calculated as follows, assuming a uniform image size of 1920×1080:
Step 6.1: calculating the offset angle angle_y of the P point relative to the center of the field of view in the y direction; the offset angle is signed: in the camera coordinate system, offsets toward the positive y direction are positive and offsets toward the negative y direction are negative.
where F represents the focal length obtained from camera calibration
Step 6.2: calculating the offset angle angle_x of the P point relative to the center of the field of view in the x direction; the offset angle is signed: in the camera coordinate system, if the camera is offset toward the positive x direction, the positive x direction is positive and the reverse is negative; if the camera is offset toward the negative x direction, the opposite holds;
step 6.3: calculating focal depth focal_length_y of the P point in the y direction:
step 6.4: calculating focal depth focal_length of the point P:
step 6.5: the offset angles angle_x_real and angle_y_real of the P point relative to the camera in the x and y directions are calculated respectively:
angle_x_real=θ_x+angle_x
angle_y_real=θ_y+angle_y
Step 6.6: calculating the actual depth real_distance_y of the P point in the y direction:
step 6.7: further calculating to obtain the actual depth real_distance of the P point as follows:
step 6.8: the finally obtained P-point position has a scale of focal_length: real_distance.
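The patent's formula images for steps 6.1 to 6.8 are not reproduced in this text, so the sketch below fills them in with standard pinhole geometry. The specific trigonometric forms (atan2 offsets from the principal point, cosine corrections for the off-axis depth) are assumptions consistent with the quantities named above, not the patent's exact formulas.

```python
import math

def point_scale(x, y, F, theta_x, theta_y, d, img_w=1920, img_h=1080):
    """Sketch of steps 6.1-6.8 under assumed pinhole geometry.
    F: calibrated focal length in pixels; theta_x, theta_y: horizontal and
    vertical camera offset angles (radians); d: camera height above ground.
    Returns (focal_length, real_distance), read as pixel : actual scale."""
    ox, oy = img_w / 2, img_h / 2                       # view center O
    angle_y = math.atan2(y - oy, F)                     # 6.1 offset in y
    angle_x = math.atan2(x - ox, F)                     # 6.2 offset in x
    focal_length_y = F / math.cos(angle_y)              # 6.3 focal depth in y
    focal_length = focal_length_y / math.cos(angle_x)   # 6.4 focal depth of P
    angle_x_real = theta_x + angle_x                    # 6.5 angles vs. camera
    angle_y_real = theta_y + angle_y
    real_distance_y = d / math.cos(angle_y_real)        # 6.6 actual depth in y
    real_distance = real_distance_y / math.cos(angle_x_real)  # 6.7
    return focal_length, real_distance                  # 6.8 the scale pair
```

At the image center with an unrotated camera, the scale reduces to F pixels per d units of distance, which matches the similar-triangle calibration setup.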
The actual distance estimation step includes:
Knowing that the scale (pixel distance : actual distance) differs at each position point in the image, the actual distance of a corner point's displacement from P1(x1, y1) to P2(x2, y2) is now calculated. Since the pixel distance moved by a corner point between two frames is very short, it is reasonable to assume that the motion from P1 to P2 is linear motion; the line formula of L(P1-P2) is then:
then the integral formula for the actual displacement distance of P1-P2 is calculated as:
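The integral itself is not reproduced in this text; numerically, it can be approximated by sampling the per-position scale along the segment. In the sketch below, ratio_fn is an illustrative stand-in (not a name from the patent) for the actual-distance-per-pixel ratio derived from the scale focal_length : real_distance at each position.

```python
import math

def segment_actual_distance(p1, p2, ratio_fn, steps=100):
    """Midpoint-rule approximation of the actual distance of a corner
    displacement P1 -> P2, under the linear-motion assumption.
    ratio_fn(x, y): actual distance per pixel at image position (x, y)."""
    (x1, y1), (x2, y2) = p1, p2
    pixel_len = math.hypot(x2 - x1, y2 - y1)  # pixel length of the segment
    ds = pixel_len / steps                    # pixel length of one sample
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) / steps                 # midpoint of the i-th piece
        x = x1 + t * (x2 - x1)
        y = y1 + t * (y2 - y1)
        total += ratio_fn(x, y) * ds          # local scale times pixel step
    return total
```

With a constant ratio the result collapses to ratio times pixel length, which is the sanity check for the quadrature.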
the water flow speed estimating step includes:
Each effective motion vector is estimated by the actual distance estimation method to obtain the actual motion distances D { d0, d1, ..., dn } of the feature points, and all the actual motion distances are averaged to obtain the actual distance D of the water flow motion in this period. Knowing the actual motion time of the two frames of images as T(0, n), the water flow velocity in this period is finally estimated as speed = D / T(0, n).
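The final estimate then reduces to a few lines: the frame numbers and the frame rate give the elapsed time, and the averaged actual distance divided by that time gives the speed. The function name below is illustrative.

```python
def estimate_speed(actual_distances, f0, fn, fps):
    """Average the per-vector actual motion distances and divide by the
    elapsed time T(0, n) = (Fn - F0) / fps between the two frames.
    actual_distances: actual motion distance of each effective vector;
    f0, fn: frame numbers of the two frames; fps: video frame rate."""
    D = sum(actual_distances) / len(actual_distances)  # mean actual distance
    T = (fn - f0) / fps                                # elapsed time, seconds
    return D / T                                       # speed = D / T(0, n)
```

For example, distances averaging 0.2 m over a 25-frame gap at 25 fps yield 0.2 m/s.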
The present invention also provides a water flow velocity estimation system based on traditional optical flow calculation, which can be implemented by executing the flow steps of the water flow velocity estimation method based on traditional optical flow calculation; that is, a person skilled in the art may understand the water flow velocity estimation method as a preferred embodiment of the water flow velocity estimation system.
Those skilled in the art will appreciate that the invention provides a system and its individual devices, modules, units, etc. that can be implemented entirely by logic programming of method steps, in addition to being implemented as pure computer readable program code, in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc. Therefore, the system and various devices, modules and units thereof provided by the invention can be regarded as a hardware component, and the devices, modules and units for realizing various functions included in the system can also be regarded as structures in the hardware component; means, modules, and units for implementing the various functions may also be considered as either software modules for implementing the methods or structures within hardware components.
The foregoing describes specific embodiments of the present invention. It is to be understood that the invention is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the invention. The embodiments of the present application and features in the embodiments may be combined with each other arbitrarily without conflict.

Claims (10)

1. A water flow velocity estimation method based on traditional optical flow calculation, comprising:
step S1: acquiring continuous water flow video images { Img0, Img1, ..., Imgn };
step S2: preprocessing the acquired image to obtain a preprocessed image;
step S3: drawing a speed measuring line, wherein the direction of the speed measuring line is consistent with the direction of water flow;
step S4: calculating a motion vector optical flow field of the image based on the preprocessed image, and screening the motion vector according to the direction of the tachometer line to obtain an effective motion vector;
step S5: calibrating the camera to obtain a camera focal length, and keeping various parameters of the camera unchanged after calibration is completed;
step S6: measuring and calculating the scale of any point on the image according to the camera calibration result;
step S7: estimating the actual distance of the motion according to the motion vector coordinates tracked by the optical flow and the scale;
step S8: and calculating the actual distance of the water flow movement according to the actual distance, and calculating the final water flow velocity based on the actual distance and the movement time of the water flow movement.
2. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S2 employs: the image is noise reduced and enhanced using techniques including image graying, median filtering, and histogram equalization.
3. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S3 employs:
defining a velocimetry line L (P0, P1) along the water flow direction, and calibrating the vector direction of the optical flow field by using the velocimetry line; wherein P0 represents the starting point coordinate of the speed measurement line, and P1 represents the ending point coordinate; the rotation angle θ of the tachometer line is:
4. the method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S4 employs:
step S4.1: performing corner detection on the previous frame image Img0 to obtain a corner point set FP0 { p00, p01, ..., p0n };
step S4.2: tracking the corner point set FP0 detected in the image Img0 one by one on the second frame image Img1 to obtain a tracking corner point set FP1 { p10, p11, ..., p1n };
step S4.3: performing sparse optical flow tracking in the reverse direction on the image Img0 for the corner point set FP1 tracked in the image Img1, obtaining a verification corner point set FP0r { pr0, pr1, ..., prn };
step S4.4: calculating the Euclidean distances D { d0, d1, ..., dn } between corresponding corner points of the detection corner set FP0 and the verification corner set FP0r on the image Img0, and judging whether each distance is larger than a preset threshold; if so, the corner point is judged as an inferior tracking point, and the corner points at the corresponding positions in FP0 and FP1 are deleted, obtaining the screened (FP0', FP1') point pairs;
step S4.5: constructing a motion vector set { (p00', p10'), (p01', p11'), ..., (p0n', p1n') } based on the screened (FP0', FP1') point pairs; the Euclidean distance of each point pair is the pixel distance of that point's motion;
step S4.6: calculating the rotation angle p_theta of each motion vector (p0n', p1n'); if the difference between p_theta and the rotation angle theta of the velocimetry line is larger than a preset threshold, the motion vector is judged as an interference vector and is deleted;
step S4.7: the motion vectors remaining after screening are effective motion vectors, and the Euclidean distance of each motion vector is estimated as the pixel distance of that point's motion.
5. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S5 employs:
step S5.1: drawing four equal square grids on plain white paper, and recording the actual length and width (Wr, Hr) of one grid;
step S5.2: laying the paper flat on a table top, shooting the paper vertically with the camera to obtain a calibration image, and recording the vertical distance d between the camera and the paper;
step S5.3: performing contour detection on the calibration image to obtain the pixel length and width of each grid { (w0, h0), (w1, h1), (w2, h2), (w3, h3) }; the average length and width (w, h) of a single grid in the pixel coordinate system are obtained by averaging the pixel length and width of each grid;
step S5.4: obtaining the camera focal length F according to the similar-triangle principle and the focal length calculation formula: F = (w × d) / Wr, or F = (h × d) / Hr.
6. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S6 employs:
step S6.1: calculating an offset angle_y of any P point on the video in the y direction relative to the center of the visual field;
wherein F represents the focal length calibrated by the camera, and y represents the y coordinate of the P point;
step S6.2: calculating an offset angle_x of the P point relative to the center of the visual field in the x direction; the offset angle is positive and negative, and under the coordinate system of the camera, if the camera is offset towards the positive x direction, the positive x direction is positive and the negative x direction is negative;
wherein θ—x represents the horizontal offset angle of the camera;
step S6.3: calculating focal depth focal_length_y of the P point in the y direction;
step S6.4: calculating focal depth focal_length of the point P;
step S6.5: respectively calculating offset angles angle_x_real and angle_y_real of the P point relative to the camera in the x and y directions;
angle_x_real=θ_x+angle_x
angle_y_real=θ_y+angle_y
wherein θ_y represents the camera vertical offset angle;
step S6.6: calculating the actual depth real_distance_y of the P point in the y direction;
step S6.7: calculating the actual depth real_distance of the P point;
step S6.8: finally, the obtained P point position has the scale of focal_length: real_distance.
7. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S7 employs: the scale of the center point of the motion vector is regarded as the whole vector scale to estimate the actual distance of the motion vector.
8. The method for estimating a water flow velocity based on the conventional optical flow calculation according to claim 1, wherein the step S8 employs:
step S8.1: carrying out actual distance estimation on each effective motion vector to obtain the actual motion distances D { d0, d1, ..., dn } of the feature points, and averaging all the actual motion distances to obtain the actual distance D of the water flow motion in this period;
step S8.2: calculating the actual motion time of the two frames of images as T(0, n) = (Fn − F0) / fps;
where fps represents the video frame rate, Fn represents the frame number of the n-th frame, and F0 represents the frame number of the 0-th frame.
9. A water flow velocity estimation system based on conventional optical flow calculation, comprising:
module M1: acquiring continuous water flow video images { Img0, Img1, ..., Imgn };
module M2: preprocessing the acquired image to obtain a preprocessed image;
module M3: drawing a speed measuring line, wherein the direction of the speed measuring line is consistent with the direction of water flow;
module M4: calculating a motion vector optical flow field of the image based on the preprocessed image, and screening the motion vector according to the direction of the tachometer line to obtain an effective motion vector;
module M5: calibrating the camera to obtain a camera focal length, and keeping various parameters of the camera unchanged after calibration is completed;
module M6: measuring and calculating the scale of any point on the image according to the camera calibration result;
module M7: estimating the actual distance of the motion according to the motion vector coordinates tracked by the optical flow and the scale;
module M8: and calculating the actual distance of the water flow movement according to the actual distance, and calculating the final water flow velocity based on the actual distance and the movement time of the water flow movement.
10. The system for estimating the velocity of water flow based on conventional optical flow calculation according to claim 9, wherein said module M2 employs: noise reduction and enhancement processing are carried out on the image using techniques including image graying, median filtering and histogram equalization;
the module M3 employs:
defining a velocimetry line L (P0, P1) along the water flow direction, and calibrating the vector direction of the optical flow field by using the velocimetry line; wherein P0 represents the starting point coordinate of the speed measurement line, and P1 represents the ending point coordinate; the rotation angle θ of the tachometer line is:
the module M4 employs:
module M4.1: performing corner detection on the previous frame image Img0 to obtain a corner point set FP0 { p00, p01, ..., p0n };
module M4.2: tracking the corner point set FP0 detected in the image Img0 one by one on the second frame image Img1 to obtain a tracking corner point set FP1 { p10, p11, ..., p1n };
module M4.3: performing sparse optical flow tracking in the reverse direction on the image Img0 for the corner point set FP1 tracked in the image Img1, obtaining a verification corner point set FP0r { pr0, pr1, ..., prn };
module M4.4: calculating the Euclidean distances D { d0, d1, ..., dn } between corresponding corner points of the detection corner set FP0 and the verification corner set FP0r on the image Img0, and judging whether each distance is larger than a preset threshold; if so, the corner point is judged as an inferior tracking point, and the corner points at the corresponding positions in FP0 and FP1 are deleted, obtaining the screened (FP0', FP1') point pairs;
module M4.5: constructing a motion vector set { (p00', p10'), (p01', p11'), ..., (p0n', p1n') } based on the screened (FP0', FP1') point pairs; the Euclidean distance of each point pair is the pixel distance of that point's motion;
module M4.6: calculating the rotation angle p_theta of each motion vector (p0n', p1n'); if the difference between p_theta and the rotation angle theta of the velocimetry line is larger than a preset threshold, the motion vector is judged as an interference vector and is deleted;
module M4.7: the motion vectors remaining after screening are effective motion vectors, and the Euclidean distance of each motion vector is estimated as the pixel distance of that point's motion;
the module M5 employs:
module M5.1: drawing four equal square grids on plain white paper, and recording the actual length and width (Wr, Hr) of one grid;
module M5.2: laying the paper flat on a table top, shooting the paper vertically with the camera to obtain a calibration image, and recording the vertical distance d between the camera and the paper;
module M5.3: performing contour detection on the calibration image to obtain the pixel length and width of each grid { (w0, h0), (w1, h1), (w2, h2), (w3, h3) }; the average length and width (w, h) of a single grid in the pixel coordinate system are obtained by averaging the pixel length and width of each grid;
module M5.4: obtaining the camera focal length F according to the similar-triangle principle and the focal length calculation formula: F = (w × d) / Wr, or F = (h × d) / Hr;
The module M6 employs:
module M6.1: calculating an offset angle_y of any P point on the video in the y direction relative to the center of the visual field;
wherein F represents the focal length calibrated by the camera, and y represents the y coordinate of the P point;
module M6.2: calculating an offset angle_x of the P point relative to the center of the visual field in the x direction; the offset angle is positive and negative, and under the coordinate system of the camera, if the camera is offset towards the positive x direction, the positive x direction is positive and the negative x direction is negative;
wherein θ—x represents the horizontal offset angle of the camera;
module M6.3: calculating focal depth focal_length_y of the P point in the y direction;
module M6.4: calculating focal depth focal_length of the point P;
module M6.5: respectively calculating offset angles angle_x_real and angle_y_real of the P point relative to the camera in the x and y directions;
angle_x_real=θ_x+angle_x
angle_y_real=θ_y+angle_y
wherein θ_y represents the camera vertical offset angle;
module M6.6: calculating the actual depth real_distance_y of the P point in the y direction;
module M6.7: calculating the actual depth real_distance of the P point;
module M6.8: finally, the obtained P point position has the scale of focal_length : real_distance;
the module M7 employs: taking the scale of the motion vector center point as the whole vector scale to estimate the actual distance of the motion vector;
the module M8 employs:
module M8.1: carrying out actual distance estimation on each effective motion vector to obtain the actual motion distances D { d0, d1, ..., dn } of the feature points, and averaging all the actual motion distances to obtain the actual distance D of the water flow motion in this period;
module M8.2: calculating the actual motion time of the two frames of images as T(0, n) = (Fn − F0) / fps;
where fps represents the video frame rate, Fn represents the frame number of the n-th frame, and F0 represents the frame number of the 0-th frame.
CN202311525100.1A 2023-11-15 2023-11-15 Water flow speed estimation method and system based on traditional optical flow calculation Pending CN117576168A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311525100.1A CN117576168A (en) 2023-11-15 2023-11-15 Water flow speed estimation method and system based on traditional optical flow calculation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311525100.1A CN117576168A (en) 2023-11-15 2023-11-15 Water flow speed estimation method and system based on traditional optical flow calculation

Publications (1)

Publication Number Publication Date
CN117576168A true CN117576168A (en) 2024-02-20

Family

ID=89887426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311525100.1A Pending CN117576168A (en) 2023-11-15 2023-11-15 Water flow speed estimation method and system based on traditional optical flow calculation

Country Status (1)

Country Link
CN (1) CN117576168A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118037780A (en) * 2024-04-10 2024-05-14 武汉大水云科技有限公司 River surface flow measuring method and measuring device based on video scanning


Similar Documents

Publication Publication Date Title
CN109166077B (en) Image alignment method and device, readable storage medium and computer equipment
CN101236656B (en) Movement target detection method based on block-dividing image
CN103325112B (en) Moving target method for quick in dynamic scene
KR101996992B1 (en) Apparatus and Method for Measuring Flow Velocity of River using Optical Flow Image Processing
CN107133969B (en) A kind of mobile platform moving target detecting method based on background back projection
CN103729858B (en) A kind of video monitoring system is left over the detection method of article
CN107146240A (en) The video target tracking method of taking photo by plane detected based on correlation filtering and conspicuousness
CN113052876B (en) Video relay tracking method and system based on deep learning
CN117576168A (en) Water flow speed estimation method and system based on traditional optical flow calculation
CN107248174A (en) A kind of method for tracking target based on TLD algorithms
CN108804992B (en) Crowd counting method based on deep learning
CN109872483A (en) A kind of invasion warning photoelectric monitoring system and method
CN104200492B (en) Video object automatic detection tracking of taking photo by plane based on profile constraints
CN111160210A (en) Video-based water flow velocity detection method and system
CN111879292B (en) Coastline dynamic monitoring method, coastline dynamic monitoring equipment and storage medium
Sakaino Camera-vision-based water level estimation
CN104079800A (en) Shaking preventing method for video image in video surveillance
CN103778436A (en) Pedestrian gesture inspecting method based on image processing
CN106960445A (en) A kind of cloud motion vector calculating method based on pyramid light stream
WO2023236886A1 (en) Cloud occlusion prediction method based on dense optical flow method
CN111914695A (en) Tidal bore monitoring method based on machine vision
CN115761563A (en) River surface flow velocity calculation method and system based on optical flow measurement and calculation
CN102737384A (en) Automatic spherical camera tracing method
JP4108593B2 (en) Inundation monitoring device
CN115880643B (en) Social distance monitoring method and device based on target detection algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination