CN116934808A - River surface flow velocity measurement method based on water surface floater target tracking - Google Patents


Info

Publication number
CN116934808A
CN116934808A
Authority
CN
China
Prior art keywords
target
image
track
flow velocity
pipeline
Prior art date
Legal status
Pending
Application number
CN202310923396.6A
Other languages
Chinese (zh)
Inventor
木正鹏
张振
沈洁
刘博远
张子凯
Current Assignee
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202310923396.6A
Publication of CN116934808A
Legal status: Pending

Classifications

    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G06T2207/30181 Earth observation
    • G06T2207/30241 Trajectory
    • Y02A90/30 Assessment of water resources


Abstract

The application discloses a river surface flow velocity measurement method based on target tracking of water surface floaters, which comprises the following steps: performing distortion correction on an original video image sequence and setting a region of interest (ROI); fusing a pixel-based adaptive segmentation algorithm with a foreground counting mechanism and the mean-subtraction image method to carry out multi-target segmentation on the corrected images; screening candidate targets to remove false targets; reconstructing the floater motion tracks with a multi-feature fusion target matching method and storing the image coordinates of each floater target centroid on every track; converting the image coordinates on each track into world coordinates by direct linear transformation and calculating flow velocity values from the world coordinates of adjacent points on each track; and gridding all flow velocity values by inverse distance weighted interpolation to obtain a uniform two-dimensional flow velocity vector field. The method is suitable for river flow velocity measurement in multi-floater scenes and improves the accuracy of the flow measurement algorithm in such scenes.

Description

River surface flow velocity measurement method based on water surface floater target tracking
Technical Field
The application belongs to the technical field of intelligent water conservancy video monitoring, relates to a method for measuring river surface flow velocity, and particularly relates to a river surface flow velocity measuring method based on target tracking of water surface floaters.
Background
In recent years, with the rapid development of video monitoring and computer vision technologies, image-based flow measurement has attracted wide attention and application in hydrologic monitoring thanks to its non-contact operation, low cost and high safety, and a series of techniques have been derived, including large-scale particle image velocimetry (LSPIV), particle tracking velocimetry (PTV), space-time image velocimetry (STIV) and optical tracking velocimetry (OTV). Large-scale particle image velocimetry divides the region to be measured into a number of interrogation windows and performs correlation analysis between corresponding windows of successive frames; the displacement of the two windows with the largest correlation coefficient corresponds to the mean velocity of that region. It requires floaters or a stable water surface pattern, and yields a two-dimensional surface velocity field. Particle tracking velocimetry identifies and tracks tracer particles on the river surface and reconstructs their motion tracks to obtain a two-dimensional velocity field; it requires manually scattered or naturally floating tracers, and the measured velocities have a higher spatial resolution. Space-time image velocimetry exploits the continuity of tracer motion: it uses velocimetry lines parallel to the main flow direction as analysis areas, detects texture direction features related to tracer motion in space-time images formed from image space and sequence time, and directly estimates one-dimensional time-averaged motion vectors along the specified direction; it only needs stable natural water surface patterns such as ripples and vortices, and has the advantages of high spatial resolution and strong real-time performance. Optical tracking velocimetry detects corner features of water surface targets and reconstructs the motion tracks of the feature points with a tracking algorithm, obtaining a sparse velocity field; it requires floaters or manually scattered tracers on the water surface to provide stable feature points, and can yield a two-dimensional velocity field of higher spatial resolution.
When the river surface flow velocity exceeds 0.5 m/s, the motion of the water body is well represented by natural water surface patterns, floaters or tracer particles, so image-based flow measurement obtains good results. However, in scenes with low flow velocity and wind-driven waves on the river surface, natural water surface patterns are easily disturbed by environmental noise such as water-wave fluctuation caused by wind, and can no longer accurately represent the motion of the water body. In contrast, the motion of the water body represented by rigid floaters conforms to the measurement principle of the float method in hydrometric testing, and is better suited to velocity measurement under such conditions. However, large-scale particle image velocimetry and space-time image velocimetry cannot identify, in their measurement results, the effective flow velocity vectors estimated from the motion of rigid floater targets; the feature points tracked by optical tracking velocimetry cannot filter out the influence of water-wave fluctuation; and existing particle tracking velocimetry focuses on problems such as particle density and particle occlusion under manually scattered tracers. Its particle detection methods mainly address artificial tracers with clear targets and clean backgrounds, and are hard to apply to floater detection in natural scenes; its particle matching methods mainly address matching under high particle density and occlusion, are relatively complex and computationally heavy, and are unnecessary under the natural-scene assumption that floater density is generally sparse and occlusion is absent.
Disclosure of Invention
The application aims to: aiming at the river surface flow velocity measurement problem, provide a river surface flow velocity measurement method based on water surface floater target tracking, which estimates the surface flow velocity from the motion tracks of floater targets in natural scenes and is suitable for river surface flow velocity measurement in scenes of low flow velocity, wind and waves.
The technical scheme is as follows: in order to achieve the above purpose, the application provides a river surface flow velocity measurement method based on object tracking of a water surface floater, which comprises the following steps:
s1: performing distortion correction on an original video image sequence, and setting a region of interest (ROI);
s2: a pixel-level self-adaptive segmentation algorithm (PBAS) introducing a foreground counting mechanism and a mean-subtraction image method are fused to carry out multi-target segmentation on the image after distortion correction;
s3: aiming at the segmented targets, a self-adaptive pipeline filtering method is used, candidate target screening is carried out by utilizing the motion characteristics of the targets between adjacent frames, and false targets are removed;
s4: reconstructing a floater movement track by utilizing a multi-feature fusion target matching method, filtering according to the length and the angle of the track, and storing the image coordinate information of the center of mass of a floater target on each track;
s5: converting the image coordinates on each track into world coordinates by using Direct Linear Transformation (DLT), and calculating a flow velocity value according to the world coordinates of adjacent points on each track;
s6: and (3) carrying out gridding treatment on all the flow velocity values by adopting an inverse distance weight interpolation method to obtain a uniform two-dimensional flow velocity vector field.
Further, in step S1, the camera is calibrated for intrinsic parameters in the laboratory according to Zhang Zhengyou's calibration method, yielding the pixel size s, the intrinsic matrix K and the distortion parameter matrix D:

K = [[f_x, 0, C_x], [0, f_y, C_y], [0, 0, 1]] (17)

D = [k_1 k_2 p_1 p_2] (18)

where C_x and C_y are the abscissa and ordinate of the image principal point, and f_x and f_y are the equivalent focal lengths of the camera on the x-axis and y-axis of the image plane; the focal length f of the camera is calculated from the pixel size s of the image sensor:

f = (f_x + f_y)·s/2 (19)

k_1 and k_2 are the first and second radial distortion coefficients, and p_1 and p_2 are the first and second tangential distortion coefficients. The image is distortion-corrected with the standard radial-tangential distortion model:

x' = x·(1 + k_1·r² + k_2·r⁴) + 2·p_1·x·y + p_2·(r² + 2x²)
y' = y·(1 + k_1·r² + k_2·r⁴) + p_1·(r² + 2y²) + 2·p_2·x·y, with r² = x² + y² (20)

Camera coordinates (x, y) of the undistorted image are converted into image coordinates (u, v) by:

u = f_x·x + C_x, v = f_y·y + C_y (21)

and the camera coordinates (x', y') of the distorted image are converted into image coordinates (u', v') in the same way:

u' = f_x·x' + C_x, v' = f_y·y' + C_y (22)

After the distortion correction of the image is completed, the ROI is set for subsequent measurement.
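The distortion model and the camera-to-image coordinate conversion of step S1 can be sketched as follows; this is a minimal illustration, and the intrinsic parameter values used in the example are hypothetical, not calibration results from the application:

```python
def distort(x, y, k1, k2, p1, p2):
    """Apply the radial-tangential (Brown) distortion model to
    normalized camera coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

def to_image(x, y, fx, fy, cx, cy):
    """Convert camera coordinates to image coordinates: u = fx*x + Cx, v = fy*y + Cy."""
    return fx * x + cx, fy * y + cy

# Hypothetical intrinsics: fx = fy = 1200 px, principal point at (640, 360).
u, v = to_image(0.1, 0.05, 1200.0, 1200.0, 640.0, 360.0)   # -> (760.0, 420.0)
```

With all four distortion coefficients set to zero, `distort` reduces to the identity, which is a quick sanity check on the model.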
Further, in step S2, a pixel-based adaptive segmentation algorithm (PBAS) with a foreground counting mechanism and the mean-subtraction image method are fused to perform multi-target segmentation, as follows:

First, the floater foreground and the river surface background are segmented with the PBAS algorithm. The video image sequence is input, and the background model of the image is compared with the current pixel to determine the foreground/background classification, where the background model is defined by an array of the N most recently observed pixel values:

B(x_i) = {B_1(x_i), ..., B_k(x_i), ..., B_N(x_i)} (23)

A pixel x_i is classified as background if at least #min of the N background values lie within the decision threshold R(x_i) of its pixel value I(x_i); otherwise it is classified as foreground:

F(x_i) = 1 if #{k : dist(I(x_i), B_k(x_i)) < R(x_i)} < #min, else F(x_i) = 0 (24)

where F = 0 and F = 1 denote a background point and a foreground point respectively, and dist(I(x_i), B_k(x_i)) is the distance between the current pixel value and a background sample. For a pixel currently judged to be background, an index k ∈ {1, ..., N} is chosen uniformly at random and the component B_k(x_i) is replaced by the current pixel value I(x_i) with probability p = 1/T(x_i); at the same time, with probability p = 1/T(x_i), a neighboring pixel y_i ∈ N(x_i) is chosen at random and a component B_k(y_i) of its background model is replaced by its current pixel value I(y_i).

In addition to the background model B(x_i) of recently observed pixel values, an array of minimum decision distances D(x_i) = {D_1(x_i), ..., D_N(x_i)} is created; whenever an update of B_k(x_i) is performed, the currently observed minimum distance d_min(x_i) = min_k dist(I(x_i), B_k(x_i)) is written into this array, creating a history of minimum decision distances whose average

d̄_min(x_i) = (1/N)·Σ_k D_k(x_i) (25)

measures the degree of dynamic change of the background. The decision threshold R(x_i) is therefore dynamically adjusted according to the following strategy:

R(x_i) = R(x_i)·(1 − R_inc/dec) if R(x_i) > d̄_min(x_i)·R_scale, else R(x_i)·(1 + R_inc/dec) (26)

where R_inc/dec and R_scale are fixed parameters for the dynamic control of R(x_i).

The update probability T(x_i) is likewise controlled by d̄_min(x_i):

T(x_i) = T(x_i) + T_inc/d̄_min(x_i) if F(x_i) = 1, else T(x_i) − T_dec/d̄_min(x_i) (27)

where T_inc and T_dec are fixed parameters, and the upper and lower bounds of the update probability are set as T_lower < T(x_i) < T_upper.
Meanwhile, in order to suppress the water surface background being frequently detected as foreground due to heavy noise interference in areas of severe water-wave fluctuation, a foreground counting mechanism COUNT_t(x_i) is introduced:

F(x_i) = 0 if COUNT_t(x_i) = COUNT_max (28)

where COUNT_max is the maximum number of times a pixel may be detected as foreground, set to 2 times the frame rate. The foreground counting mechanism ensures that, in areas of severe water-wave fluctuation, the water surface background is not detected as foreground for too long, while in other normal areas the mechanism is not triggered when only a floater passes.
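A per-pixel sketch of the PBAS decision rule and the random model update described above; all parameter values here (e.g. two required matches, four background samples) are illustrative assumptions, not values fixed by the application:

```python
import random

def classify(I, samples, R, n_min=2):
    """Background (F=0) if at least n_min of the N background
    samples lie within distance R of the current value I."""
    close = sum(1 for b in samples if abs(I - b) < R)
    return 0 if close >= n_min else 1

def update(I, samples, T):
    """With probability 1/T, replace a randomly chosen background
    sample by the current pixel value (for background pixels only)."""
    if random.random() < 1.0 / T:
        samples[random.randrange(len(samples))] = I
    return samples

# A pixel close to two of its four background samples is background:
f = classify(100, [98, 101, 150, 30], R=5.0)   # -> 0
```

The neighbor-diffusion step of PBAS (updating B_k(y_i) of a random neighbor) follows the same pattern and is omitted here for brevity.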
then, dividing the floater target by using an average-subtraction image method, and calculating an average image of the video image sequence:
wherein N is the total frame number of the video image sequence, S is a fixed parameter;
subtracting the average image from each frame image to obtain a saliency map:
I s =I k -I m (30)
partitioning the saliency map into a target and a background by using a maximum inter-class variance method (otsu);
finally fusing the segmentation detection results of the two methods:
I mask =I pbas ∩I mean (31)
wherein I is pbas To introduce the segmentation result of the PBAS algorithm of the foreground counting mechanism, I mean To subtract the segmentation result of the average image method, I mask And finally outputting a multi-target segmentation result.
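The mean-subtraction branch and the fusion I_mask = I_pbas ∩ I_mean can be sketched with NumPy as follows; the Otsu implementation is a generic one, and averaging over all supplied frames (rather than a sampled subset) is a simplifying assumption:

```python
import numpy as np

def mean_image(frames):
    """Average image of the sequence (here over all given frames)."""
    return np.mean(np.stack(frames), axis=0)

def otsu_threshold(img):
    """Maximum between-class variance threshold on an 8-bit image."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    total = img.size
    csum = np.cumsum(hist)                      # cumulative pixel counts
    cmean = np.cumsum(hist * np.arange(256))    # cumulative intensity sums
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        n0 = csum[t - 1]
        n1 = total - n0
        if n0 == 0 or n1 == 0:
            continue
        m0 = cmean[t - 1] / n0
        m1 = (cmean[-1] - cmean[t - 1]) / n1
        var = (n0 / total) * (n1 / total) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def segment(frame, mean_img, pbas_mask):
    """Saliency map I_s = I_k - I_m, Otsu on it, then fuse with the PBAS mask."""
    sal = np.clip(frame.astype(float) - mean_img, 0, 255).astype(np.uint8)
    mean_mask = sal > otsu_threshold(sal)
    return pbas_mask & mean_mask
```

On a synthetic sequence with a uniform background and one bright blob, the fused mask keeps exactly the blob pixels flagged by both branches.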
Further, in step S3, an adaptive pipeline filtering method is used to screen candidate targets by the motion characteristics of targets between adjacent frames and to remove false targets, as follows:

For the multi-target segmentation result output by step S2, the centroid (x_i, y_i) of each target in the first-frame segmentation result is taken as a pipeline center, and a spatial pipeline passing through the multi-frame image sequence is established. The pipeline length L represents the time span over which detection must persist, and the pipeline diameter d represents the comparison range between two adjacent frames, set according to the target size. When the number of times a target appears in the pipeline reaches the threshold requirement, the corresponding target in the first frame is judged to be a real foreground target; otherwise it is regarded as a false detection and eliminated. When a new target not belonging to any existing pipeline appears, a new pipeline is created for it.

Meanwhile, traditional pipeline filtering creates a pipeline with a fixed center. When the motion of a target across the multi-frame sequence is not strictly linear, a larger pipeline diameter d is needed to keep the target inside the pipeline, and when targets are numerous it becomes difficult to guarantee only one moving target per pipeline, making pipelines hard to distinguish. The pipeline center is therefore made adaptive: when the target belonging to the pipeline created in frame k−1 is found in frame k, the pipeline center in frame k is corrected from the centroid of the target in frame k:

x_c^k = x_c^{k−1} + Δx, y_c^k = y_c^{k−1} + Δy, with Δx = x_k − x_c^{k−1}, Δy = y_k − y_c^{k−1} (32)

where (x_c^k, y_c^k) is the pipeline center in frame k, (x_c^{k−1}, y_c^{k−1}) is the pipeline center in frame k−1, (x_k, y_k) are the position coordinates of the target in frame k, and Δx, Δy are the corrections of the pipeline center.
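A compact sketch of the adaptive pipeline filtering idea; the parameter values (diameter d, hit threshold) are illustrative assumptions, where the application sets them from the target size and the pipeline length L:

```python
import math

class Pipe:
    """A spatial pipeline: an adaptive center plus a hit counter."""
    def __init__(self, cx, cy):
        self.cx, self.cy = cx, cy
        self.hits = 1

def pipeline_filter(frames, d=6.0, min_hits=3):
    """frames: per-frame lists of target centroids over the pipeline length.
    Returns the pipes whose detection count reaches the threshold."""
    pipes = []
    for dets in frames:
        unmatched = list(dets)
        for p in pipes:
            best, best_d = None, d / 2.0       # comparison range per frame
            for c in unmatched:
                dist = math.hypot(c[0] - p.cx, c[1] - p.cy)
                if dist < best_d:
                    best, best_d = c, dist
            if best is not None:
                p.cx, p.cy = best              # adaptive center correction
                p.hits += 1
                unmatched.remove(best)
        for c in unmatched:                    # targets outside all pipes
            pipes.append(Pipe(*c))
    return [p for p in pipes if p.hits >= min_hits]
```

A drifting (non-linear) track stays inside its pipe because the center follows the last matched centroid, while a one-frame noise detection spawns a pipe that never reaches the hit threshold and is discarded.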
Further, in step S4, the floater motion tracks are reconstructed with a multi-feature fusion target matching method, filtered by track length and angle, and the image coordinates of the floater target centroid on each track are stored, as follows:

For the image sequence of pipeline-filtered segmentation results from step S3, a fixed frame interval is selected, and the perimeter L, area S and circularity e of each floater target in the selected images are computed. The perimeter L is the number of pixels on the target contour in the segmentation result, the area S is the number of pixels inside the target region, and the circularity e, whose maximum value is 1, is calculated from the perimeter and area features of the target:

e = 4πS/L² (33)

The target to be matched in the earlier frame is taken as the parent target, and a target in the later frame as a child target. The similarity of parent and child is estimated from the relative errors of the three features combined:

δ = λ_1·|L_p − L_c|/L_p + λ_2·|S_p − S_c|/S_p + λ_3·|e_p − e_c|/e_p (34)

where λ_1, λ_2, λ_3 are the weights of the target perimeter, area and circularity, satisfying λ_1 + λ_2 + λ_3 = 1; considering that the degree of influence of each feature differs for each target and each image in a continuous sequence, each feature is given the same weight, i.e. λ_1 = λ_2 = λ_3 = 1/3. δ ∈ [0, 1) is the relative error between the target to be matched in the earlier frame and a target in the later frame; the higher the target similarity, the closer δ is to 0. The relative error of parent and child must satisfy δ < δ_min, where δ_min is the relative-error threshold, and the child target must lie within a rectangular search window around the parent target whose size is determined by the prior maximum flow velocity.

All floater targets in the later frame that satisfy these conditions are taken as candidate targets, and their distances to the parent target in the earlier frame are calculated; the candidate with the smallest distance is the child target matched to the parent. All floater targets in the selected images are traversed: the centroid coordinates of child targets satisfying the matching relation with a parent in the earlier frame are appended to the same track, while child targets without a match are regarded as newly appearing targets and added to newly created tracks. The motion track of each floater target is thus a set of centroid coordinates, stored as a two-dimensional array.

The tracks are then filtered by length and angle. According to the prior flow-direction information of the river, each track should have a certain directivity, i.e. satisfy:

|arctan((y_end − y_start)/(x_end − x_start))| < θ (35)

where (x_start, y_start) and (x_end, y_end) are the coordinates of the track start and end points, and θ is the angle threshold. Meanwhile, tracks that are too short are removed:

√((x_end − x_start)² + (y_end − y_start)²) > l (36)

where l is the preset minimum track length. The reconstruction of the floater motion tracks is thus complete.
Further, step S5 converts the image coordinates on each track into world coordinates by Direct Linear Transformation (DLT):

z_c·[u, v, 1]^T = L·[x_w, y_w, z_w, 1]^T (37)

whose expanded form is:

u = (L_11·x_w + L_12·y_w + L_13·z_w + L_14)/(L_31·x_w + L_32·y_w + L_33·z_w + 1)
v = (L_21·x_w + L_22·y_w + L_23·z_w + L_24)/(L_31·x_w + L_32·y_w + L_33·z_w + 1) (38)

where z_c is an arbitrary scale parameter, (u, v) are image coordinates, (x_w, y_w, z_w) are world coordinates, and the L matrix is the perspective projection matrix, solved from a set of control points with known space coordinates and image coordinates.

The actual physical distance between two adjacent points on each track is calculated from their world coordinates:

s = √((x_2 − x_1)² + (y_2 − y_1)²) (39)

where (x_1, y_1) and (x_2, y_2) are the world coordinates of the two adjacent points on the track. The time interval t between the two adjacent points is computed from the video frame rate, and from the speed formula

v = s/t (40)

the actual speed determined by each pair of adjacent points on each track is calculated. The direction of the actual speed is calculated from the image coordinate information of the two adjacent points:

ang = arctan((v_2 − v_1)/(u_2 − u_1)) (41)

The magnitude v, direction ang and world coordinates (x_1, y_1) are stored; all tracks are traversed, the actual flow velocities of all adjacent point pairs on each track are calculated, and the results are saved.
Further, step S6 grids all the flow velocity values with inverse distance weighted interpolation to obtain a uniform two-dimensional flow velocity vector field, as follows:

The video image is divided into uniform rectangular velocity-measurement grids, and the flow velocity values within each grid are interpolated to the grid center by inverse distance weighting:

v_j = Σ_{i=1}^{N} w_i·v_i / Σ_{i=1}^{N} w_i, with w_i = 1/d_i^p (42)

where N is the number of flow velocity vectors in the grid, d_i is the distance between velocity vector v_i and the interpolation point j, w_i is the weight coefficient, and p is the power exponent of the inverse distance weighting. Interpolation is performed for each uniformly divided velocity-measurement grid, yielding the gridded flow velocity field.
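The interpolation of scattered flow vectors to a grid center can be sketched generically as follows; the power p = 2 is an assumed (conventional) choice, as the application does not fix its value:

```python
import math

def idw(points, values, q, p=2.0):
    """Interpolate scattered values to query point q with weights 1/d**p."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d = math.hypot(x - q[0], y - q[1])
        if d == 0.0:
            return v            # query coincides with a sample point
        w = 1.0 / d ** p
        num += w * v
        den += w
    return num / den

# Midway between two samples the result is their plain average:
center = idw([(0.0, 0.0), (2.0, 0.0)], [1.0, 3.0], (1.0, 0.0))   # -> 2.0
```

Samples nearer the grid center dominate the weighted sum, so each cell's vector reflects the locally measured floater velocities rather than a global average.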
In the application, the scene is assumed to be a natural river with natural floater targets on its surface. On this premise, the camera intrinsics are first calibrated in the laboratory and used to correct image distortion; then a pixel-based adaptive segmentation algorithm (PBAS) with a foreground counting mechanism is fused with the mean-subtraction image method for multi-target segmentation; targets are screened by pipeline filtering to remove false targets; the segmentation results are matched to reconstruct the floater motion tracks, and the reconstructed tracks are filtered; the image coordinates on each track are then converted into world coordinates by Direct Linear Transformation (DLT) and flow velocity values are calculated; finally, gridding by inverse distance weighted interpolation yields a uniform two-dimensional flow velocity vector field.
The beneficial effects are that: compared with the prior art, the application uses floaters naturally present on the river surface as tracers representing the motion of the river water body, reconstructs their motion tracks by tracking the floaters, and thereby obtains the river surface flow velocity without manually scattering tracer particles. It solves the problem that, in scenes of low flow velocity, wind and waves where natural water surface patterns cannot accurately represent the motion of the water body, existing flow measurement algorithms cannot identify the effective flow velocity vectors estimated from the motion of rigid floater targets, and thus improves the accuracy of the flow measurement algorithm in such measurement scenes.
Drawings
FIG. 1 is a flow chart of the method of the present application;
FIG. 2 is an original image to be distortion corrected;
FIG. 3 is a distortion corrected image;
FIG. 4 is a flow chart of multi-object segmentation of a single frame image;
FIG. 5 is a schematic diagram of a pipeline filtering method;
FIG. 6 is a specific example diagram of pipeline filtering;
FIG. 7 is a schematic diagram of flow field meshing;
fig. 8 is a specific example diagram of a meshed flow field.
Detailed Description
The present application is further illustrated by the accompanying drawings. It should be understood that these examples are intended only to illustrate the application and not to limit its scope; after reading the application, various modifications of equivalent form made by those skilled in the art fall within the scope defined by the appended claims.
The application provides a river surface flow velocity measuring method based on water surface floater target tracking, which is shown in fig. 1 and comprises the following steps:
s1: the original video image sequence shown in fig. 2 is subjected to distortion correction, and the camera is subjected to internal reference calibration in a laboratory according to a Zhang Zhengyou calibration method, so that the pixel size s, the internal reference matrix K and the distortion parameter matrix D are obtained after calibration, and are respectively as follows:
D=[k 1 k 2 p 1 p 2 ] (44)
wherein C is x Representing the abscissa of the principal point of the image, C y Representing the ordinate of the principal point of the image, f x Representing the equivalent focal length of the camera on the x-axis of the image plane, f y Representing the equivalent focal length of the camera on the y-axis of the image plane, the focal length f of the camera is calculated according to the pixel size s of the image sensor:
f=(f x +f y )·s/2 (45)
k 1 representing a first radial distortion coefficient, p 1 Representing a first tangential distortion coefficient, k 2 Representing a second radial distortion coefficient, p 2 Representing a second tangential distortion coefficient. The image is distortion corrected by the following formula:
camera coordinates (x, y) of the undistorted image are converted into image coordinates (u, v) by the following formula:
the camera coordinates (x ', y') of the distorted image are converted into image coordinates (u ', v') by the same method:
the distortion correction of the image is completed, the undistorted image shown in fig. 3 is obtained, and the ROI area is set in the undistorted image for subsequent measurement.
In the application, the ROI area is manually selected and set. Since the photographing angle of the camera is not changed, the ROI is set only once, and the purpose of the ROI is to determine the measurement region.
S2: the process of multi-target segmentation by fusing a pixel-level adaptive segmentation algorithm (PBAS) introducing a foreground counting mechanism and a subtraction average image method is shown in fig. 4:
firstly, segmenting a floating object foreground and a river water surface background by utilizing a PBAS algorithm, inputting the video image sequence, and comparing the background model of the image with a current pixel point to determine the classification of the foreground and the background; wherein the background model is defined by an array of N most recently observed pixel values:
B(x_i) = {B_1(x_i), ..., B_k(x_i), ..., B_N(x_i)} (49)
Meanwhile, the PBAS algorithm determines the classification of foreground and background by comparing the current pixel with the background model: if the distance between the pixel value I(x_i) at pixel x_i and at least #min of the N background values is less than the decision threshold R(x_i), then x_i is classified as background; the calculation is:
where F = 0 and F = 1 indicate that the pixel is a background point or a foreground point respectively, and dist(I(x_i), B_k(x_i)) denotes the distance between the current point and the background model. For a pixel currently judged to be background, an index k ∈ {1, ..., N} is chosen uniformly at random, and the background value B_k(x_i) is replaced with the current pixel value I(x_i) with probability p = 1/T(x_i); at the same time, with probability p = 1/T(x_i), a neighbouring pixel y_i ∈ N(x_i) is chosen at random, and a value of the background model B_k(y_i) at that neighbouring pixel is replaced with its current pixel value I(y_i);
In addition to the array of recently observed pixel values stored in the background model B(x_i), an array of minimum decision distances D(x_i) = {D_1(x_i), ..., D_N(x_i)} is created; each time an update of B_k(x_i) is performed, the currently observed minimum distance d_min(x_i) = min_k dist(I(x_i), B_k(x_i)) is written to this array, creating a history of minimum decision distances whose average, d̄_min(x_i), measures the degree of dynamic change of the background. The decision threshold R(x_i) is therefore dynamically adjusted according to the following strategy:
where R_inc/dec and R_scale are fixed parameters for the dynamic control of R(x_i);
the update probability T(x_i) is likewise controlled by d̄_min(x_i):
where T_inc and T_dec are fixed parameters; the update probability is additionally bounded by T_lower < T < T_upper.
Meanwhile, to suppress the frequent detection of the water-surface background as foreground caused by strong noise interference in areas of violent wave fluctuation, a foreground counting mechanism COUNT_t(x_i) is introduced:
F(x_i) = 0, if COUNT_t(x_i) = COUNT_max (54)
where COUNT_max is the maximum number of times a pixel point may be detected as foreground, set to twice the frame rate. The foreground counting mechanism ensures that in areas of violent wave fluctuation the water-surface background is not detected as foreground for too long, while in other, normal areas a pixel is judged to be foreground only when a floater passes, and the counting mechanism is not triggered;
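The PBAS classification rule with the foreground counter can be sketched for a single pixel. This is an illustrative simplification, not the patent's implementation: N, #min and R are fixed assumed values here, whereas the patent adapts R(x_i) and T(x_i) per pixel as described above.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20               # background samples per pixel
MIN_MATCHES = 2      # "#min" in the classification rule
R = 18.0             # decision threshold R(x_i); fixed here, adaptive in the patent
FPS = 25
COUNT_MAX = 2 * FPS  # cap on consecutive foreground detections, Eq. (54)

class PixelModel:
    """Single-pixel PBAS-style background model with the foreground counter."""
    def __init__(self, init_value):
        self.samples = np.full(N, float(init_value))
        self.count = 0

    def classify(self, value):
        # Background if at least MIN_MATCHES samples lie within R of the value
        matches = int(np.sum(np.abs(self.samples - value) < R))
        is_fg = matches < MIN_MATCHES
        if is_fg:
            self.count += 1
            if self.count >= COUNT_MAX:   # Eq. (54): force background, reset
                is_fg = False
                self.count = 0
        else:
            self.count = 0
            # simplified conservative update: overwrite one random sample
            self.samples[rng.integers(N)] = value
        return is_fg

pm = PixelModel(100.0)
fg_near = pm.classify(102.0)   # close to the model -> background
fg_far = pm.classify(200.0)    # far from the model -> foreground
```

After COUNT_MAX consecutive foreground hits, a pixel is forced back to background, which is what keeps choppy-water regions from being flagged indefinitely.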
Then, the floater targets are segmented with the subtract-mean image method, computing the average image of the video image sequence:
where N is the total number of frames in the video image sequence and S is a fixed parameter;
subtracting the average image from each frame image to obtain a saliency map:
I_s = I_k - I_m (56)
The saliency map is partitioned into target and background using the maximum between-class variance (Otsu) method;
finally, the segmentation results of the two methods are fused:
I_mask = I_pbas ∩ I_mean (57)
where I_pbas is the segmentation result of the PBAS algorithm with the foreground counting mechanism, I_mean is the segmentation result of the subtract-mean image method, and I_mask is the final multi-target segmentation output.
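The subtract-mean segmentation and the fusion of Eq. (57) can be sketched on a synthetic sequence. The Otsu routine below is a minimal re-implementation, and the PBAS mask is replaced by a simple intensity threshold as a stand-in; the synthetic frames and all thresholds are assumptions for illustration only.

```python
import numpy as np

def otsu_threshold(img):
    """Minimal Otsu (maximum between-class variance) threshold on 8-bit data."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = cum[t - 1] / total
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t - 1] / cum[t - 1]
        m1 = (cum_mean[-1] - cum_mean[t - 1]) / (cum[-1] - cum[t - 1])
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic sequence: dark water (30) plus one bright blob drifting 3 px/frame
frames = np.full((10, 40, 40), 30, dtype=np.uint8)
for k in range(10):
    frames[k, 10:14, 2 + 3 * k: 6 + 3 * k] = 220

mean_img = frames.mean(axis=0)                    # Eq. (55): average image I_m
saliency = frames[5].astype(float) - mean_img     # Eq. (56): I_s = I_k - I_m
sal8 = np.clip(saliency, 0, 255).astype(np.uint8)
t = otsu_threshold(sal8)
mask_mean = sal8 > t                              # subtract-mean segmentation

# Eq. (57): I_mask = I_pbas ∩ I_mean, with a thresholded stand-in for PBAS
mask_pbas = frames[5] > 100
mask = mask_pbas & mask_mean
```

The intersection keeps only pixels that both detectors agree on, which is what suppresses each method's individual false positives.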
S3: the adaptive pipeline filtering method is used to screen candidate targets based on the motion characteristics of targets between adjacent frames and remove false targets; the process is shown in Fig. 5:
For the multi-target segmentation result finally output by step S2, the centroid (x_i, y_i) of each target in the first-frame segmentation result is taken as a pipeline centre, and a spatial pipeline running through the multi-frame image sequence is established; the pipeline length L represents the time span of continuous detection, and the pipeline diameter d represents the comparison range between two adjacent frames, set according to the target size. When the number of times a target appears in the pipeline reaches the threshold requirement, the target in the corresponding first-frame image is judged to be a real foreground target; otherwise it is regarded as a false detection and removed. When a new target not belonging to any existing pipeline appears, a new pipeline is created for it;
Meanwhile, the pipeline created by traditional pipeline filtering is a rectangle with a fixed centre. When the target's motion through the multi-frame sequence is not ideal linear motion, a larger pipeline diameter d is needed to keep the target inside the pipeline, and with many targets it is then difficult to guarantee only one moving target per pipeline, making the pipelines hard to distinguish. The pipeline centre is therefore made adaptive: when the pipeline created in frame k-1 finds its target in frame k, the pipeline centre is corrected from the frame-(k-1) centre to the centroid of the target in frame k:
where (x̂_k, ŷ_k) is the pipeline centre of frame k, (x̂_{k-1}, ŷ_{k-1}) is the pipeline centre of frame k-1, x_k, y_k are the position coordinates of the frame-k target, and Δx, Δy are the correction amounts of the pipeline centre; the processing result of the pipeline filtering is shown in Fig. 6.
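The adaptive pipeline filter can be sketched as follows. The pipeline length L, diameter d and hit threshold are assumed values; without the recentring step (`center = inside[0]`), the drifting target in the example would leave a fixed-centre pipe of the same diameter by the last frame.

```python
import math

L = 5                # pipeline length (frames), assumed
d = 10.0             # pipeline diameter (comparison range), assumed
HITS_REQUIRED = 4    # detection-count threshold, assumed

def pipeline_filter(frames_centroids):
    """frames_centroids: per-frame lists of (x, y) target centroids.
    Returns the frame-0 centroids judged to be real foreground targets."""
    real = []
    for c0 in frames_centroids[0]:
        center = c0
        hits = 1
        for k in range(1, min(L, len(frames_centroids))):
            # centroids inside the pipe cross-section in frame k
            inside = [c for c in frames_centroids[k]
                      if math.dist(c, center) <= d / 2]
            if inside:
                hits += 1
                center = inside[0]   # adaptive: recentre the pipe on the target
        if hits >= HITS_REQUIRED:
            real.append(c0)
    return real

# A target drifting ~3 px/frame stays inside the recentred pipe;
# a one-frame noise blip at (50, 50) does not reach the hit threshold.
track = [[(10.0, 10.0), (50.0, 50.0)],
         [(13.0, 10.5)],
         [(16.0, 11.0)],
         [(19.0, 11.5)],
         [(22.0, 12.0)]]
kept = pipeline_filter(track)
```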
S4: the floater motion tracks are reconstructed with a multi-feature-fusion target matching method, filtered according to track length and angle, and the image coordinates of the floater target centroid on each track are stored; the process is as follows:
A fixed frame interval is selected from the image sequence of target segmentation results after the pipeline filtering of step S3, and the perimeter L, area S and circularity e of each floater target in the selected images are computed: the perimeter L is the number of pixels on the target contour in the segmentation result, the area S is the number of pixels inside the target region, and the circularity e is computed from the perimeter and area features, with a maximum value of 1, as follows:
The target to be matched in the earlier frame is taken as the parent target, and the target in the later frame as the child target; the relative error of parent and child is computed by combining the three features to estimate their similarity:
where λ_1, λ_2, λ_3 are the weights of the target perimeter, area and circularity respectively, satisfying λ_1 + λ_2 + λ_3 = 1. Taking the degree of influence of each feature to be the same for every target in the continuous image sequence, each feature is given the same weight, i.e. λ_1 = λ_2 = λ_3 = 1/3. δ is the relative error between the target to be matched in the earlier frame and the target in the later frame, δ ∈ [0, 1); the higher the target similarity, the closer δ approaches 0. The relative error of parent and child targets must satisfy δ < δ_min, where δ_min is the relative-error threshold; in addition, the child target must lie within a rectangular search window around the parent target, whose size is determined by the prior maximum flow velocity;
All floater targets in the later frame that satisfy these conditions are taken as candidate targets, and the distance from each candidate to the parent target in the earlier frame is computed; the candidate with the smallest distance is the child target matched to the parent. All floater targets in the selected images are traversed: the centroid coordinates of child targets that satisfy the matching relation with a parent target in the earlier frame are appended to the parent's track, while child targets that satisfy no matching relation are regarded as newly appearing targets and added to newly created tracks. The motion track of each floater target is thus a set of centroid coordinates, stored as a two-dimensional array;
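The multi-feature matching can be sketched as follows. The per-feature normalization |f_p − f_c| / max(f_p, f_c) is one plausible reading of the relative-error formula (the translated equation is not legible), the threshold δ_min is an assumed value, and the rectangular search-window test is omitted for brevity.

```python
import math

LAMBDA = (1 / 3, 1 / 3, 1 / 3)   # equal weights, λ1 + λ2 + λ3 = 1
DELTA_MIN = 0.2                  # assumed relative-error threshold δ_min

def circularity(perimeter, area):
    """e = 4πS / L², equal to 1 for a perfect circle."""
    return 4 * math.pi * area / (perimeter ** 2)

def relative_error(parent, child):
    """Weighted relative error over perimeter L, area S and circularity e.
    The normalization by max(f_p, f_c) is an assumption, not the patent's
    exact formula."""
    fp = (parent['L'], parent['S'], circularity(parent['L'], parent['S']))
    fc = (child['L'], child['S'], circularity(child['L'], child['S']))
    return sum(w * abs(a - b) / max(a, b)
               for w, a, b in zip(LAMBDA, fp, fc))

def match(parent, candidates):
    """Among candidates with δ < DELTA_MIN, pick the nearest centroid."""
    ok = [c for c in candidates if relative_error(parent, c) < DELTA_MIN]
    if not ok:
        return None
    return min(ok, key=lambda c: math.dist(parent['xy'], c['xy']))

parent = {'L': 40.0, 'S': 120.0, 'xy': (100.0, 50.0)}
cands = [{'L': 42.0, 'S': 118.0, 'xy': (106.0, 51.0)},   # similar shape, near
         {'L': 80.0, 'S': 500.0, 'xy': (101.0, 50.0)}]   # dissimilar shape
best = match(parent, cands)
```

Note that the nearer candidate geometrically (`cands[1]`) is rejected on shape, so the matched child is the shape-consistent one.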
The tracks are then filtered by length and angle. From the river's prior flow-direction information, each track has a definite directivity, i.e. it satisfies:
where x_start, y_start are the coordinates of the track start point, x_end, y_end are the coordinates of the track end point, and θ is the angle threshold. Tracks that are too short are also removed:
where l is the preset minimum track length.
This completes the reconstruction of the floater motion tracks.
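The length-and-angle track filter can be sketched as follows, assuming the prior flow direction is along the image x-axis; the angle threshold θ and the minimum track length are assumed values.

```python
import math

THETA = math.radians(30)   # assumed angle threshold around the prior flow direction
MIN_LEN = 5                # assumed minimum number of points per track

def keep_track(track):
    """track: list of (x, y) centroids. Keep the track only if it is long
    enough and its start-to-end direction lies within THETA of the prior
    flow direction (here taken as the +x axis)."""
    if len(track) < MIN_LEN:
        return False
    (xs, ys), (xe, ye) = track[0], track[-1]
    angle = math.atan2(ye - ys, xe - xs)
    return abs(angle) < THETA

good = [(0, 0), (3, 0.5), (6, 1.0), (9, 1.2), (12, 1.5)]   # gentle drift
short = [(0, 0), (3, 0)]                                   # too few points
steep = [(0, 0), (1, 5), (2, 10), (3, 15), (4, 20)]        # crosses the flow
```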
S5: the image coordinates on each track are converted to world coordinates using Direct Linear Transformation (DLT):
where z_c is a scale parameter, (u, v) are image coordinates, and (x_w, y_w, z_w) are world coordinates; the L matrix is the perspective projection matrix, solved from a set of control points whose world-point and image-point coordinates are both known;
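Solving the DLT perspective projection matrix from control points can be sketched as follows; parameterizing the matrix with its last entry fixed to 1 (11 unknowns) and solving by linear least squares is a common DLT formulation, assumed here rather than taken from the patent. The control points and the "true" matrix are synthetic.

```python
import numpy as np

def solve_dlt(world_pts, image_pts):
    """Solve the 11 DLT parameters (projection matrix with L[2][3] = 1)
    from >= 6 control points with known world and image coordinates."""
    A, b = [], []
    for (x, y, z), (u, v) in zip(world_pts, image_pts):
        # z_c * u = row0·X and z_c = row2·X  =>  row0·X - u*(row2·X) = 0
        A.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        b.append(u)
        A.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        b.append(v)
    params, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                                 rcond=None)
    return np.append(params, 1.0).reshape(3, 4)

def project(L_mat, xyz):
    """Apply z_c [u, v, 1]^T = L [x_w, y_w, z_w, 1]^T and normalize by z_c."""
    p = L_mat @ np.append(np.asarray(xyz, float), 1.0)
    return p[:2] / p[2]

# Synthetic check: build a known projection, sample control points, recover it
L_true = np.array([[800.0, 10.0, 5.0, 100.0],
                   [4.0, 820.0, 8.0, 200.0],
                   [0.01, 0.02, 0.003, 1.0]])
world = [(0.1, 0.2, 0.0), (1.3, 0.4, 0.5), (2.1, 1.7, 0.2),
         (0.5, 1.1, 0.9), (1.9, 0.3, 0.7), (0.7, 1.8, 0.4),
         (2.4, 0.9, 0.8), (1.1, 1.5, 0.1)]
image = [project(L_true, w) for w in world]
L_est = solve_dlt(world, image)
```

With noise-free, non-coplanar control points the least-squares solution recovers the projection exactly; field calibrations would use surveyed ground control points instead.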
The actual physical distance is then calculated from the world coordinates of two adjacent points on each track:
where x_1, y_1 and x_2, y_2 are the world coordinates of two adjacent points on the track; the time interval t between the two points is computed from the video frame rate, and according to the speed formula:
v = s/t (66)
the actual speed determined by the two adjacent points on each track is obtained, and the direction of the actual velocity is computed from the image coordinates of the two points:
The magnitude v, direction ang and world coordinates (x_1, y_1) of each velocity are saved; all tracks are traversed, the actual flow velocity for every pair of adjacent points on each track is calculated, and the results are stored.
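The speed and direction computation of Eqs. (65)-(67) can be sketched as follows; the frame rate and the frame interval between matched points are assumed values.

```python
import math

FPS = 25.0   # assumed video frame rate
STEP = 5     # assumed frame interval between matched track points

def velocity(p1, p2, frame_interval=STEP, fps=FPS):
    """Speed v = s/t (Eq. 66) and direction for two adjacent track points
    p1 = (x1, y1), p2 = (x2, y2) given in world coordinates (metres)."""
    s = math.dist(p1, p2)                  # Eq. (65): Euclidean distance
    t = frame_interval / fps               # time between the two points
    v = s / t
    ang = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
    return v, ang

# Two adjacent points 0.5 m apart, 0.2 s apart -> 2.5 m/s
v, ang = velocity((2.0, 1.0), (2.3, 1.4))
```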
S6: all flow-velocity values are gridded using inverse-distance-weighted interpolation to obtain a uniform two-dimensional flow-velocity vector field; the process is shown in Fig. 7:
The video image is divided into uniform rectangular velocity-measurement grids, and the flow-velocity values within each grid are interpolated to the grid centre using inverse-distance weighting:
where N is the number of flow-velocity vectors in the grid, d_i is the distance between velocity vector v_i and the interpolation point j, and w is a weight coefficient;
Interpolation is computed for each uniformly divided velocity-measurement grid: grids containing tracks obtain the correct interpolated flow velocity, shown by solid vector arrows, while grids without tracks are marked as missing with dashed vector arrows, yielding the gridded flow-velocity field shown in Fig. 8.
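The inverse-distance-weighted interpolation to a grid centre can be sketched as follows; the power-2 weighting is an assumption (the exact weight form in the translated equation is not legible), and an empty grid returns None, corresponding to the dashed "missing" arrows of Fig. 8.

```python
import math

def idw(vectors, grid_center, power=2):
    """Inverse-distance-weighted flow vector at a grid centre.
    vectors: list of ((x, y), (vx, vy)) samples inside the grid.
    Returns the interpolated (vx, vy), or None for an empty grid."""
    if not vectors:
        return None
    num_x = num_y = den = 0.0
    for pos, (vx, vy) in vectors:
        dist = math.dist(pos, grid_center)
        if dist == 0:
            return (vx, vy)          # a sample exactly at the centre wins
        w = 1.0 / dist ** power
        num_x += w * vx
        num_y += w * vy
        den += w
    return (num_x / den, num_y / den)

# Two samples equidistant from the centre get equal weight
vecs = [((1.0, 1.0), (0.5, 0.0)),
        ((3.0, 3.0), (1.5, 0.0))]
v = idw(vecs, (2.0, 2.0))
```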
The grid results of the flow-velocity field are given in Table 1. Grids are numbered 1-16 from the top-left corner of the image, left to right and top to bottom; the X-direction flow velocity is the horizontally rightward component, the Y-direction flow velocity is the vertically downward component, and a missing flow velocity indicates that no track lies in that grid.
TABLE 1

Claims (9)

1. A river surface flow velocity measurement method based on water-surface floater target tracking, characterized by comprising the following steps:
S1: performing distortion correction on an original video image sequence, and setting a region of interest (ROI);
S2: fusing a pixel-level adaptive segmentation (PBAS) algorithm incorporating a foreground counting mechanism with a subtract-mean image method to perform multi-target segmentation on the distortion-corrected images;
S3: for the segmented targets, using an adaptive pipeline filtering method to screen candidate targets based on the motion characteristics of targets between adjacent frames and remove false targets;
S4: reconstructing floater motion tracks with a multi-feature-fusion target matching method, filtering by track length and angle, and storing the image coordinates of the floater target centroid on each track;
S5: converting the image coordinates on each track into world coordinates using Direct Linear Transformation (DLT), and calculating flow-velocity values from the world coordinates of adjacent points on each track;
S6: gridding all flow-velocity values with inverse-distance-weighted interpolation to obtain a uniform two-dimensional flow-velocity vector field.
2. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 1, wherein in step S1 the camera is calibrated in a laboratory using Zhang Zhengyou's calibration method to obtain the corresponding intrinsic matrix and distortion parameter matrix, distortion correction is performed on the original video image sequence according to these matrices, and the ROI is set.
3. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 1, wherein in step S2 a pixel-level adaptive segmentation (PBAS) algorithm incorporating a foreground counting mechanism and a subtract-mean image method are fused to perform multi-target segmentation, as follows:
firstly, the PBAS algorithm is used to segment the floater foreground from the river-surface background of the video image sequence; the PBAS algorithm begins creating the background model from the reading of the first frame, dynamically adjusts the floater decision threshold and the background-model update rate during foreground/background segmentation, and determines the classification of foreground and background by comparing the current pixel with the background model, where the background model is defined by an array of the N most recently observed pixel values:
B(x_i) = {B_1(x_i), ..., B_k(x_i), ..., B_N(x_i)}
if the distance between the pixel value I(x_i) and at least #min of the N background values is less than the decision threshold R(x_i), then x_i is classified as background; the calculation is:
where F = 0 and F = 1 indicate that the pixel is a background point or a foreground point respectively, and dist(I(x_i), B_k(x_i)) denotes the distance between the current point and the background model. Meanwhile, to suppress the frequent detection of the water-surface background as foreground caused by strong noise interference in areas of violent wave fluctuation, a foreground counting mechanism COUNT_t(x_i) is introduced:
F(x_i) = 0, if COUNT_t(x_i) = COUNT_max (3)
where COUNT_max is the maximum number of times a pixel point may be detected as foreground, set to twice the frame rate; the foreground counting mechanism ensures that in areas of violent wave fluctuation the water-surface background is not detected as foreground for too long, while in other, normal areas a pixel is judged to be foreground only when a floater passes, and the counting mechanism is not triggered;
then, the floater targets are segmented with the subtract-mean image method, computing the average image of the video image sequence:
where N is the total number of frames in the video image sequence and S is a fixed parameter;
subtracting the average image from each frame image to obtain a saliency map:
I_s = I_k - I_m (5)
the saliency map is partitioned into target and background using the maximum between-class variance (Otsu) method;
finally, the segmentation results of the two methods are fused:
I_mask = I_pbas ∩ I_mean (6)
where I_pbas is the segmentation result of the PBAS algorithm with the foreground counting mechanism, I_mean is the segmentation result of the subtract-mean image method, and I_mask is the final multi-target segmentation output.
4. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 1, wherein in step S3 an adaptive pipeline filtering method is used to screen candidate targets based on the motion characteristics of targets between adjacent frames and remove false targets, as follows:
for the most part in step S2The final output multi-target segmentation results, the centroid (x) i ,y i ) Regarding as a pipeline center, establishing a space pipeline penetrating through a multi-frame image sequence, wherein the pipeline length L represents a time span required to be continuously detected, the pipeline diameter d represents a comparison range between two adjacent frames, setting according to the size of a target, judging the target in a corresponding first frame image as a real foreground target when the frequency of the target appearing in the pipeline reaches a threshold value requirement, otherwise regarding as a false detection result and eliminating; when a new target which does not belong to the existing pipeline appears, creating a new pipeline for the target;
meanwhile, the pipeline created by traditional pipeline filtering is a rectangle with a fixed centre: when the target's motion through the multi-frame sequence is not ideal linear motion, a larger pipeline diameter d is needed to keep the target inside the pipeline. The pipeline centre is therefore made adaptive: when the pipeline created in frame k-1 finds its target in frame k, the pipeline centre is corrected from the frame-(k-1) centre to the centroid of the target in frame k:
where (x̂_k, ŷ_k) is the pipeline centre of frame k, (x̂_{k-1}, ŷ_{k-1}) is the pipeline centre of frame k-1, x_k, y_k are the position coordinates of the frame-k target, and Δx, Δy are the correction amounts of the pipeline centre.
5. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 4, wherein in step S4 the floater motion tracks are reconstructed with a multi-feature-fusion target matching method, filtered by track length and angle, and the image coordinates of the floater target centroid on each track are stored, as follows:
a fixed frame interval is selected from the image sequence of target segmentation results after the pipeline filtering of step S3, and the perimeter L, area S and circularity e of each floater target in the selected images are computed: the perimeter L is the number of pixels on the target contour in the segmentation result, the area S is the number of pixels inside the target region, and the circularity e is computed from the perimeter and area features, with a maximum value of 1, by the formula:
the target to be matched in the earlier frame is taken as the parent target, and the target in the later frame as the child target; the relative error of parent and child is computed by combining the three features to estimate their similarity:
where λ_1, λ_2, λ_3 are the weights of the target perimeter, area and circularity respectively, satisfying λ_1 + λ_2 + λ_3 = 1; taking the degree of influence of each feature to be the same for every target in the continuous image sequence, each feature is given the same weight, i.e. λ_1 = λ_2 = λ_3 = 1/3. δ is the relative error between the target to be matched in the earlier frame and the target in the later frame, δ ∈ [0, 1). The relative error of parent and child targets must satisfy δ < δ_min, where δ_min is the relative-error threshold; in addition, the child target must lie within a rectangular search window around the parent target, whose size is determined by the prior maximum flow velocity;
all floater targets in the later frame that satisfy these conditions are taken as candidate targets, and the distance from each candidate to the parent target in the earlier frame is computed; the candidate with the smallest distance is the child target matched to the parent. All floater targets in the selected images are traversed: the centroid coordinates of child targets that satisfy the matching relation with a parent target in the earlier frame are appended to the parent's track, while child targets that satisfy no matching relation are regarded as newly appearing targets and added to newly created tracks. The motion track of each floater target is thus a set of centroid coordinates, stored as a two-dimensional array;
filtering is carried out according to track length and angle; from the river's prior flow-direction information, each track satisfies:
where x_start, y_start are the coordinates of the track start point, x_end, y_end are the coordinates of the track end point, and θ is the angle threshold; tracks that are too short are also removed:
where l is the preset minimum track length;
thus the reconstruction of the floater motion tracks is completed.
6. The river surface flow velocity measurement method according to claim 1, wherein in step S5 the image coordinates on each track are converted into world coordinates by Direct Linear Transformation (DLT), and the actual physical distance is calculated from the world coordinates of two adjacent points on each track:
where x_1, y_1 and x_2, y_2 are the world coordinates of two adjacent points on the track; the time interval t between the two points is computed from the video frame rate, and according to the speed formula:
v = s/t (14)
the actual speed determined by the two adjacent points on each track is obtained, and the direction of the actual velocity is computed from the image coordinates of the two points:
the magnitude v, direction ang and world coordinates (x_1, y_1) of each velocity are saved; all tracks are traversed, the actual flow velocity for every pair of adjacent points on each track is calculated, and the results are stored.
7. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 1, wherein in step S6 all flow-velocity values are gridded using inverse-distance-weighted interpolation to obtain a uniform two-dimensional flow-velocity vector field, as follows:
the video image is divided into uniform rectangular velocity-measurement grids, and the flow-velocity values within each grid are interpolated to the grid centre using inverse-distance weighting:
where N is the number of flow-velocity vectors in the grid, d_i is the distance between velocity vector v_i and the interpolation point j, and w is a weight coefficient;
interpolation is computed for each uniformly divided velocity-measurement grid to obtain the gridded flow-velocity field.
8. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 2, wherein the specific process of step S1 is as follows:
the camera is intrinsically calibrated in a laboratory using Zhang Zhengyou's calibration method to obtain the pixel size s, the intrinsic matrix K and the distortion parameter matrix D, which are respectively:
D = [k_1 k_2 p_1 p_2] (18)
where C_x and C_y denote the abscissa and ordinate of the image principal point, and f_x and f_y denote the camera's equivalent focal lengths along the x- and y-axes of the image plane; the camera focal length f is calculated from the pixel size s of the image sensor:
f = (f_x + f_y)·s/2 (19)
k_1 and k_2 denote the first and second radial distortion coefficients, and p_1 and p_2 the first and second tangential distortion coefficients. The image is distortion-corrected by the following formulas:
camera coordinates (x, y) of the undistorted image are converted into image coordinates (u, v) by the following formula:
the camera coordinates (x', y') of the distorted image are converted into image coordinates (u', v') in the same way:
after the distortion correction of the image is completed, the ROI area is set for subsequent measurement.
9. The river surface flow velocity measurement method based on water-surface floater target tracking according to claim 3, wherein the specific method in step S2 for dynamically adjusting the floater decision threshold and the background-model update rate during foreground/background segmentation is as follows:
for a pixel currently judged to be background, an index k ∈ {1, ..., N} is chosen uniformly at random, and the background value B_k(x_i) is replaced with the current pixel value I(x_i) with probability p = 1/T(x_i); at the same time, with probability p = 1/T(x_i), a neighbouring pixel y_i ∈ N(x_i) is chosen at random, and a value of the background model B_k(y_i) at that neighbouring pixel is replaced with its current pixel value I(y_i);
in addition to the array of recently observed pixel values stored in the background model B(x_i), an array of minimum decision distances D(x_i) = {D_1(x_i), ..., D_N(x_i)} is created; each time an update of B_k(x_i) is performed, the currently observed minimum distance d_min(x_i) = min_k dist(I(x_i), B_k(x_i)) is written to this array, creating a history of minimum decision distances whose average, d̄_min(x_i), measures the degree of dynamic change of the background. The decision threshold R(x_i) is therefore dynamically adjusted according to the following strategy:
where R_inc/dec and R_scale are fixed parameters for the dynamic control of R(x_i);
the update probability T(x_i) is likewise controlled by d̄_min(x_i):
where T_inc and T_dec are fixed parameters; the update probability is additionally bounded by T_lower < T < T_upper.
CN202310923396.6A 2023-07-26 2023-07-26 River surface flow velocity measurement method based on water surface floater target tracking Pending CN116934808A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310923396.6A CN116934808A (en) 2023-07-26 2023-07-26 River surface flow velocity measurement method based on water surface floater target tracking


Publications (1)

Publication Number Publication Date
CN116934808A true CN116934808A (en) 2023-10-24

Family

ID=88387562


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118037780A (en) * 2024-04-10 2024-05-14 武汉大水云科技有限公司 River surface flow measuring method and measuring device based on video scanning


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination