CN117889923A - Open channel flow measurement method and device based on space-time image texture feature extraction - Google Patents


Publication number
CN117889923A
Authority
CN
China
Prior art keywords: space, texture, image, time image, open channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410066349.9A
Other languages
Chinese (zh)
Inventor
Wang Jianping (王剑平)
Feng Yifeng (冯壹峰)
Zhang Guo (张果)
Jin Jianhui (金建辉)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kunming University of Science and Technology
Original Assignee
Kunming University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunming University of Science and Technology filed Critical Kunming University of Science and Technology
Priority to CN202410066349.9A priority Critical patent/CN117889923A/en
Publication of CN117889923A publication Critical patent/CN117889923A/en
Pending legal-status Critical Current

Abstract

The invention discloses an open channel flow measurement method and device based on space-time image texture feature extraction, in the technical field of river hydrologic measurement. The method comprises the steps of drawing velocimetry lines along the direction of the open channel to be measured at the corresponding velocimetry points, and generating space-time images over time from the velocimetry lines; preprocessing the generated space-time images, including second-order frequency-domain differential unsharp enhancement, DFT-based background noise suppression, preliminary detection of the texture angle by the Radon transform, and trajectory enhancement and denoising of the texture space-time image with a steerable second-order Gaussian direction-adjustable filter; and estimating the texture angle of the enhanced and denoised space-time image with the two-dimensional autocorrelation function of the texture image, and calculating the flow field velocity distribution of the open channel video to be measured from the obtained texture angle. The invention can effectively remove noise from texture images, measure the texture angle, and improve the accuracy of open channel video surface flow velocity monitoring.

Description

Open channel flow measurement method and device based on space-time image texture feature extraction
Technical Field
The invention relates to an open channel flow measurement method and device based on space-time image texture feature extraction, and belongs to the technical field of river hydrologic measurement.
Background
Against the background of China's numerous river channels, river flow velocity measurement is one of the important tasks in fields such as hydrologic protection, flood prevention and monitoring, and water resource utilization. At present, river flow velocity measurement in China is generally carried out in complex environments; the traditional contact flow velocity measurement mode carries a high working-environment risk and places high technical demands on operators because equipment is difficult to deploy, so non-contact river velocimetry methods can improve these adverse conditions of hydrologic monitoring. With the continuous development of hydrologic digitization and the improvement of network video monitoring technology, many hydrologic observation stations in China have in recent years begun to use camera monitoring systems to monitor river flow velocity, which has greatly accelerated the development of hydrologic flow velocity and discharge monitoring.
Space-time image velocimetry (STIV) is a novel non-contact river flow velocity measurement method with strong real-time performance, high spatial resolution, and low algorithmic time cost, and it plays an important role in real-time river flow velocity measurement. In space-time image velocimetry, a group of velocimetry lines is arranged along the direction of the river to be measured, and the main texture direction angles of the space-time images generated by the velocimetry lines over time and space in the video are analyzed to estimate the one-dimensional time-averaged flow velocity at each velocimetry point, giving comprehensive flow velocity information for every velocimetry point on the cross section.
The method mainly comprises three parts: space-time image generation, texture main direction angle detection, and river flow velocity and discharge calculation. However, because natural rivers are irregular, a poor choice of velocimetry point positions when drawing the velocimetry lines can leave the lines non-parallel to the river flow direction, so that the texture angle of the generated space-time image is inaccurate and the subsequent river flow velocity measurement is affected. The generated space-time image can also contain a large amount of background noise and fringe texture interference due to limitations of the video recording conditions or of weather and lighting, which seriously affects the detection of the main texture direction angle.
the traditional space-time image direction angle detection method has a brightness gradient tensor method, however, the tensor analysis method is very easy to influence the judgment effect due to noise interference. The frequency domain filtering direction angle analysis method is easy to remove noise in the space-time image, identifies clear space-time image textures, and further realizes surface flow rate monitoring, but useful information is often filtered in the frequency domain, and limitations still exist when facing complex natural river environments. Therefore, the reliable pretreatment mode is characterized in that the method can enhance the extraction of the details of the space-time images and reduce the interference of noise, and is an important point of the space-time image flow measurement method.
Disclosure of Invention
The invention aims to overcome the defects in the prior art, and provides an open channel flow measurement method and device based on space-time image texture feature extraction, which can effectively remove noise of texture images, accurately measure the main direction angle of textures of the space-time images, and further improve the accuracy of monitoring the surface flow velocity of open channel video.
In order to achieve the above purpose, the invention is realized by adopting the following technical scheme:
in a first aspect, the present invention provides an open channel flow measurement method based on spatio-temporal image texture feature extraction, including:
reading the change condition of gray values on a velocimetry line within the corresponding time frame number of the open channel video to be detected, and generating an original space-time image;
performing enhancement and noise reduction pretreatment on the original space-time image to obtain a space-time image retaining a specific texture direction angle;
detecting the texture direction angle by adopting a two-dimensional autocorrelation function of the brightness of the space-time image according to the space-time image with the reserved specific texture direction angle, and obtaining the optimal texture main direction angle;
and calculating the flow speed and the total flow of the open channel to be measured according to the acquired main direction angle of the texture.
Further, the reading the change condition of gray values on the velocimetry line within the corresponding time frame number of the open channel video to be detected, generating an original space-time image, includes:
acquiring an open channel video to be detected, and setting a group of velocity measurement lines along the direction of the open channel according to a video image;
extracting gray value information on a velocimetry line frame by frame;
and synthesizing a texture space-time image of L×M pixels from the gray value information on the velocimetry line, wherein the horizontal axis L is the pixel length of the velocimetry line and the vertical axis M is the frame number.
Further, the setting a group of velocimetry lines along the open channel direction according to the video image includes:
for the open channel video to be measured, selecting a fixed time interval Δt, i.e., acquiring the required M-frame open channel video image sequence, and arranging a group of single-pixel-wide velocimetry lines of length L pixels at the cross section to be measured, along the open channel direction in the video, according to an equal-division principle.
Further, the performing the pre-processing of enhancing and denoising the original spatiotemporal image to obtain the spatiotemporal image retaining the specific texture direction angle includes:
the generated original space-time image is subjected to enhancement preprocessing by adopting a second-order frequency-domain differential unsharp enhancement algorithm, so as to obtain a texture-enhanced space-time image;
processing the space-time image after the texture enhancement by adopting a two-dimensional discrete Fourier transform background noise suppression method to obtain a preliminary noise-reduced space-time image;
and obtaining a primary texture main direction angle by applying the Radon transform to the preliminarily noise-reduced space-time image, and then passing the obtained primary texture main direction angle as a direction parameter input to a steerable direction filter function to obtain a space-time image retaining a specific texture direction angle.
Further, the performing enhancement preprocessing on the generated original space-time image by adopting the second-order frequency-domain differential unsharp enhancement algorithm to obtain the texture-enhanced space-time image comprises the following steps:
linear stripe enhancement of the original space-time image is realized by the second-order frequency-domain differential unsharp enhancement algorithm, expressed as:
g(x,y) = f(x,y) + λ·(f_t(x,y) − f_d(x,y));
wherein g(x,y) is the texture-enhanced space-time image; f(x,y) is the input original space-time image; f_t(x,y) is the homomorphically filtered image of the original space-time image; f_d(x,y) is the Gaussian low-pass filtered image of the original space-time image; and λ is a detail enhancement factor, brightness detail enhancement being realized by letting λ control the contribution of the differential component to the original space-time image;
wherein α and β are constants, 0 < α < 1, 60 < β < 120, and K is the differential curvature value, with expression:
K = | |f_ηη| − |f_ξξ| |;
wherein K represents the second-order differential curvature distinguishing high-gradient regions from flat regions, f_ηη is the second derivative in the gradient direction, and f_ξξ is the second derivative in the direction perpendicular to the gradient, expressed as:
f_ηη = (f_x²·f_xx + 2·f_x·f_y·f_xy + f_y²·f_yy) / (f_x² + f_y²);
f_ξξ = (f_y²·f_xx − 2·f_x·f_y·f_xy + f_x²·f_yy) / (f_x² + f_y²);
wherein f_x, f_y, f_xx, f_yy, f_xy represent the first derivatives, the second derivatives, and the mixed second derivative of the original space-time image f(x,y) in the x and y directions, respectively.
Further, the processing the texture-enhanced space-time image by adopting the two-dimensional discrete Fourier transform background noise suppression method to obtain the preliminarily noise-reduced space-time image comprises the following steps:
performing a two-dimensional discrete Fourier transform on the texture-enhanced space-time image g(x,y) to obtain a frequency-domain image G(u,v);
after centering the spectrum of the frequency-domain image G(u,v), filtering it with a two-dimensional Butterworth filter to obtain the processed frequency-domain image G'(u,v);
performing a two-dimensional inverse discrete Fourier transform on the processed frequency-domain image G'(u,v) to obtain the background noise image f_DFT(x,y);
and performing a difference operation between the texture-enhanced space-time image g(x,y) and the background noise image f_DFT(x,y) to obtain the preliminarily noise-reduced space-time image.
Further, the obtaining the primary texture main direction angle by applying the Radon transform to the preliminarily noise-reduced texture space-time image comprises the following steps:
a straight line L in the (x,y) plane is expressed as: x·cosθ + y·sinθ = ρ, where ρ represents the distance from the line to the origin and θ represents the angle between the perpendicular from the origin to the line and the x-axis;
the line L is mapped to a point (ρ,θ) in Radon space, the value at (ρ,θ) being the accumulated sum of the gray values of the preliminarily noise-reduced space-time image along the line L; the continuous-space Radon transform is then:
R(ρ,θ) = ∬ f(x,y)·δ(ρ − x·cosθ − y·sinθ) dx dy;
wherein R(ρ,θ) represents the integral value for a given (ρ,θ), i.e., along the straight line L at position ρ in the θ direction; f(x,y) is the preliminarily noise-reduced space-time image; points (x,y) on the line L satisfy δ(ρ − x·cosθ − y·sinθ) = 1, while all other points not on L give δ(ρ − x·cosθ − y·sinθ) = 0;
and in the R(ρ,θ) transform domain, the angle corresponding to the maximum line integral value is the primary texture main direction angle.
Further, the passing the obtained primary texture main direction angle as a direction parameter input to the steerable direction filter function to obtain the space-time image retaining the specific texture direction angle comprises the following steps:
using a Gaussian function as the kernel function of the steerable filter, and designing a group of direction-adjustable basis functions from the second derivatives of the Gaussian function;
calculating the steering formula and interpolation functions of the direction-adjustable filter of the second derivative;
and convolving the preliminarily noise-reduced texture space-time image with the combined basis filter bank, multiplying each result by the corresponding interpolation function, and summing the partial results to obtain the space-time image G_2(x,y) retaining the specific texture direction angle.
Further, the detecting the texture direction angle by adopting the two-dimensional autocorrelation function of the space-time image brightness according to the space-time image retaining the specific texture direction angle, and obtaining the optimal texture main direction angle, comprises the following steps:
solving the two-dimensional autocorrelation function image of the space-time image retaining the specific texture direction angle;
transforming the two-dimensional autocorrelation function image into log-polar coordinates to magnify the information around the center point;
calculating the required direction angle extrema in the log-polar image;
and obtaining the angle maximum in the averaged direction distribution, i.e., the optimal texture main direction angle.
In a second aspect, the present invention also provides an open channel flow measurement device based on space-time image texture feature extraction, comprising:
an image acquisition module, used for reading the change of gray values on the velocimetry line within the corresponding number of time frames of the open channel video to be measured and generating the original space-time image;
an image preprocessing module, used for performing enhancement and noise reduction preprocessing on the original space-time image to obtain the space-time image retaining the specific texture direction angle;
a main direction angle identification module, used for detecting the texture direction angle by adopting the two-dimensional autocorrelation function of the space-time image brightness according to the space-time image retaining the specific texture direction angle, and obtaining the optimal texture main direction angle;
and a flow velocity and discharge calculation module, used for calculating the flow velocity and total discharge of the open channel to be measured according to the obtained texture main direction angle.
Compared with the prior art, the invention has the beneficial effects that:
according to the open channel flow measurement based on space-time image texture feature extraction, a space-time image is generated according to a set of velocity lines, texture detail enhancement is carried out on the space-time image to obtain a texture enhanced image, then frequency domain filtering is carried out to obtain a background denoising image, the space-time image after enhancing denoising is subjected to texture main direction angle identification, noise of the texture image can be effectively removed, the texture main direction angle of the space-time image is measured, and further accuracy of open channel video surface flow velocity monitoring can be improved. The texture enhancement filtering method is combined, the measurement problem of river surface flow rate is converted into the identification problem of the main direction angle of the texture of the space-time image, the non-contact flow rate measurement is realized, and the efficiency of river surface flow rate measurement can be greatly improved.
Drawings
FIG. 1 is a flow chart of an open channel flow measurement method based on space-time image texture feature extraction provided by an embodiment of the invention;
FIG. 2 is a schematic view of a river video scene provided by an embodiment of the present invention;
FIG. 3 is a schematic view of an original spatiotemporal image according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a space-time image after second-order frequency-domain differential unsharp enhancement according to an embodiment of the present invention;
FIG. 5 is a Radon transform domain image according to an embodiment of the present invention;
fig. 6 is a two-dimensional autocorrelation function image in accordance with an embodiment of the present invention.
Detailed Description
The following detailed description of the technical solutions of the present invention is made by the accompanying drawings and specific embodiments, and it should be understood that the specific features of the embodiments and embodiments of the present application are detailed descriptions of the technical solutions of the present application, and not limiting the technical solutions of the present application, and the technical features of the embodiments and embodiments of the present application may be combined with each other without conflict.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
The invention provides an open channel flow measurement method and device based on space-time image texture feature extraction, wherein the method comprises the following steps:
reading the change condition of gray values on a velocimetry line within the corresponding time frame number of the open channel video to be detected, and generating an original space-time image;
performing enhancement and noise reduction pretreatment on the original space-time image to obtain a space-time image retaining a specific texture direction angle;
and detecting the texture direction angle by adopting a two-dimensional autocorrelation function of the brightness of the space-time image according to the space-time image with the reserved specific texture direction angle, and obtaining the optimal texture main direction angle.
And calculating and obtaining the flow velocity and the total flow of the open channel to be measured according to the obtained main direction angle of the texture.
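As a concrete illustration of the final step, the standard STIV conversion from texture angle to surface velocity can be sketched as follows. The space-time image carries pixel distance on one axis and frame number on the other, so the tangent of the texture main direction angle is the surface displacement in pixels per frame; scaling by the ground resolution and the frame interval gives a velocity. This is a minimal sketch under assumed, illustrative parameter names; the patent text does not spell out this formula.

```python
import math

def stiv_velocity(phi_deg, meters_per_pixel, frame_interval_s):
    """Convert a space-time image texture angle into a surface velocity.

    The space-time image has pixel distance on the horizontal axis and frame
    number on the vertical axis, so tan(phi) is the displacement in pixels
    per frame (standard STIV relation; parameter names are illustrative).
    """
    pixels_per_frame = math.tan(math.radians(phi_deg))
    return pixels_per_frame * meters_per_pixel / frame_interval_s

# a 45-degree texture angle, 0.02 m per pixel, 25 fps (frame interval 0.04 s)
v = stiv_velocity(45.0, 0.02, 0.04)
```

For example, a 45° texture angle at 0.02 m/pixel and 25 fps corresponds to one pixel of displacement per frame, i.e., 0.5 m/s.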
In practical applications, the following examples are given for specific methods and apparatuses.
Example 1:
fig. 1 is a flowchart of an open channel flow measurement method based on space-time image texture feature extraction according to the present embodiment. The flow chart merely shows the logical sequence of the method according to the present embodiment, and the steps shown or described may be performed in a different order than shown in fig. 1 in other possible embodiments of the invention without mutual conflict.
The open channel flow measurement method based on space-time image texture feature extraction provided in this embodiment may be applied to a terminal, and may be performed by a space-time image direction angle recognition device estimated based on a two-dimensional autocorrelation function, where the device may be implemented by software and/or hardware, and the device may be integrated in the terminal, for example: any smart phone, tablet computer or computer device with communication function.
The open channel measured in this embodiment is a river, and referring to fig. 1, the method of this embodiment specifically includes the following steps:
s1: generating a space-time image with corresponding size in fixed time through a velocimetry line
Shooting the river video to be measured, as shown in fig. 2; for the river video to be measured, a fixed time interval Δt is selected, i.e., the required M-frame river video image sequence is acquired; a group of single-pixel-wide velocimetry lines of length L pixels is arranged at the cross section to be measured, along the river direction in the video, according to an equal-division principle; gray value information on the velocimetry lines is extracted frame by frame; and finally a texture space-time image of L×M pixels is synthesized, wherein the horizontal axis is the pixel distance along the velocimetry line and the vertical axis is the frame number. The original space-time image is shown in fig. 3.
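The space-time image synthesis described in this step can be sketched in a few lines. In this sketch the velocimetry line is a single-pixel-wide horizontal segment, and the synthetic drifting-streak frames are an illustrative stand-in for a real open channel video:

```python
import numpy as np

def build_sti(frames, row, col_start, col_end):
    """Stack the gray values on one horizontal velocimetry line, frame by
    frame, into a texture space-time image (one row per frame, one column
    per pixel of the line).  `frames` is an iterable of 2-D grayscale
    arrays; a horizontal line is a simplification of the patent's lines
    drawn along the channel direction."""
    return np.stack([f[row, col_start:col_end] for f in frames], axis=0)

# synthetic "river": a bright streak drifting one pixel per frame
M, L = 32, 64
frames = []
for t in range(M):
    img = np.zeros((8, 128))
    img[4, (10 + t) % 128] = 255.0   # moving tracer on the velocimetry line
    frames.append(img)

sti = build_sti(frames, row=4, col_start=0, col_end=L)  # M x L space-time image
```

A tracer moving at constant speed traces a straight oblique streak in `sti`, whose angle is what the later steps estimate.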
S2: preprocessing the generated space-time image to obtain a texture filtering image after enhancement and noise reduction
S21: A second-order frequency-domain differential unsharp enhancement algorithm is used to realize linear stripe enhancement of the space-time image. The enhancement steps are as follows:
Step one: as shown in formula (1):
g(x,y) = f(x,y) + λ·(f_t(x,y) − f_d(x,y))  (1)
wherein g(x,y) is the texture-enhanced space-time image; f(x,y) is the input original space-time image; f_t(x,y) is the homomorphically filtered image of the original space-time image; f_d(x,y) is the Gaussian low-pass filtered image of the original space-time image; and λ is the detail enhancement factor:
brightness detail enhancement is realized by letting λ control the contribution of the differential component to the original space-time image;
wherein α and β are constants, 0 < α < 1, 60 < β < 120, and K is the differential curvature value.
Step two: calculate the differential curvature value, expressed as formula (3):
K = | |f_ηη| − |f_ξξ| |  (3)
wherein K represents the second-order differential curvature distinguishing high-gradient regions from flat regions, f_ηη is the second derivative in the gradient direction, and f_ξξ is the second derivative in the direction perpendicular to the gradient; the specific formulas (4) and (5) are:
f_ηη = (f_x²·f_xx + 2·f_x·f_y·f_xy + f_y²·f_yy) / (f_x² + f_y²)  (4)
f_ξξ = (f_y²·f_xx − 2·f_x·f_y·f_xy + f_x²·f_yy) / (f_x² + f_y²)  (5)
wherein f_x, f_y, f_xx, f_yy, f_xy represent the first derivatives, the second derivatives, and the mixed second derivative of the original space-time image f(x,y) in the x and y directions respectively. The derivation of these difference variables is as follows:
Since the image f(x,y) is a continuous, twice-differentiable function of two variables, in a neighbourhood parallel to the x-axis the function varies only with the x-coordinate, so f(x,y) can be expanded as a Taylor series about the neighbouring point (i,j):
f(x,y) = f_{i,j} + f_x·(x − x_{i,j}) + (1/2)·f_xx·(x − x_{i,j})² + …
At the pixel points (i−1,j) and (i+1,j), x equals x_{i,j} − 1 and x_{i,j} + 1 respectively, so x − x_{i,j} equals −1 and 1; substituting gives:
f_{i+1,j} = f_{i,j} + f_x + (1/2)·f_xx + (1/6)·f_xxx + …
f_{i−1,j} = f_{i,j} − f_x + (1/2)·f_xx − (1/6)·f_xxx + …
Since the terms of third order and above in the Taylor expansion are small, they can be neglected, and the two equations simplify to:
f_{i+1,j} ≈ f_{i,j} + f_x + (1/2)·f_xx
f_{i−1,j} ≈ f_{i,j} − f_x + (1/2)·f_xx
Adding the two simplified equations gives the difference formula along the x-axis:
(f_xx)_{i,j} = f_{i+1,j} − 2·f_{i,j} + f_{i−1,j};  (12)
By the same reasoning, the difference formula along the y-axis is:
(f_yy)_{i,j} = f_{i,j+1} − 2·f_{i,j} + f_{i,j−1};  (14)
Combining (f_x)_{i,j} = (f_{i+1,j} − f_{i−1,j})/2 and (f_y)_{i,j} = (f_{i,j+1} − f_{i,j−1})/2 then yields the second-order mixed partial derivative difference formula:
(f_xy)_{i,j} = (f_{i+1,j+1} − f_{i+1,j−1} − f_{i−1,j+1} + f_{i−1,j−1}) / 4  (15)
Substituting these results into formulas (4) and (5) gives the differential curvature value K, from which the enhancement factor λ is obtained; λ multiplies the difference image between the homomorphically filtered and Gaussian low-pass filtered images, and the original space-time image is finally added to obtain the texture-enhanced space-time image g(x,y), as shown in fig. 4.
S22: Image noise suppression using the two-dimensional discrete Fourier transform background noise suppression method
In this embodiment, the texture-enhanced space-time image g(x,y) is converted from the space domain to the frequency domain, i.e., a two-dimensional discrete Fourier transform is performed on the g(x,y) image to obtain the frequency-domain image G(u,v);
the background noise suppression method comprises the following steps:
Step i: perform the two-dimensional discrete Fourier transform on the image, as shown in formula (16):
G(u,v) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} g(x,y)·e^{−j2π(ux/M + vy/N)}  (16)
where g(x,y) is the input texture-enhanced space-time image, G(u,v) is its two-dimensional DFT, M and N are the numbers of rows and columns of the space-time image, and u and v are the frequency variables of the DFT, with u = 0, 1, 2, …, M−1 and v = 0, 1, 2, …, N−1. The transformed frequency-domain image is then displayed with its spectrum centered.
Step ii: apply the two-dimensional Butterworth filter to the spectrum-centered image;
in this embodiment, the two-dimensional Butterworth filter is applied to the spectrum-centered image as shown in formula (17):
G'(u,v) = G(u,v)·GLPF(u,v)  (17)
wherein G'(u,v) is the frequency-domain image after frequency-domain filtering, and GLPF(u,v) is the two-dimensional Butterworth filter, as shown in formula (18):
GLPF(u,v) = 1 / (1 + [D(u,v)/D_0]^{2n})  (18)
wherein D_0 is the cutoff frequency, n is the order of the filter, and D(u,v) is the distance from the frequency point (u,v) to the center of the frequency-domain image; together they determine the degree of suppression of the image frequencies.
Step iii: according to the frequency-domain image G'(u,v) after frequency-domain filtering, perform the two-dimensional inverse discrete Fourier transform to obtain the low-frequency information of the enhanced space-time image, i.e., the background noise image;
in this embodiment, the background noise image obtained by the two-dimensional inverse discrete Fourier transform is shown in formula (19):
f_DFT(x,y) = (1/(M·N))·Σ_{u=0}^{M−1} Σ_{v=0}^{N−1} G'(u,v)·e^{j2π(ux/M + vy/N)}  (19)
A difference operation between the texture-enhanced space-time image g(x,y) and the background noise image f_DFT(x,y) then yields the preliminarily noise-reduced space-time image, which contains only the high-frequency texture detail and residual noise.
S23: Obtaining the primary texture main direction angle of the texture space-time image by the Radon transform
In this embodiment, the Radon transform is used to obtain the primary texture main direction angle of the preliminarily noise-reduced space-time image, as shown in formulas (20) and (21):
R_L = ∫_L f(x,y) ds  (20)
Formula (20) represents the line integral of the image along the line L, i.e., the accumulated sum of the pixel gray values of the image over the region of the line L, where L is a straight line in some direction in the plane, ds is the differential element along the line, and R_L is the line integral value.
For ease of solution, the line L is written in angular form: x·cosθ + y·sinθ = ρ, where ρ represents the distance from the line to the origin and θ represents the angle between the perpendicular from the origin to the line and the x-axis. The line x·cosθ + y·sinθ = ρ in the (x,y) plane is then mapped to the point (ρ,θ) in Radon space, the value at (ρ,θ) being the accumulated sum of the pixel gray values of the image over the region of the line L, and the continuous-space Radon transform becomes:
R(ρ,θ) = ∬ f(x,y)·δ(ρ − x·cosθ − y·sinθ) dx dy  (21)
wherein R(ρ,θ) is the Radon transform domain image, representing the integral value for a given (ρ,θ), i.e., along the line L at position ρ in the θ direction; f(x,y) is the preliminarily noise-reduced space-time image; points (x,y) on the line L satisfy δ(ρ − x·cosθ − y·sinθ) = 1, while all other points not on L give δ(ρ − x·cosθ − y·sinθ) = 0.
If n straight lines in the same direction exist in the image, i.e., lines with the same θ but different ρ, each (ρ,θ) corresponds to one line integral value; applying the Radon transform formula yields n line integral values, and different line integral values present pixels of different brightness. In the (ρ,θ) domain the horizontal axis is the angle θ, the vertical axis is the distance ρ, and R(ρ,θ) represents the line integral value, as shown in fig. 5.
Radon-transform line detection essentially maps all points of the original (x,y) plane into the Radon transform domain: all the points of a line at a given position in the (x,y) plane map to one point of R(ρ,θ), where their brightness values accumulate. When (x,y) contains a large number of lines in the same direction, there is an extreme point of the brightness value in R(ρ,θ), and the angle corresponding to that point is the direction angle of those lines in the original (x,y) plane. That is, in the R(ρ,θ) transform domain, the angle corresponding to the maximum line integral value is the sought primary texture main direction angle.
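A coarse discrete version of this extremum search can be sketched by rotating the image for each candidate θ and summing columns; the angle whose projection contains the largest value plays the role of the maximum of R(ρ,θ). The rotation-based approximation and the angle grid are implementation choices, not the patent's exact procedure:

```python
import numpy as np
from scipy.ndimage import rotate

def radon_main_angle(img, angles=None):
    """Approximate Radon-based detection of the texture main direction:
    rotating by theta and summing along columns approximates the line
    integrals at that angle; the largest projection value over all
    (rho, theta) cells picks the primary angle."""
    if angles is None:
        angles = np.arange(0, 180)          # 1-degree grid, illustrative
    best_theta, best_val = 0, -np.inf
    for th in angles:
        proj = rotate(img, th, reshape=False, order=1).sum(axis=0)
        if proj.max() > best_val:
            best_val, best_theta = proj.max(), th
    return best_theta

# a single vertical streak: its projection is sharpest at 0 degrees
streak = np.zeros((64, 64))
streak[:, 32] = 1.0
theta = radon_main_angle(streak, angles=np.arange(0, 180, 15))
```

In the full method this coarse angle is only a direction parameter handed to the steerable filter; the refined angle comes later from the autocorrelation analysis.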
S24: adopting a steerable second-order Gaussian adjustable direction filter to carry out track enhancement and noise denoising on the texture space-time image;
track enhancement and noise denoising steps are as follows:
step a: using a Gaussian function as a kernel function of the steerable filter;
in this embodiment, the Gaussian kernel function is as follows (22):
G(x, y) = e^(−(x²+y²))   (22)
Step b: design a set of direction-adjustable basis functions of the second derivatives of the Gaussian function, comprising the three basis functions G_xx, G_yy and G_xy;
the second derivative of G with respect to x is as follows (23):
G_xx = ∂²G/∂x² = (4x² − 2)e^(−(x²+y²))   (23)
the second derivative of G with respect to y is as follows (24):
G_yy = ∂²G/∂y² = (4y² − 2)e^(−(x²+y²))   (24)
the mixed second derivative of G with respect to x and y is as follows (25):
G_xy = ∂²G/∂x∂y = 4xy·e^(−(x²+y²))   (25)
G_xx and G_yy are taken as the 0° and 90° direction basis filters respectively, and linear combinations of G_xy with G_xx and G_yy can represent filters at other angles, for example G_xx − G_xy for 30°, G_yy − G_xy for 60° and G_yy + G_xy for 120°;
the three basis filters used are therefore G_xx, G_yy and G_xy.
Step c: steering formula and interpolation function of adjustable direction filter for calculating second derivative
In this embodiment, the steering formula of the adjustable direction filter is as follows (26):
G_2^θ(x, y) = k_1(θ)·G_xx(x, y) + k_2(θ)·G_xy(x, y) + k_3(θ)·G_yy(x, y)   (26)
the interpolation functions used are the following equation (27):
k_1(θ) = cos²θ, k_2(θ) = 2 cosθ sinθ, k_3(θ) = sin²θ   (27)
Step d: convolve the input preliminary noise-reduced space-time image with the combined basis filter bank, multiply each response by the corresponding interpolation function, and sum the partial results to obtain the space-time image G_2(x, y) after the steerable filter, i.e. the space-time image G_2(x, y) preserving the specific texture direction angle.
In this embodiment, the steerable filter output is calculated as the following equation (28):
G_2(x, y) = k_1(θ)·(G_xx ∗ f̃)(x, y) + k_2(θ)·(G_xy ∗ f̃)(x, y) + k_3(θ)·(G_yy ∗ f̃)(x, y)   (28)
where ∗ denotes two-dimensional convolution and f̃(x, y) is the preliminary noise-reduced space-time image.
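A minimal sketch of steps a–d, assuming the standard second-derivative Gaussian basis and the interpolation functions cos²θ, 2 cosθ sinθ, sin²θ; the kernel size and σ are illustrative choices, not values from the patent:

```python
import numpy as np
from scipy import ndimage

def gaussian_second_derivative_bank(size=9, sigma=1.5):
    # Sampled second derivatives of a 2-D Gaussian: the three basis
    # filters G_xx, G_yy, G_xy of the steerable filter.
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    gxx = (x**2 / sigma**4 - 1.0 / sigma**2) * g
    gyy = (y**2 / sigma**4 - 1.0 / sigma**2) * g
    gxy = (x * y / sigma**4) * g
    return gxx, gyy, gxy

def steer(img, theta, size=9, sigma=1.5):
    # Steered response: convolve with each basis filter once, then
    # combine with the interpolation functions cos^2, 2*cos*sin, sin^2.
    gxx, gyy, gxy = gaussian_second_derivative_bank(size, sigma)
    rxx = ndimage.convolve(img, gxx, mode='nearest')
    ryy = ndimage.convolve(img, gyy, mode='nearest')
    rxy = ndimage.convolve(img, gxy, mode='nearest')
    c, s = np.cos(theta), np.sin(theta)
    return c * c * rxx + 2.0 * c * s * rxy + s * s * ryy
```

Because the three basis convolutions are computed once, the response at any steering angle θ is obtained with only the scalar recombination, which is the point of the steerable design.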
s3: constructing a two-dimensional autocorrelation function image according to the two-dimensional autocorrelation function of the brightness of the processed texture filtering image, and obtaining the principal direction angle of the texture
The main direction angle of the texture is obtained by the following steps:
step A: detecting the direction angle of the texture track in the space-time image by adopting a two-dimensional autocorrelation function of the brightness of the processed space-time image;
as shown in formula (29):
R(τ_x, τ_y) = ∬ G_2(x, y)·G_2(x + τ_x, y + τ_y) dx dy   (29)
wherein R(τ_x, τ_y) is the two-dimensional autocorrelation function of the space-time image brightness, G_2(x, y) is the input space-time image preserving the specific texture direction angle, and τ_x, τ_y are translation parameters.
Because directly computing the autocorrelation function of the space-time image brightness is computationally expensive, it is not calculated directly; instead it is solved indirectly as the inverse Fourier transform of the power spectral density function, i.e. by using the Wiener–Khinchin theorem, which states that the inverse Fourier transform of the power spectral density function is the autocorrelation function. The specific process is as follows:
the input image G_2(x, y) is subjected to a two-dimensional Fourier transform and the modulus is squared to obtain the power spectral density function of G_2(x, y); an inverse Fourier transform then yields the two-dimensional autocorrelation function image of the space-time image brightness.
As shown in formulas (30) (31):
F(ξ, η) = ∬ G_2(x, y)·e^(−j2π(ξx + ηy)) dx dy   (30)
the above is the two-dimensional Fourier transform of G_2(x, y), wherein ξ and η represent the horizontal and vertical coordinates in the frequency domain respectively, and F(ξ, η) is the frequency component of the image in the transformed frequency domain.
R(τ_x, τ_y) = F^(−1)[|F(ξ, η)|²]   (31)
the above expression takes the modulus square of F(ξ, η) to obtain the power spectral density function; a final inverse Fourier transform then yields the two-dimensional autocorrelation function image of the brightness of the space-time image G_2(x, y), as shown in fig. 6.
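The Wiener–Khinchin shortcut of the two paragraphs above can be sketched as follows (a hedged example; the zero-mean normalization and the `fftshift` centring are implementation choices, not steps stated in the patent):

```python
import numpy as np

def autocorr2d(img):
    # Wiener-Khinchin: autocorrelation = inverse FFT of the power
    # spectral density |F|^2, computed instead of the direct (and much
    # more expensive) shift-and-multiply double sum.
    img = img - img.mean()           # zero-mean brightness
    F = np.fft.fft2(img)
    psd = np.abs(F) ** 2             # power spectral density
    R = np.fft.ifft2(psd).real       # circular autocorrelation
    return np.fft.fftshift(R)        # move the zero-lag peak to the centre
```

Note that the FFT-based result is a circular autocorrelation; for the oblique-pattern angle estimate that follows this is usually acceptable, and zero-padding could be added if linear correlation were required.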
Step B: the obtained two-dimensional autocorrelation function image presents an oblique pattern corresponding to the actual texture gradient; it is normalized to obtain the normalized two-dimensional autocorrelation function image.
Step C: amplifying the central point information of the logarithmic polar coordinates for the two-dimensional autocorrelation function image;
in the present embodiment, the logarithmic polar transformation formulas (32) (33) are as follows:
θ_2 = arctan(τ_y/τ_x)   (32)
ρ_2 = M·log(√(τ_x² + τ_y²))   (33)
wherein M is a strengthening coefficient.
Step D: calculating a required direction angle extremum in the logarithmic polar coordinate image;
the statistical direction average distribution formulas (34) (35) are as follows:
μ(θ_2) = (1/max(ρ_2))·Σ_(ρ_2=1)^(max(ρ_2)) R(ρ_2, θ_2)   (34)
max(ρ_2) = M·log[min(max(τ_x), max(τ_y))]   (35).
Step E: calculating the angle maximum in the direction average distribution μ(θ_2);
the angle maximum formula (36) is as follows:
θ′ = argmax_(θ_2) μ(θ_2)   (36)
wherein θ′ is the sought principal direction angle of the space-time image texture.
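Steps B–E can be sketched as below; the sampling resolution, the nearest-neighbour lookup, and the function name are assumptions, while θ₂ = arctan(τ_y/τ_x), ρ₂ = M·log r, the ρ-averaged direction distribution and its angle maximum follow the description above:

```python
import numpy as np

def texture_angle_from_autocorr(R, M=16.0, n_theta=180, n_rho=40):
    # Resample the centred autocorrelation image R into log-polar
    # coordinates theta2 = arctan(tau_y / tau_x), rho2 = M * log(r),
    # average over rho2 for each angle, and return (in degrees) the
    # angle theta' at which the direction average distribution peaks.
    h, w = R.shape
    cy, cx = h // 2, w // 2
    rho_max = M * np.log(min(cx, cy))          # cf. max(rho2) above
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(0.0, rho_max, n_rho)
    r = np.exp(rhos / M)                       # log-spaced radii, 1 .. min(cx, cy)
    mu = np.zeros(n_theta)
    for k, t in enumerate(thetas):
        xs = np.clip((cx + r * np.cos(t)).astype(int), 0, w - 1)
        ys = np.clip((cy + r * np.sin(t)).astype(int), 0, h - 1)
        mu[k] = R[ys, xs].mean()               # direction average distribution
    return np.degrees(thetas[np.argmax(mu)])
```

The log spacing concentrates samples near the centre of the autocorrelation image, which is the "central point information amplification" role of the strengthening coefficient M.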
S4: according to the main direction angle of the texture, calculating the flow velocity and flow distribution of the corresponding speed measuring point of the river (river channel) video to be measured
In this embodiment, the actual distance of each pixel is calculated from the correspondence between the actual rectangular coordinates and the image plane coordinates. Points A, B and C, D parallel to the two banks of the river are selected to form a rectangle; the actual space rectangular coordinates of the four points can be obtained by manual measurement, and the conversion relation between the image plane coordinates (u, v) and the actual space rectangular coordinates (X, Y) is:
X = (m_11·u + m_12·v + m_13)/(m_31·u + m_32·v + m_33), Y = (m_21·u + m_22·v + m_23)/(m_31·u + m_32·v + m_33)
wherein m_11, m_12, m_13, m_21, m_22, m_23, m_31, m_32, m_33 are the conversion coefficients between the image plane coordinates and the actual space rectangular coordinates, and u and v are image plane coordinates obtained directly from the river video to be measured.
In this embodiment, a total station is used to obtain the actual space rectangular coordinates of the four ground control points, (0, 0), (9.49, 0), (22.5, 48.7) and (−24.9, 45.2); the coordinates of the corresponding marker points found in the image are (236, 902), (279, 1323), (1052, 896) and (465, 111). The conversion coefficients between the image plane coordinates and the actual space rectangular coordinates are obtained from these eight points, and finally the conversion coefficients and the actual coordinates are used to obtain the actual river-surface distance S_x corresponding to each pixel length in the image.
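A sketch of solving the eight conversion coefficients from four ground control points (assuming m33 is normalized to 1, a common convention that the patent does not state explicitly; the function name is illustrative):

```python
import numpy as np

def fit_projective(img_pts, world_pts):
    # Solve the 8 unknown coefficients m11..m32 (m33 fixed to 1) of
    #   X = (m11*u + m12*v + m13) / (m31*u + m32*v + 1)
    #   Y = (m21*u + m22*v + m23) / (m31*u + m32*v + 1)
    # from four (u, v) <-> (X, Y) ground-control-point pairs.
    A, b = [], []
    for (u, v), (X, Y) in zip(img_pts, world_pts):
        A.append([u, v, 1, 0, 0, 0, -u * X, -v * X]); b.append(X)
        A.append([0, 0, 0, u, v, 1, -u * Y, -v * Y]); b.append(Y)
    m = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)[0]
    M = np.append(m, 1.0).reshape(3, 3)

    def to_world(u, v):
        # Map an image point to actual space rectangular coordinates.
        X, Y, w = M @ np.array([u, v, 1.0])
        return X / w, Y / w

    return to_world
```

With exactly four control points the 8×8 system is solved exactly; with more points the same least-squares call averages out measurement error.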
The river surface flow velocity corresponding to the river video to be measured is calculated as follows:
V = (S_x/S_t)·tanθ
wherein S_x represents the actual river-surface distance represented by each pixel length (in m/pixel), calculated by the projective transformation; S_t represents the time occupied by each frame, i.e. S_t = 1/fps (in s/frame), with fps the frame rate (in frames/s); and tanθ represents the tangent of the space-time image texture slope (in pixels/frame).
The river discharge is calculated by the velocity-area method, using the following formula (38):
Q = η·Σ_i V_i·A_i   (38)
wherein Q is the river discharge; η represents the surface velocity coefficient; V_i represents the surface flow velocity of section i; and A_i represents the area of section i.
In this way the flow velocity V over the period of the captured video is obtained on each velocimetry line, and the river discharge Q is then calculated by the velocity-area method.
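A worked numeric example of the velocity and discharge formulas; all numbers below are illustrative assumptions, not measurements from the patent:

```python
import math

# Surface velocity on one velocimetry line:
S_x = 0.05                      # m per pixel on the river surface (assumed)
fps = 25.0                      # camera frame rate, frames per second (assumed)
theta = math.radians(35.0)      # texture principal direction angle (assumed)

# tan(theta) is the surface displacement in pixels per frame, so
# V = S_x * fps * tan(theta) is in metres per second.
V = S_x * fps * math.tan(theta)

# Discharge by the velocity-area method, Q = eta * sum(V_i * A_i):
eta = 0.85                      # surface velocity coefficient (assumed)
V_i = [V, 0.9 * V, 0.8 * V]     # per-section surface velocities, m/s (assumed)
A_i = [3.0, 4.2, 2.8]           # per-section areas, m^2 (assumed)
Q = eta * sum(v * a for v, a in zip(V_i, A_i))
```

The unit check is the useful part: (m/pixel)·(frames/s)·(pixels/frame) = m/s for V, and (m/s)·m² summed over sections gives m³/s for Q.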
Example 2
The embodiment provides an open channel flow measurement device based on space-time image texture feature extraction, the device comprises:
the image acquisition module is used for reading the change condition of gray values on the velocimetry line within the corresponding time frame number of the open channel video to be detected and generating an original space-time image;
the image preprocessing module is used for carrying out enhancement and noise reduction preprocessing on the original space-time image to obtain a space-time image retaining a specific texture direction angle;
the main direction angle identification module is used for detecting the texture direction angle by adopting a two-dimensional autocorrelation function of the brightness of the space-time image, according to the space-time image retaining a specific texture direction angle, and obtaining the optimal texture main direction angle;
and the flow velocity and flow quantity calculation module is used for calculating the flow velocity and the total flow quantity of the open channel to be measured according to the detected main direction angle of the texture.
The open channel flow measurement device based on the space-time image texture feature extraction provided by the embodiment of the invention can execute the open channel flow measurement method based on the space-time image texture feature extraction provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and variations could be made by those skilled in the art without departing from the technical principles of the present invention, and such modifications and variations should also be regarded as being within the scope of the invention.

Claims (10)

1. The open channel flow measurement method based on space-time image texture feature extraction is characterized by comprising the following steps:
reading the change condition of gray values on a velocimetry line within the corresponding time frame number of the open channel video to be detected, and generating an original space-time image;
performing enhancement and noise reduction pretreatment on the original space-time image to obtain a space-time image retaining a specific texture direction angle;
detecting the texture direction angle by adopting a two-dimensional autocorrelation function of the brightness of the space-time image according to the space-time image with the reserved specific texture direction angle, and obtaining the optimal texture main direction angle;
and calculating the flow speed and the total flow of the open channel to be measured according to the acquired main direction angle of the texture.
2. The method for measuring the flow rate of the open channel based on the extraction of the texture features of the space-time image according to claim 1, wherein the step of reading the change condition of the gray value on the velocimetry line within the corresponding time frame number of the open channel video to be measured to generate the original space-time image comprises the following steps:
acquiring an open channel video to be detected, and setting a group of velocity measurement lines along the direction of the open channel according to a video image;
extracting gray value information on a velocimetry line frame by frame;
and synthesizing a piece of texture space-time image of L multiplied by M pixels according to gray value information on the velocimetry line, wherein the horizontal axis L is the pixel distance of the velocimetry line, and the vertical axis M is the frame number.
3. The method for measuring the open channel flow based on the space-time image texture feature extraction according to claim 2, wherein the step of setting a set of velocimetry lines along the open channel direction according to the video image comprises the steps of:
for the open channel video to be measured, a fixed time interval Δt is selected to acquire the required M-frame open channel video image sequence, and a set of velocimetry lines, each one pixel wide and L pixels long, is arranged at the section to be measured along the open channel direction of the video according to an equal-division principle.
4. The open channel flow measurement method based on spatio-temporal image texture feature extraction according to claim 1, wherein the performing enhancement and noise reduction pretreatment on the original spatio-temporal image to obtain a spatio-temporal image retaining a specific texture direction angle comprises:
the generated original space-time image is subjected to enhancement pretreatment by adopting a second-order frequency domain differential unsharp enhancement algorithm, so that a space-time image after texture enhancement is obtained;
processing the space-time image after the texture enhancement by adopting a two-dimensional discrete Fourier transform background noise suppression method to obtain a preliminary noise-reduced space-time image;
and obtaining a primary texture main direction angle by adopting a Radon transformation method on the preliminary noise-reduced space-time image, and then transmitting the obtained primary texture main direction angle as a direction parameter input to a steerable direction filter function to obtain a space-time image retaining a specific texture direction angle.
5. The open channel flow measurement method based on space-time image texture feature extraction according to claim 4, wherein the performing enhancement pretreatment on the generated original space-time image by adopting a second-order frequency domain differential unsharp enhancement algorithm to obtain a space-time image after texture enhancement comprises the following steps:
the linear stripe enhancement of the original space-time image is realized by using a second-order frequency domain differential inverse sharpening enhancement algorithm, which is expressed as follows:
g(x,y)=f(x,y)+λ(f t (x,y)-f d (x,y));
wherein g(x, y) is the space-time image after texture enhancement; f(x, y) is the input original space-time image; f_t(x, y) is the homomorphically filtered image of the original space-time image; f_d(x, y) is the Gaussian low-pass filtered image of the original space-time image; λ is a detail enhancement factor, and brightness detail enhancement is realized by λ controlling the contribution of the differential component to the original space-time image, with the expression:
wherein α and β are constants, 0 < α < 1, 60 < β < 120, and K is the differential curvature value, with the expression:
wherein K represents the second-order differential curvature value distinguishing the high-gradient region and the flat region, combining the second derivative in the gradient direction with the second derivative in the direction perpendicular to the gradient, expressed as:
wherein f_x, f_y, f_xx, f_yy and f_xy represent respectively the first derivatives, the second derivatives and the mixed second derivative of the original space-time image f(x, y) in the x and y directions.
6. The open channel flow measurement method based on space-time image texture feature extraction according to claim 4, wherein the space-time image after texture enhancement is processed by a two-dimensional discrete Fourier transform background noise suppression method to obtain a preliminary noise-reduced space-time image; comprising the following steps:
performing two-dimensional discrete Fourier transform on the space-time image G (x, y) with the enhanced texture to obtain a frequency domain image G (u, v);
after the frequency domain image G(u, v) is displayed with its spectrum centred, a two-dimensional Butterworth filter is adopted to filter the frequency domain image, obtaining the processed frequency domain image G′(u, v);
the processed frequency domain image G′(u, v) is transformed back to the spatial domain by an inverse two-dimensional discrete Fourier transform to obtain the background noise image f_DFT(x, y);
the texture-enhanced space-time image g(x, y) and the background noise image f_DFT(x, y) are subjected to a difference operation to obtain the preliminary noise-reduced space-time image.
7. The open channel flow measurement method based on space-time image texture feature extraction according to claim 4, wherein the obtaining the preliminary texture main direction angle by using a Radon transform method for the preliminary noise-reduced space-time image comprises:
a straight line L in the (x, y) plane space is expressed as: xcos θ+ysinθ=ρ, where ρ represents the distance from the point (x, y) to the origin, θ represents the angle between the perpendicular to the line from the origin and the x-axis;
mapping the straight line L into a point (ρ, θ) of the Radon space, wherein the value of the point (ρ, θ) is the accumulated brightness of the preliminary noise-reduced space-time image f̃(x, y) along the straight line L; the Radon transform of the continuous space is then:
R(ρ, θ) = ∬ f̃(x, y)·δ(ρ − x cosθ − y sinθ) dx dy
wherein R(ρ, θ) represents the line integral for a given (ρ, θ), i.e. along the straight line L at position ρ in the θ direction; f̃(x, y) is the preliminary noise-reduced space-time image; and δ(·) indicates that points (x, y) on the straight line L satisfy δ(ρ − x cosθ − y sinθ) = 1, while other points not on the straight line L give δ(ρ − x cosθ − y sinθ) = 0;
And in the R (rho, theta) transformation domain, finding the corresponding angle when the line integral value is maximum, namely the main direction angle of the primary texture.
8. The open channel flow measurement method based on space-time image texture feature extraction according to claim 4, wherein the step of transferring the obtained primary texture main direction angle as a direction parameter input to a steerable direction filter function to obtain a space-time image retaining a specific texture direction angle comprises:
a Gaussian function is used as a kernel function of a steerable filter, and a group of direction-adjustable basis functions of second derivatives of the Gaussian function are designed;
calculating a steering formula and an interpolation function of an adjustable direction filter of the second derivative;
convolving the preliminary noise-reduced space-time image with the combined basis filter bank, multiplying each response by the corresponding interpolation function, and summing the partial results to obtain the space-time image G_2(x, y) retaining the specific texture direction angle.
9. The open channel flow measurement method based on space-time image texture feature extraction according to claim 1, wherein the texture direction angle is detected by adopting a two-dimensional autocorrelation function of the brightness of the space-time image according to the space-time image with the specific texture direction angle reserved, and the optimal texture main direction angle is obtained; comprising the following steps:
solving a two-dimensional autocorrelation function image for the space-time image retaining the specific texture direction angle;
amplifying the central point information of the logarithmic polar coordinates for the two-dimensional autocorrelation function image;
calculating a required direction angle extremum in the logarithmic polar coordinate image;
and obtaining an angle maximum value in the average direction distribution, namely the optimal main direction angle of the texture.
10. An open channel flow measurement device based on space-time image texture feature extraction, characterized by comprising:
the image acquisition module is used for reading the change condition of gray values on the velocimetry line within the corresponding time frame number of the open channel video to be detected and generating an original space-time image;
the image preprocessing module is used for carrying out enhancement and noise reduction preprocessing on the original space-time image to obtain a space-time image retaining a specific texture direction angle;
the main direction angle identification module is used for detecting the texture direction angle by adopting a two-dimensional autocorrelation function of the brightness of the space-time image, according to the space-time image retaining a specific texture direction angle, and obtaining the optimal texture main direction angle;
the flow velocity and flow calculation module is used for calculating the flow velocity and the total flow of the open channel to be measured according to the acquired main direction angle of the texture.
CN202410066349.9A 2024-01-17 2024-01-17 Open channel flow measurement method and device based on space-time image texture feature extraction Pending CN117889923A (en)

Publications (1)

Publication Number Publication Date
CN117889923A true CN117889923A (en) 2024-04-16



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination