CN111914695B - Tidal bore monitoring method based on machine vision - Google Patents
- Publication number: CN111914695B
- Application number: CN202010684533.1A
- Authority
- CN
- China
- Prior art keywords
- image
- tidal
- foreground
- line
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/49—Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C13/00—Surveying specially adapted to open water, e.g. sea, lake, river or canal
- G01C13/002—Measuring the movement of open water
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A10/00—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE at coastal zones; at river basins
- Y02A10/40—Controlling or monitoring, e.g. of flood or hurricane; Forecasting, e.g. risk assessment or mapping
Abstract
The invention discloses a tidal bore monitoring method based on machine vision, which comprises the following steps: collecting a water-surface video sequence with monitoring equipment; performing background elimination on the collected video frames to obtain a foreground image; suppressing foreground ripple noise in the foreground image; detecting the tide line in the noise-suppressed foreground image; and measuring the distance of the tide line and estimating its speed from the detected tide-line information. The method can be used for online tidal bore monitoring and early warning. It offers high program efficiency, marked suppression of background interference, a clearly delineated tide-line profile, high detection accuracy and strong real-time performance, solving a series of problems in the prior art and achieving a better tidal bore monitoring effect.
Description
Technical Field
The invention belongs to the field of intelligent water conservancy, relates to a tidal bore monitoring method, and particularly relates to a tidal bore monitoring method based on machine vision.
Background
A tidal wave deforms further after entering an estuary and, under certain conditions, forms a steep front at which the water level rises abruptly: the tidal bore. The bore is a wave train advancing upstream from the river mouth. When its strength is relatively small the surface does not break, giving an undular bore; as the strength increases the surface breaks and develops into a rolling, breaking bore, i.e. a strong bore. When a strong bore meets the ebb flow it breaks and entrains a large amount of air, and the bore head becomes a milky water-air mixture advancing in a turbulent, even spectacular front. Spectacular as it is, the tidal bore carries safety hazards and inconveniences shoreline activities and tide watchers. Taking the Qiantang River in China as an example, incomplete statistics indicate that since 1998 hundreds of people have been swept away by the Qiantang tide, and many have lost their lives. Repeated tragedies have put the tide hazard on Zhejiang Province's list of problems to be solved first, and to prevent further drownings, government departments and local residents have spontaneously organized "tide callers" to warn away swimmers and people lingering in danger zones. The hydrological department also forecasts the bore according to tidal rules, but because of natural factors such as wind direction and geography, the forecast arrival time carries a large error, and the intensity of the bore cannot be forecast.
Therefore, the safety hazards of the Qiantang bore can only truly be addressed by accurately detecting the bore, calculating its strength and moving speed, judging its danger coefficient, and issuing early-warning information in time. In addition, the strongly aerated bore head has pronounced turbulent fluctuation characteristics, which inevitably affect the circulation structure of the water body, sediment transport, pollutant diffusion, and even the ecological balance of the whole estuary. Tidal bore monitoring therefore has important social significance and scientific research value.
Traditional tidal bore monitoring falls into contact and non-contact types. Contact methods use water-level gauges and current meters, exploiting hydrological signatures of an arriving bore such as the sudden water-level rise, the flow reversing against the river current, and the bore flow velocity. The drawback of contact equipment is that the bore carries large amounts of sediment, which causes large measurement errors, while long-term immersion in tidal water continuously corrodes the equipment and leads to system failure. Non-contact measurement uses shore-mounted equipment and detects the bore through radar, audio or video signals; its advantages are that damage from bore impact and tidal-water corrosion is avoided and that installation and maintenance are relatively convenient. Most existing detection schemes are audio-based: signals from audio sensors installed along the Qiantang riverbank are collected and band-pass filtered, and the energy of the bore frequency band is computed to judge whether the bore has arrived.
However, traditional audio-based monitoring fails when crowd noise is severe; moreover, it is a single-modality method that cannot provide effective information to the next monitoring point, so no prediction function for subsequent points is possible. A video-based method, by contrast, is attractive on cost grounds: taking the Qiantang River as an example, its banks already carry a complete surveillance system, so no additional equipment or investment is required. In addition, a video-based tidal bore monitoring method can track and measure the bore's position and motion through image processing and planar photogrammetry, providing relatively accurate forecast information for the next monitoring point.
The basis of tidal bore detection is moving-target detection on water, for which many detection and tracking algorithms have been proposed at home and abroad. Gloyer et al. performed background modeling with a median method: an image sequence is used for training, the gray-level median of each pixel is taken as the background model, and the frame under test is differenced against the model to decide whether each pixel belongs to a moving target. Wren et al. proposed single-Gaussian background modeling, which assumes each background pixel follows a Gaussian distribution over a period of time and compares the Gaussian model against the pixel under test; the method is simple, computationally light and highly real-time, and works well in some applications, but cannot cope with complex outdoor backgrounds. To address this limitation, Stauffer and Grimson proposed mixture-of-Gaussians modeling, establishing several Gaussian models per pixel to adapt to complex, changing outdoor environments with some robustness to noise; however, each pixel's behavior is uncertain, and a fixed number of Gaussians may be inaccurate. Barnich et al. proposed the non-parametric ViBe algorithm based on a pixel model: a sample set of fixed length is built for each pixel from the gray values of its 8-neighborhood, the input pixel is compared against the samples, a preset threshold decides whether the point is background, and matched pixels randomly update the sample set. Thanks to its random update strategy the algorithm is highly adaptive and performs well in most real scenes.
Video-based tidal bore monitoring mainly uses single or multiple frames to detect the position of the bore head and compute its moving speed. Ji Surang et al. proposed detecting the tide line from a single frame; the method is simple and fast and can extract the tide line against a simple background, but cannot obtain an accurate line in complex environments. Other researchers proposed video-recognition-based detection: the water surface is background-modeled with a Gaussian mixture and the tide line is extracted by differencing the frame under test against the background image; this works well for static backgrounds, but in scenes with large ripples, complex backgrounds and noise from other moving objects, the extracted tide line is inaccurate. Gapeng et al. proposed detecting and tracking the bore with background modeling and a particle filter; the algorithm achieves detection and dynamic tracking, but is highly parameter-dependent and not capable of real-time detection.
Since a surveillance camera's field of view includes not only land but also a complex water surface, and the bore is a non-rigid target, video-based tidal bore monitoring research is difficult. Although experts in moving-target detection have proposed many algorithms, each suits only specific environments, and tidal bore detection requires further research.
Disclosure of Invention
Purpose of the invention: to overcome the defects of the prior art, a tidal bore monitoring method based on machine vision is provided.
The technical scheme is as follows: to achieve the above purpose, the invention provides a tidal bore monitoring method based on machine vision, comprising the following steps:
S1: collecting a water surface video sequence image by using monitoring equipment;
S2: carrying out background elimination processing on the collected water surface video sequence image to obtain a foreground image;
S3: carrying out foreground ripple noise suppression processing on the foreground image;
S4: carrying out tide-line detection on the foreground image subjected to the foreground ripple noise suppression processing in step S3;
S5: measuring the distance of the tide line and estimating its speed based on the tide-line information acquired in step S4.
Further, the video image acquisition in step S1 is specifically: first, a camera is erected on a high pole, aimed in the direction of the incoming tide, and its exterior parameters are measured; then the captured video is decoded and frames are taken at a set sampling interval, and each frame is distortion-corrected using camera intrinsic parameters calibrated in advance in the laboratory, yielding a distortion-free grayscale image sequence on which background elimination is then performed. The exterior orientation parameters are solved by single-image space resection, and the intrinsic parameters and distortion coefficients of the camera are computed by Zhang Zhengyou's calibration method.
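As a rough illustration of the per-frame distortion-correction step, the sketch below undistorts a single pixel coordinate with numpy. The two-coefficient Brown-Conrady radial model, the fixed-point inversion, and the names `undistort_point`, `K`, `dist` are assumptions for illustration; the patent does not specify the distortion model.

```python
import numpy as np

def undistort_point(u, v, K, dist, iters=10):
    """Remove radial distortion from one pixel coordinate by fixed-point
    iteration, assuming a two-coefficient (k1, k2) Brown-Conrady model.

    K    -- 3x3 intrinsic matrix from offline calibration
    dist -- (k1, k2) radial distortion coefficients
    """
    fx, fy = K[0, 0], K[1, 1]          # focal lengths in pixels
    cx, cy = K[0, 2], K[1, 2]          # principal point
    k1, k2 = dist
    xd = (u - cx) / fx                 # normalized distorted coordinates
    yd = (v - cy) / fy
    x, y = xd, yd
    for _ in range(iters):             # invert x_d = x * (1 + k1 r^2 + k2 r^4)
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x * fx + cx, y * fy + cy
```

In practice this would be applied to a whole image via a remapping table; the point form is shown only to make the model explicit.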
Further, the specific process of the background elimination processing in step S2 is as follows:
a1: carrying out background modeling on the acquired gray level image to obtain a foreground image and a background image of the tidal bore image sequence;
a2: carrying out centralization and longitudinal gradient processing on the obtained foreground image;
A3: updating the background model according to the gray values of the image processed in step A2 so as to respond quickly to sudden gray-level changes, obtaining a foreground image in which local background changes are suppressed.
Further, the background modeling in step A1 is specifically: a background difference operation is performed on the distortion-corrected grayscale image using formula (1); pixels satisfying formula (1) are taken as background points and pixels not satisfying it as foreground points, thereby obtaining a foreground image and a background image:
I_{t+1}(x, y) - μ_t(x, y) < k·σ_t(x, y)   (1)
wherein I_{t+1}(x, y) is the gray value at position (x, y) in frame t+1, μ_t(x, y) is the mean gray value at that position in the background model, σ_t(x, y) is the corresponding standard deviation, and k is a constant.
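A minimal numpy sketch of the per-pixel test in formula (1), with a running update of the Gaussian model; the symmetric absolute-value form of the test and the learning rate `alpha` are assumptions not stated in the text.

```python
import numpy as np

def is_background(frame, mu, sigma, k=2.5):
    """Formula (1): a pixel is background when its deviation from the model
    mean stays within k standard deviations (absolute-value form assumed)."""
    return np.abs(frame - mu) < k * sigma

def update_model(frame, mu, sigma, alpha=0.05):
    """Exponential running update of the per-pixel Gaussian mean and std."""
    mu_new = (1 - alpha) * mu + alpha * frame
    var_new = (1 - alpha) * sigma ** 2 + alpha * (frame - mu_new) ** 2
    return mu_new, np.sqrt(var_new)
```

The foreground mask is simply the negation of `is_background`; only background pixels would normally feed `update_model`.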
Further, the centering process in step A2 uses the formula:

I'(t) = I(t) - mean(I(t))   (2)

wherein I(t) is the current frame image, mean(I(t)) is the mean gray value over all pixels of the current frame, and I'(t) is the centered image;
the longitudinal gradient is computed as:

Grad(i) = f(i+1) - f(i), i = 1, ..., row-1   (3)

wherein row is the total number of rows of image pixels, f(i) is the pixel value of row i, and Grad(i) is the gradient value of row i.
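The centering and longitudinal-gradient steps of A2 can be sketched in numpy as follows; the forward-difference direction of the gradient is an assumption, since the original formula survives only as its "wherein" clause.

```python
import numpy as np

def center_image(img):
    """Centering: I'(t) = I(t) - mean(I(t)), removing the global gray level."""
    img = img.astype(np.float64)
    return img - img.mean()

def longitudinal_gradient(img):
    """Row-direction gradient Grad(i) = f(i+1) - f(i), emphasising the
    roughly horizontal tide-line edge against the water surface."""
    return np.diff(img.astype(np.float64), axis=0)
```

`np.diff` along axis 0 yields `row - 1` output rows, matching the index range of formula (3).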
Further, the specific process of step a3 is as follows: firstly, counting the change size of the total gray value between adjacent frames and setting a threshold value, then calculating the change of the total gray value between the frame to be detected and the image of the previous frame before background difference detection, when the change exceeds the threshold value, reinitializing a background model, then detecting, and quickly responding to the sudden change of the gray value to obtain a foreground image containing a tide line and other interference factors in such a way.
Further, the foreground ripple noise suppression in step S3 comprises, in order, water-surface flare elimination and water-surface ripple suppression, specifically:
(1) water-surface flare elimination: a rectangular window is placed on the foreground image with the foreground point at its center; for every foreground point, the sum of the gray values of the 8 neighboring pixels in its 3×3 neighborhood is counted; the gray values over all foreground-point neighborhoods are then averaged, and foreground points below the average gray value are eliminated;
(2) water-surface ripple suppression: for every pixel of the modeled background image, a flicker variable E(x, y) recording the flare condition is created and initialized to 0. If a pixel (x, y) is judged background at one moment and foreground at the next, a is added to E(x, y); in the opposite case, b is subtracted. If two adjacent detections are both foreground, a/2 is added to E(x, y); if both are background, E(x, y) is left unchanged. If E(x, y) exceeds a threshold T, the point is judged to be noise from ripple perturbation or other periodic disturbance and is removed from the foreground map, where a and b are empirically set constants.
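A minimal numpy sketch of the ripple-suppression bookkeeping for E(x, y); the default values of a, b and the threshold T stand in for the empirical constants mentioned above.

```python
import numpy as np

def update_flicker(E, prev_fg, cur_fg, a=4.0, b=1.0):
    """Update the per-pixel flicker score E(x, y) from two successive
    foreground masks, following the four transition rules in the text."""
    E = E.copy()
    E[~prev_fg & cur_fg] += a        # background -> foreground: +a
    E[prev_fg & ~cur_fg] -= b        # foreground -> background: -b
    E[prev_fg & cur_fg] += a / 2     # foreground twice in a row: +a/2
    return E                         # background twice in a row: unchanged

def suppress_ripple(fg, E, T):
    """Remove foreground points whose flicker score exceeds threshold T."""
    return fg & (E <= T)
```

Pixels that flip repeatedly between background and foreground (ripples, flares) accumulate a high score and are culled; a steadily advancing tide front does not.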
Further, the tide-line detection in step S4 is specifically: first, the upper and lower edges of the ripple-suppressed foreground image are extracted, and the foreground points of the upper- and lower-edge images are fitted by least squares to obtain a straight-line fit of the tide line. To account for complex background factors, the fit is iterated: the error of each point against the fitted line is compared, points with large error are judged to be noise and removed from the sample space ψ, and a new sample space ψ' is obtained whose number of sample points is less than or equal to that of ψ. The line is refit with ψ' to obtain a leading-edge map of the bore's motion direction; finally, Hough-transform line detection is applied to this map and a tide-line equation is established.
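The iterative refit from ψ to ψ' described above can be sketched with numpy least squares; the residual threshold and iteration cap are illustrative values.

```python
import numpy as np

def fit_tide_line(points, err_thresh=3.0, max_iter=10):
    """Fit y = A*x + C by least squares, repeatedly discarding points whose
    vertical residual exceeds err_thresh (the psi -> psi' step)."""
    pts = np.asarray(points, dtype=np.float64)
    A = C = 0.0
    for _ in range(max_iter):
        x, y = pts[:, 0], pts[:, 1]
        A, C = np.polyfit(x, y, 1)            # slope and intercept
        resid = np.abs(y - (A * x + C))
        keep = resid <= err_thresh
        if keep.all() or keep.sum() < 2:
            break
        pts = pts[keep]                       # shrink the sample space
    return A, C, pts
```

The returned slope and intercept feed the line equation used in the distance computation of step S5; the Hough step is omitted here.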
Further, the sample space ψ is expressed as ψ = {(x_1, y_1), (x_2, y_2), ..., (x_i, y_i)}, and the values A, B, C and D corresponding to the sample points are computed, wherein A = Σ x_i·y_i, B = Σ x_i, D = Σ y_i.
Further, the specific process of measuring the tide-line distance and estimating the speed in step S5 is as follows:
finding the tide-line endpoints on the foreground map and recording their image coordinates (x_1, y_1), (x_2, y_2); then, based on central perspective projection in photogrammetry, the object point (X, Y, Z), the projection center (X_C, Y_C, Z_C) and the image point (x, y) are collinear, so the collinearity equations given by formula (4) are used to solve the world coordinates (X_1, Y_1), (X_2, Y_2) of the tide-line endpoints:

x - x_o = -(f/s)·[a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]
y - y_o = -(f/s)·[a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]   (4)

wherein s is the pixel size of the image sensor, (x_o, y_o) are the principal-point coordinates of the image, and the rotation matrix composed of the 9 coefficients a_1, ..., c_3 is built from the azimuth angle κ, the pitch angle ω and the roll angle φ of the camera relative to the horizontal plane;

wherein (X_C, Y_C, Z_C, φ, ω, κ) are called the exterior orientation parameters of the camera and are obtained by field calibration of the camera;
the specific implementation method comprises the following steps: with camera as origin of horizontal coordinates, i.e. falseLet X C =0、Y C 0; height Z of camera relative to water surface C Expressed as elevation Z of the camera relative to the water level reference point 0 The sum of the actual water level (tide level) l, i.e. Z C =Z 0 + l; measuring azimuth angle kappa, pitch angle omega and roll angle of camera relative to horizontal plane by adopting attitude sensor
The two-point equation of the tide line is simplified into the general form y = Ax + C, wherein y is the dependent variable and x the independent variable of the line equation; the point-to-line distance formula is then applied:

dis = |A·x_0 + B·y_0 + C| / sqrt(A² + B²)

wherein dis is the sought distance from the camera to the tide line, B = -1, and (x_0, y_0) are the exterior-orientation position coordinates of the camera solved by single-image space resection. The average speed of the tide-line motion is computed by the following formula:
v = (dis - dis0)·fps / (num - num0)

wherein v represents the average speed of the tide-line motion, dis is the tide-line distance computed for the current frame, dis0 is the tide-line distance of the first picture selected during detection, num is the current frame number, num0 is the frame number of that first picture, and fps is the frame rate; the distance from the tide line to the camera is thus obtained, and the average motion speed of the tide line is computed from it.
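The final distance and speed computation can be sketched as follows; the sign convention of the speed (negative while the front approaches the camera) is an assumption, so the magnitude is what matters.

```python
import math

def tide_line_distance(A, C, x0, y0):
    """Distance from the camera ground position (x0, y0) to the fitted
    tide line y = A*x + C, written as A*x - y + C = 0 with B = -1."""
    return abs(A * x0 - y0 + C) / math.hypot(A, -1.0)

def average_speed(dis, dis0, num, num0, fps):
    """Mean tide-front speed over (num - num0) frames at frame rate fps.
    Negative when the distance to the camera is shrinking."""
    return (dis - dis0) * fps / (num - num0)
```

With (x0, y0) = (0, 0) as assumed in the implementation above, the distance reduces to |C| / sqrt(A² + 1).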
The method comprises five links: video image acquisition, background elimination, foreground ripple noise suppression, tide-line detection, and tide-line distance and speed measurement. First, a camera erected on a high pole shoots toward the incoming tide and its exterior parameters are measured. Then the video is decoded and frames are taken at a set sampling interval; each frame is distortion-corrected with laboratory-calibrated intrinsic parameters to obtain a distortion-free grayscale sequence. In the background elimination link, the grayscale sequence is background-modeled with a single-Gaussian algorithm, the modeled background is differenced against the frame under test to obtain a foreground image, the foreground image is centered and processed with a longitudinal gradient, and the background model is updated according to the processed gray values, yielding a foreground image containing the tide line and other interference factors. Next, water-surface flare elimination and ripple suppression based on gray-level statistics and flare flicker characteristics yield a ripple-suppressed foreground image, in which the tide line is then detected and its line equation established. Finally, in the distance-measurement and speed-estimation link, the distance and speed of the tide line are measured by combining the field-measured camera exterior parameters with control-point-free planar photogrammetry.
Beneficial effects: compared with the prior art, the invention has the following advantages:
1. Strong real-time performance. The background modeling adopted by the invention improves on the single-Gaussian background modeling algorithm; compared with mixture-of-Gaussians and ViBe modeling, it has a small computational load and high program efficiency. The machine-vision-based method tracks and measures the bore's position and motion through image processing and planar photogrammetry, providing relatively accurate forecast information for the next monitoring point.
2. Strong stability. Compared with the traditional single-Gaussian background algorithm, the improved algorithm's centering, longitudinal-gradient processing and background-model updating help it adapt to complex outdoor backgrounds and mitigate water-surface ghosting and reflection.
3. High detection accuracy. Water-surface flare elimination and ripple suppression based on gray-level statistics and the flicker characteristics of water-surface flares reduce the interference of noisy foreground points with tide-line extraction; fitting a straight line first and then applying the Hough transform aids precise localization of the tide line and enables bore tracking.
4. Distance measurement and speed estimation. The distance from the tide line to the camera is computed by combining the field-calibrated camera exterior parameters with control-point-free planar photogrammetry, and the average bore speed is computed from that distance.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a flow chart of a conventional single Gaussian background modeling algorithm;
FIG. 3 is a flow chart of a single Gaussian background modeling algorithm modified in the present invention;
FIG. 4 is a hardware diagram of an experimental point measuring dome camera;
FIG. 5 is a graph showing the experimental effect of the method of the present invention.
Detailed Description
The invention is further elucidated with reference to the drawings and the embodiments.
The invention provides a tidal bore monitoring method based on machine vision, which sequentially comprises five links of video image acquisition, background elimination, foreground ripple noise suppression, tidal head line detection, tidal head line distance measurement and speed estimation as shown in figure 1.
This embodiment applies the method of the invention at the observation point of the Qiantang River Haining Park; the specific process is as follows:
(1) a video image acquisition link:
The video image sequence in this embodiment is provided directly by the Qiantang River Haining Park observation point; the hardware of the experimental measurement point is shown in fig. 4. The camera used at the observation point is a Kangweishi dome camera with a lens focal length of 6 mm and a pitch angle ω of 3.5°; after field calibration of the camera, the acquired video image sequence is converted into distortion-free grayscale images.
(2) A background elimination step:
In this embodiment, the conventional single-Gaussian background modeling algorithm flow shown in FIG. 2 is improved into the flow shown in FIG. 3. With reference to FIG. 3, the background elimination link is divided into the following steps:
Firstly, background difference operation is performed on the undistorted gray-scale image using formula (1); pixel points satisfying formula (1) are regarded as background points and pixel points not satisfying it as foreground points, thereby obtaining a foreground image:

|I_(t+1)(x, y) − μ_t(x, y)| < kσ_t(x, y)   (1)

where I_(t+1)(x, y) is the gray value at position (x, y) of frame t+1, μ_t(x, y) is the mean gray value at (x, y) in the background model, σ_t(x, y) is the standard deviation of the gray values at (x, y) in the background model, and k is a constant, typically 2.5–3.0; in this embodiment k = 2.5. A background image and a foreground image are obtained, the background image being shown in FIG. 5(c).
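The classification in formula (1) can be sketched in a few lines of NumPy (the function name and toy values below are illustrative, not part of the patent):

```python
import numpy as np

def single_gaussian_foreground(frame, mu, sigma, k=2.5):
    """Mark pixels violating |I - mu| < k*sigma as foreground (255)."""
    diff = np.abs(frame.astype(np.float64) - mu)
    return np.where(diff < k * sigma, 0, 255).astype(np.uint8)

# Toy model: flat background at gray level 100 with one bright outlier.
mu = np.full((4, 4), 100.0)
sigma = np.full((4, 4), 2.0)
frame = mu.copy()
frame[1, 2] = 160.0                     # a sudden foreground disturbance
mask = single_gaussian_foreground(frame, mu, sigma)
print(mask[1, 2], mask[0, 0])           # 255 0
```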
Secondly, the foreground image obtained by modeling is centered using formula (2):

I′(t) = I(t) − mean(I(t))   (2)

where I(t) is the gray-value matrix of all pixel points of the t-th frame image, mean(I(t)) is the matrix of the average gray value of all pixel points of the t-th frame image, and I′(t) is the gray-value matrix of the centered image; the processing result is shown in FIG. 5(a);
Then, longitudinal gradient processing is applied to the centered image using formula (3):

Grad(i) = f(i + 1) − f(i),  i = 1, …, row − 1   (3)

where row is the total number of pixel rows of the image, f(i) is the row vector of gray values of the i-th row, and Grad(i) is the row vector of gradient values of the i-th row; this yields the centered, gradient-processed foreground image shown in FIG. 5(b).
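Equations (2) and (3) — centering followed by a row-wise gradient — can be sketched as follows (the function name and sample matrix are illustrative):

```python
import numpy as np

def center_and_vertical_gradient(fg):
    """I'(t) = I(t) - mean(I(t)); Grad(i) = f(i+1) - f(i) over pixel rows."""
    fg = fg.astype(np.float64)
    centered = fg - fg.mean()                  # centering, equation (2)
    grad = centered[1:, :] - centered[:-1, :]  # vertical gradient, equation (3)
    return centered, grad

img = np.array([[10.0, 10.0], [20.0, 20.0], [30.0, 30.0]])
centered, grad = center_and_vertical_gradient(img)
print(centered[0, 0], grad[0, 0])  # -10.0 10.0
```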
Updating the background model: firstly, the change of the total gray value between adjacent frames of the foreground image sequence is counted and a suitable threshold is set; then the change of the total gray value between the frame to be detected and the previous frame is calculated. When the change exceeds the threshold, the background model is re-initialized or its update rate α is increased, and detection by background difference operation then responds quickly to the sudden gray-value change. In this way a foreground image containing the tidal head line and other interfering factors is obtained, as shown in FIG. 5(c).
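A possible sketch of this update step (the function name, learning rates, and change threshold are illustrative assumptions, not the patent's values):

```python
import numpy as np

def update_background(mu, sigma, frame, prev_sum, alpha=0.01,
                      boost_alpha=0.2, change_thresh=1e5):
    """Running-average update of the single-Gaussian background model that
    raises the update rate alpha when the total gray value jumps."""
    frame = frame.astype(np.float64)
    total = float(frame.sum())
    # If the total gray value changes suddenly, boost the learning rate.
    a = boost_alpha if abs(total - prev_sum) > change_thresh else alpha
    mu = (1 - a) * mu + a * frame
    sigma = np.sqrt((1 - a) * sigma ** 2 + a * (frame - mu) ** 2)
    return mu, sigma, total

mu, sigma = np.zeros((4, 4)), np.ones((4, 4))
frame = np.full((4, 4), 100.0)
mu, sigma, total = update_background(mu, sigma, frame, prev_sum=frame.sum())
print(mu[0, 0], total)  # 1.0 1600.0
```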
(3) Foreground ripple noise suppression link:
Firstly, water surface flare elimination: a rectangular window is set on the foreground image containing the tidal head line and other interfering factors, with the foreground point at the center of the window. The sum of the gray values of the 8-neighborhood pixels in the 3×3 window around every foreground point is counted, the gray values of all pixels in the 3×3 neighborhoods of all foreground points are averaged, and finally foreground points below the average gray value are eliminated. The processing result of this embodiment is shown in FIG. 5(d).
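A minimal sketch of this flare-elimination step (the 3×3-mean formulation and all names are illustrative; the patent's exact neighborhood bookkeeping may differ):

```python
import numpy as np

def suppress_flare(fg_mask, gray):
    """Drop foreground points whose 3x3 neighborhood mean gray value lies
    below the mean over the neighborhoods of all foreground points."""
    ys, xs = np.nonzero(fg_mask)
    if len(ys) == 0:
        return fg_mask
    pad = np.pad(gray.astype(np.float64), 1, mode='edge')
    h, w = gray.shape
    # 3x3 neighborhood mean for every pixel, via nine shifted copies
    win = sum(pad[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3)) / 9.0
    global_mean = win[ys, xs].mean()
    out = fg_mask.copy()
    out[(fg_mask > 0) & (win < global_mean)] = 0
    return out

# Dark (flare-like) and bright foreground points; the dark one is removed.
gray = np.zeros((5, 5)); gray[:, :2] = 10.0; gray[:, 3:] = 200.0
fg = np.zeros((5, 5), dtype=np.uint8); fg[2, 0] = 255; fg[2, 4] = 255
out = suppress_flare(fg, gray)
print(out[2, 0], out[2, 4])  # 0 255
```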
Secondly, water surface ripple suppression: for each pixel of the modeled background image, a variable E(x, y) recording the flicker condition is established with initial value 0. If a pixel (x, y) is judged a background point at one moment and a foreground point at the next, a is added to E(x, y); if it changes from foreground to background, b is subtracted. If two adjacent detection results are both foreground, a/2 is added to E(x, y); if both are background, E(x, y) is left unchanged. If E(x, y) exceeds a threshold T, the point is removed from the foreground map and judged to be noise caused by ripple disturbances or other periodic disturbances. In this embodiment a = 10, b = 1 and T = 30; the result of the water surface ripple suppression is shown in FIG. 5(e).
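The flicker counter E(x, y) can be sketched as follows (vectorized over boolean foreground masks; names are illustrative):

```python
import numpy as np

def update_flicker(E, prev_fg, cur_fg, a=10, b=1, T=30):
    """Per-pixel flicker counter: +a on background->foreground, -b on
    foreground->background, +a/2 for foreground twice in a row, unchanged
    for background twice in a row; E > T marks ripple noise to remove."""
    bg_to_fg = (~prev_fg) & cur_fg
    fg_to_bg = prev_fg & (~cur_fg)
    fg_fg = prev_fg & cur_fg
    E = E + a * bg_to_fg - b * fg_to_bg + (a / 2) * fg_fg
    cleaned = cur_fg & ~(E > T)      # foreground with flicker noise removed
    return E, cleaned

E = np.zeros((1, 1))
bg = np.array([[False]]); fg = np.array([[True]])
E, cleaned = update_flicker(E, bg, fg)   # background -> foreground: +a
print(E[0, 0], bool(cleaned[0, 0]))      # 10.0 True
E, cleaned = update_flicker(E, fg, fg)   # foreground twice in a row: +a/2
print(E[0, 0])                           # 15.0
```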
(4) Tidal head line detection link:
Firstly, the upper and lower edges of the foreground image after ripple-noise suppression are extracted, and the foreground points in the upper-edge and lower-edge images are fitted by least squares to obtain a straight-line fit of the tidal head line. Considering the complexity of the background environment, the fitting proceeds iteratively: the error of each point against the fitted line is compared, points with large error are judged noise and removed from the sample space Ψ, and a new sample space Ψ′ is obtained by comparing the errors of all points of the old sample space and eliminating those with large errors; the number of sample points in Ψ′ is less than or equal to that in Ψ. A straight line is fitted again using Ψ′ to obtain the leading-edge map of the tidal bore motion direction; finally, straight lines are extracted from this map by the Hough-transform line detection technique and an equation of the tidal head line is established.
The straight-line fitting result of this embodiment is shown in FIG. 5(g); the leading-edge map obtained is shown in FIG. 5(f); the curves corresponding to the foreground points of the leading-edge map after Hough transformation are shown in FIG. 5(h); and the tidal head line detected by the Hough transform is shown in FIG. 5(i).
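The iterative least-squares fit with outlier rejection described above can be sketched as follows (thresholds and names are illustrative; this version discards the worst point per iteration rather than reproducing the patent's exact criterion):

```python
import numpy as np

def robust_line_fit(xs, ys, err_thresh=2.0):
    """Least-squares line fit y = m*x + c that repeatedly discards the
    point with the largest residual until all residuals fall within
    err_thresh, shrinking the sample space step by step."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    m, c = np.polyfit(xs, ys, 1)
    while len(xs) > 2:
        resid = np.abs(ys - (m * xs + c))
        worst = int(resid.argmax())
        if resid[worst] <= err_thresh:
            break
        xs, ys = np.delete(xs, worst), np.delete(ys, worst)
        m, c = np.polyfit(xs, ys, 1)
    return m, c

# Points on y = 2x + 1 plus one gross outlier at x = 5.
m, c = robust_line_fit([0, 1, 2, 3, 4, 5], [1, 3, 5, 7, 9, 40])
print(round(m, 2), round(c, 2))  # 2.0 1.0
```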
(5) Measuring tidal line distance and estimating speed:
After the tidal head line detection link, the endpoints of the tidal head line are searched on the foreground map and their image coordinates (x1, y1), (x2, y2) are recorded. Then, based on central perspective projection imaging in photogrammetry and the principle that the object point (X, Y, Z), the projection center (XC, YC, ZC) and the image point (x, y) are collinear, the collinearity equations (4) are used to solve the world coordinates (X1, Y1), (X2, Y2) of the tidal head line endpoints:

s(x − xo) = −f·[a1(X − XC) + b1(Y − YC) + c1(Z − ZC)] / [a3(X − XC) + b3(Y − YC) + c3(Z − ZC)]
s(y − yo) = −f·[a2(X − XC) + b2(Y − YC) + c2(Z − ZC)] / [a3(X − XC) + b3(Y − YC) + c3(Z − ZC)]   (4)

where s is the pixel size of the image sensor, f is the focal length, (xo, yo) are the image coordinates of the principal point, and the rotation matrix of nine coefficients a1, …, c3 is composed from the azimuth angle κ, the pitch angle ω and the roll angle φ of the camera relative to the horizontal plane;

(XC, YC, ZC, φ, ω, κ) are called the exterior orientation parameters of the camera and are obtained by field calibration of the camera;
the specific implementation method comprises the following steps: with camera as origin of horizontal coordinates, i.e. assuming X C =0、Y C 0; height Z of camera relative to water surface C Expressed as elevation Z of the camera relative to the water level reference point 0 The sum of the actual water level (tide level) l, i.e. Z C =Z 0 -l; measuring azimuth angle kappa, pitch angle omega and roll angle of camera relative to horizontal plane by adopting attitude sensorElevation Z of camera relative to water level reference point in this embodiment 0 10m, an azimuth angle kappa of 0 degrees, a pitch angle omega of 3.5 degrees and a roll angleIs0 deg..
The equation of the straight line through the two endpoints is simplified into the general form:

A·X + B·Y + C = 0

where Y denotes the dependent variable and X the independent variable of the line equation, with A = (Y2 − Y1)/(X2 − X1), B = −1 and C = Y1 − A·X1; the distance formula from a point to a line is then used:

dis = |A·x0 + B·y0 + C| / √(A² + B²)

where dis is the required distance from the camera to the tidal head line and (x0, y0) are the exterior orientation coordinate parameters of the camera solved by single-image space resection; the calculation formula of the average speed of the tidal head line motion is:

v̄ = |dis − dis0| · fps / (num − num0)

where v̄ is the average speed of the tidal head line motion, dis is the tidal head line distance calculated for the current frame, dis0 is the tidal head line distance of the first detected picture, num is the current frame number, num0 is the frame number of the first detected picture, and fps is the frame rate; the distance from the tidal head line to the camera is thus obtained, and the average motion speed of the tidal head line is calculated from it.
In this embodiment, the frame rate is 30 fps (an inter-frame interval of 33.33 ms); the distance from the tidal head line to the camera is obtained, and the average motion speed of the tidal head line is calculated from it.
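The distance and speed computation can be sketched as follows, assuming the endpoint world coordinates have already been solved by the collinearity equations (the function names and sample geometry are illustrative):

```python
import math

def line_through(p1, p2):
    """General form A*x + B*y + C = 0 of the line through two world
    points, with B = -1 (assumes the line is not vertical, x2 != x1)."""
    (x1, y1), (x2, y2) = p1, p2
    A = (y2 - y1) / (x2 - x1)
    return A, -1.0, y1 - A * x1

def point_line_distance(A, B, C, x0, y0):
    """Distance from point (x0, y0) to the line A*x + B*y + C = 0."""
    return abs(A * x0 + B * y0 + C) / math.hypot(A, B)

def mean_speed(dis, dis0, num, num0, fps=30.0):
    """Average tidal-head speed from the change in camera-to-line
    distance between frame num0 and frame num."""
    return abs(dis - dis0) * fps / (num - num0)

# Camera at the origin; tide line 100 m away at frame 0, 90 m at frame 300.
A, B, C = line_through((-50.0, 100.0), (50.0, 100.0))
d = point_line_distance(A, B, C, 0.0, 0.0)
print(d)                                # 100.0
print(mean_speed(90.0, 100.0, 300, 0))  # 1.0
```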
Finally, tidal head line movement measurement data as shown in the following table was obtained:
TABLE 1 tidal head line motion measurement data
The method can therefore be used for online monitoring and early warning of tidal bores.
Claims (10)
1. A tidal bore monitoring method based on machine vision is characterized by comprising the following steps:
S1: collecting water surface video sequence images by using monitoring equipment;
S2: performing background elimination processing on the collected water surface video sequence images to obtain a foreground image;
S3: performing foreground ripple noise suppression processing on the foreground image;
S4: performing tidal head line detection on the foreground image subjected to the foreground ripple noise suppression processing in step S3;
S5: based on the tidal head line information acquired in step S4, measuring the distance of the tidal head line and estimating its speed.
2. The tidal bore monitoring method based on machine vision according to claim 1, wherein the acquisition process of the water surface video sequence images in step S1 is: a camera is erected on a high pole to shoot toward the incoming tide; the camera is calibrated in a laboratory to obtain its internal parameters, which are used to correct image distortion, thereby obtaining an undistorted gray-scale image sequence.
3. The tidal bore monitoring method based on machine vision according to claim 2, wherein the background elimination process in step S2 comprises the following specific steps:
A1: performing background modeling on the acquired gray-scale images to obtain a foreground image and a background image of the tidal bore image sequence;
A2: performing centering and longitudinal gradient processing on the obtained foreground image;
A3: updating the background model according to the gray values of the image processed in step A2 so as to respond quickly to sudden gray-value changes, thereby obtaining a foreground image with local background changes suppressed.
4. The tidal bore monitoring method based on machine vision according to claim 3, wherein the background modeling in step A1 is specifically: performing background difference operation on the distortion-corrected gray-scale image using formula (1); pixel points satisfying formula (1) are regarded as background points and those not satisfying it as foreground points, thereby obtaining a foreground image and a background image:

|I_(t+1)(x, y) − μ_t(x, y)| < kσ_t(x, y)   (1)

where I_(t+1)(x, y) is the gray value at position (x, y) of frame t+1, μ_t(x, y) is the mean gray value at that pixel position in the background model, k is a constant, and σ_t(x, y) is the standard deviation.
5. The tidal bore monitoring method based on machine vision according to claim 3, wherein the formula of the centering process in step A2 is:

I′(t) = I(t) − mean(I(t))

where I(t) is the current frame image, mean(I(t)) is the average of the gray values of all pixel points of the current frame image, and I′(t) is the centered image;
the formula for the longitudinal gradient processing is:

Grad(i) = f(i + 1) − f(i),  i = 1, …, row − 1

where row is the total number of image pixel rows, f(i) is the gray-value row vector of the i-th row, and Grad(i) is the gradient row vector of the i-th row.
6. The tidal bore monitoring method based on machine vision according to claim 3, wherein the specific process of step A3 is as follows: firstly, the change of the total gray value between adjacent frames is counted and a threshold is set; then, before background difference detection, the change of the total gray value between the frame to be detected and the previous frame image is calculated; when the change exceeds the threshold, the background model is re-initialized before detection, so that sudden gray-value changes are responded to quickly, thereby obtaining a foreground image containing the tidal head line and other interfering factors.
7. The tidal bore monitoring method based on machine vision according to claim 1, wherein the foreground ripple noise suppression processing in step S3 sequentially includes two links of water surface flare elimination and water surface ripple suppression, specifically:
(1) water surface flare elimination: a rectangular window is set on the foreground image with the foreground point at its center; the sum of the gray values of the 8-neighborhood pixels in the 3×3 window around every foreground point is counted, the gray values of all pixels in the 3×3 neighborhoods of all foreground points are averaged, and finally foreground points below the average gray value are eliminated;
(2) water surface ripple suppression: for each pixel of the modeled background image, a variable E(x, y) recording the flicker condition is established with initial value 0; if a pixel (x, y) is judged a background point at one moment and a foreground point at the next, a is added to E(x, y), and if it changes from foreground to background, b is subtracted; if two adjacent detection results are both foreground, a/2 is added to E(x, y); if both are background, E(x, y) is not modified; if E(x, y) exceeds a threshold T, the point is removed from the foreground map and judged to be noise caused by ripple disturbances or other periodic disturbances, where a and b are empirically set constants.
8. The tidal bore monitoring method based on machine vision according to claim 1, wherein the specific process of tidal head line detection in step S4 is as follows: firstly, the upper and lower edges of the foreground image after ripple-noise suppression are extracted, and the foreground points in the upper-edge and lower-edge images are fitted by least squares to obtain a straight-line fit of the tidal head line; considering the complexity of the background environment, the error of each point against the fitted line is compared in successive iterations, points with large error being judged noise and eliminated from the sample space Ψ, so that a new sample space Ψ′ is obtained whose number of sample points is less than or equal to that of Ψ; a straight line is fitted again using Ψ′ to obtain a leading-edge map of the tidal bore motion direction; finally, straight lines are extracted from the leading-edge map by the Hough-transform line detection technique and an equation of the tidal head line is established.
10. The tidal bore monitoring method based on machine vision according to claim 9, wherein the specific process of measuring the distance of the tide line and estimating the speed in step S5 is as follows:
finding the endpoints of the tidal head line on the foreground map and recording their image coordinates (x1, y1), (x2, y2); then, based on central perspective projection imaging in photogrammetry and the principle that the object point (X, Y, Z), the projection center (XC, YC, ZC) and the image point (x, y) are collinear, using the collinearity equations given by equation (4) to solve the world coordinates (X1, Y1), (X2, Y2) of the tidal head line endpoints:

s(x − xo) = −f·[a1(X − XC) + b1(Y − YC) + c1(Z − ZC)] / [a3(X − XC) + b3(Y − YC) + c3(Z − ZC)]
s(y − yo) = −f·[a2(X − XC) + b2(Y − YC) + c2(Z − ZC)] / [a3(X − XC) + b3(Y − YC) + c3(Z − ZC)]   (4)

where s is the pixel size of the image sensor, f is the focal length, (xo, yo) are the image coordinates of the principal point, and the rotation matrix of nine coefficients a1, …, c3 is composed from the azimuth angle κ, the pitch angle ω and the roll angle φ of the camera relative to the horizontal plane;

(XC, YC, ZC, φ, ω, κ) are called the exterior orientation parameters of the camera and are obtained through field calibration of the camera;
the equation of the straight line through the two endpoints is simplified into the general form:

A·X + B·Y + C = 0

where Y denotes the dependent variable and X the independent variable of the line equation, with A = (Y2 − Y1)/(X2 − X1), B = −1 and C = Y1 − A·X1; the distance formula from a point to a line is then used:

dis = |A·x0 + B·y0 + C| / √(A² + B²)

where dis is the required distance from the camera to the tidal head line and (x0, y0) are the exterior orientation coordinate parameters of the camera solved by single-image space resection; the calculation formula of the average speed of the tidal head line motion is:

v̄ = |dis − dis0| · fps / (num − num0)

where v̄ is the average speed of the tidal head line motion, dis is the tidal head line distance calculated for the current frame, dis0 is the tidal head line distance of the first detected picture, num is the current frame number, num0 is the frame number of the first detected picture, and fps is the frame rate; the distance from the tidal head line to the camera is thus obtained, and the average motion speed of the tidal head line is calculated from it.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010684533.1A CN111914695B (en) | 2020-07-16 | 2020-07-16 | Tidal bore monitoring method based on machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111914695A CN111914695A (en) | 2020-11-10 |
CN111914695B true CN111914695B (en) | 2022-08-26 |
Family
ID=73280962
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010684533.1A Active CN111914695B (en) | 2020-07-16 | 2020-07-16 | Tidal bore monitoring method based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111914695B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113344953B (en) * | 2021-04-21 | 2023-07-04 | 中国计量大学 | Machine vision tidal bore flow velocity measurement method based on unmanned aerial vehicle |
CN113420693B (en) * | 2021-06-30 | 2022-04-15 | 成都新潮传媒集团有限公司 | Door state detection method and device, and car passenger flow statistical method and equipment |
CN114396921B (en) * | 2021-11-15 | 2023-12-08 | 中国计量大学 | Method for measuring tidal height and propagation speed of Yangtze river on basis of unmanned aerial vehicle |
CN114812514B (en) * | 2022-04-15 | 2023-06-16 | 浙江省水利河口研究院(浙江省海洋规划设计研究院) | Tidal bore tide head line form and tide head propulsion speed on-site measurement method |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108133483A (en) * | 2017-12-22 | 2018-06-08 | 辽宁师范大学 | Red tide method of real-time based on computer vision technique |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9158985B2 (en) * | 2014-03-03 | 2015-10-13 | Xerox Corporation | Method and apparatus for processing image of scene of interest |
Also Published As
Publication number | Publication date |
---|---|
CN111914695A (en) | 2020-11-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||