CN112861856A - Drainage monitoring method based on computer vision and water body monitoring method - Google Patents

Info

Publication number
CN112861856A
CN112861856A
Authority
CN
China
Prior art keywords
optical flow
dimensional
matrix
monitored
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110174609.0A
Other languages
Chinese (zh)
Other versions
CN112861856B (en)
Inventor
甘小皓
钟璞星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huimu Chongqing Technology Co ltd
Original Assignee
Huimu Chongqing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huimu Chongqing Technology Co ltd filed Critical Huimu Chongqing Technology Co ltd
Priority to CN202110174609.0A priority Critical patent/CN112861856B/en
Publication of CN112861856A publication Critical patent/CN112861856A/en
Application granted granted Critical
Publication of CN112861856B publication Critical patent/CN112861856B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/269 Analysis of motion using gradient-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a drainage monitoring method and a water body monitoring method based on computer vision. The drainage monitoring method comprises the following steps: first, a camera image is acquired, and a set of rectangular regions to be monitored, drainBoxes, and a set of preset water flow directions, flowLines, are marked out in the image. The coordinates of each region to be monitored, drainBox, are recorded as [(x1, y1), (x2, y2)], where (x1, y1) and (x2, y2) are the coordinates of a pair of diagonal corners of the region. The coordinates of each preset water flow direction, flowLine, are recorded as [(fx1, fy1), (fx2, fy2)], where (fx1, fy1) and (fx2, fy2) are the coordinates of two points selected in order along the preset flow direction. On the basis of an optical flow method, whether drainage exists is then judged comprehensively from the presence of water flow, the water flow direction, and the proportion of flow vectors pointing in the same direction. The invention has the advantages of maintaining continuous monitoring, deterring illegal discharge, and reducing the input of manpower and material resources.

Description

Drainage monitoring method based on computer vision and water body monitoring method
Technical Field
The invention relates to the technical field of hydrological and hydraulic monitoring, in particular to a drainage monitoring method and a water body monitoring method based on computer vision.
Background
In industries such as water affairs, water conservancy and hydrology, the drainage state needs to be monitored. For example, the drainage state of a factory sewage outfall is monitored, and once drainage is found a corresponding early warning is generated; overflow over dams in scenes such as rivers is monitored (whether the water level tops the dam, with an alarm generated once water flows over it); and water replenishing points are monitored, with water replenished when the point is normal and an alarm generated once the replenishing state becomes abnormal. A determination of the water flow condition within the monitored or observed area is therefore needed. Current drainage monitoring relies mainly on manual patrols, surveillance cameras or unmanned aerial vehicles. These methods require a large amount of manpower and material resources, cannot achieve continuous monitoring, and cannot stop covert illegal discharge.
Disclosure of Invention
Aiming at the defects of the prior art, the technical problem to be solved by the invention is as follows: how to provide a drainage monitoring method and a water body monitoring method based on computer vision that maintain continuous monitoring, prevent covert illegal discharge, and help reduce the input of manpower and material resources.
In order to solve the technical problems, the invention adopts the following technical scheme:
a drainage monitoring method based on computer vision is characterized by comprising the following steps:
s1, firstly, obtaining a camera image, dividing a rectangular region set drainBox to be monitored and a preset water flow direction set flowLines in the image, and recording the coordinates of each region set drainBox to be monitored in the region set to be monitored as [ (x1, y1), (x2, y2) ], wherein (x1, y1) and (x2, y2) are the coordinates of diagonal points on the region set to be monitored; recording the coordinates of each preset water flow direction flowLine in the preset water flow direction sets flowLines as [ (fx1, f y1), (fx2, fy2) ], wherein (fx1, f y1) and (fx2, fy2) are coordinates of two points sequentially selected from the preset water flow direction flowLines;
s2, acquiring continuous M frames during monitoring, and obtaining a BGR digital matrix imgDatas formed by the BGR values of the pictures, wherein the dimension of the matrix imgDatas is N H W D, N is the number of frames, W is the width of the pictures, H is the height of the pictures, and D is a picture channel;
extracting position data of the region to be monitored drainBox from a BGR digital matrix imgDatas to obtain a region to be monitored matrix detFrames, wherein the dimension of the region to be monitored matrix detFrames is N H1W 1D, H1 is the picture height of the region to be monitored drainBox, and W1 is the picture width of the region to be monitored drainBox;
calculating an optical flow change value of each pixel point between two continuous frames by adopting an optical flow method for the first dimension of a matrix detFrames of a region to be monitored, obtaining an optical flow output matrix output Flows of the region to be monitored, wherein the dimension of the optical flow output matrix output Flows is ((N-1) H1) W1D, and converting the optical flow output matrix output Flows into an optical flow two-dimensional matrix output Flows _1 which is expressed as ((N-1) H1W 1) D;
traversing each optical flow data of the optical flow two-dimensional matrix outputflow _1, if the sum of absolute values of two corresponding values (optical flow data of each point is 2 values, and represents a direction vector of an optical flow) in the optical flow two-dimensional matrix outputflow _1 is larger than a set minimum value, retaining the optical flow data, otherwise, deleting the optical flow data, and recording a two-dimensional matrix formed by the retained optical flow data as an optical flow filtering two-dimensional matrix outputflow _ 2;
s3, judging the drainage state: converting the coordinates of flowLine in the preset water flow direction into unit vectors, and recording as flow direction unit vectors baseVec; filtering the projection of each group of vectors in the two-dimensional matrix outputFlows _2 on the flow direction unit vector baseVec to obtain a one-dimensional optical flow projection array vecPros; traversing the optical flow filtering two-dimensional matrix outputFlows _2, and calculating the modular length of each group of vectors to obtain a one-dimensional optical flow modular length array vecNorms;
forming a one-dimensional index array filterWhere by using element indexes larger than a set length in the optical flow modular norm array vecnorm, taking out corresponding element values from corresponding positions in an optical flow projection array vecPros and an optical flow filtering two-dimensional matrix outputflows _2 according to index values in the index array filterWhere, and forming a one-dimensional optical flow projection index array vecPros _1 and a two-dimensional optical flow filtering index matrix outputflows _ 3;
averaging the first dimensionality of the optical flow filtering index matrix outputFlowss _3 to obtain an optical flow average vector outputFlowsmean; calculating the cosine similarity between each vector in the optical flow filtering index matrix outputFlows _3 and the flow unit vector baseVec to form a one-dimensional cosine similarity degree group cosSims;
carrying out dimension expansion operation on the optical flow projection index array vecPros _1 and the cosine similarity array cosSims to obtain a two-dimensional optical flow projection index two-dimensional array vecPros _2 and a cosine similarity two-dimensional array cosSims _1, recombining the optical flow projection index two-dimensional array vecPros _2 and the cosine similarity two-dimensional array cosSims _1 according to a second dimension to obtain a two-dimensional optical flow output array outputData, and recording the total optical flow row number dataNum of the optical flow output array outputData;
reserving data of which the first data is larger than a set vector projection threshold and the second data is larger than a set cosine similarity threshold in each group of data of the optical flow output array outputData to obtain a two-dimensional optical flow output effective array outputData _1 and a corresponding optical flow effective total line number dataNum _ 1; calculating the ratio of the total effective optical flow row number dataNum _1 to the total optical flow row number dataNum, and recording the ratio as an effective data rate posDataRatio;
projecting the light flow average vector outputFlowsMean on the flow direction unit vector baseVec to obtain a light flow average projection value vecProMean; the cosine similarity of the average light flow vector outputFlowMean and the flow unit vector baseVec is obtained to obtain the average cosine similarity cosSimMean of the light flow;
if the average projection value vecPromean of the optical flow, the average cosine similarity cosSimMean of the optical flow and the effective data rate posDataRatio are all larger than the corresponding preset threshold values, it is judged that the area drainBox to be monitored is draining, and if not, it is judged that the area drainBox to be monitored is not draining.
A computer vision-based water body monitoring method is characterized by comprising the computer vision-based drainage monitoring method.
In conclusion, the invention has the advantages of maintaining continuous monitoring, deterring illegal discharge, reducing the input of manpower and material resources, and the like.
Drawings
Fig. 1 and 2 are schematic views of a monitored area.
Fig. 3 is a schematic diagram of the division of the area to be monitored.
Fig. 4 is a photograph of an area of a body of water to be monitored.
Fig. 5 is a schematic view of the structure of the water surface area.
Fig. 6 is a BGR diagram of a water surface area.
FIG. 7 is an HSV map of a water surface area.
Detailed Description
The present invention is described in further detail below with reference to an embodiment of a computer vision-based water body monitoring method.
A water body monitoring method based on computer vision comprises a drainage monitoring method based on computer vision, and the method comprises the following steps:
as shown in fig. 1, there are 2 water discharge ports, the left side discharge port is discharging water, the right side discharge port is not discharging water, and the water discharge state of the two discharge ports is calculated by an algorithm based on computer vision. Before the judgment, the drainage state is defined, firstly, the drainage outlet is determined to have water drainage, and secondly, the water flow direction is consistent with the preset direction. Because it is considered that in certain situations, the drain is partially submerged as shown in fig. 6, it is possible to detect water flow at the drain, but not necessarily the drain is draining. The water flow is detected at the discharging openings at the left side and the right side, but actually the discharging opening at the left side is the water flow of the river channel, and the discharging opening at the right side obviously discharges sewage outwards. Therefore, it is necessary to set the drainage area and possible drainage direction in advance according to the field situation of the drain.
In this embodiment, the drainage monitoring method based on computer vision is described with reference to the scene in fig. 1 (it is equally applicable to the scene in fig. 2). As shown in fig. 3, the scene is provided with 2 outlet detection areas (rectangular mask areas), the arrows being the preset water flow direction of each of the two outlets. The coordinates of the 2 outlets are recorded to obtain drainBoxes: [[59,542,344,929], [966,316,1280,713]], where the coordinates in each box are, in order, the pixel coordinates of the upper left and lower right corners. The preset water flow directions of the two outlets are recorded to obtain flowLines: [[196,612,202,878], [1127,386,1120,680]], where the coordinates in each entry are, in order, the coordinates of the start and end points of the water flow direction arrow. The outlet coordinates and the flow direction coordinates are arranged from left to right and correspond one to one.
Preprocessing data
Taking this video stream as an example, the continuous frame picture data of the video stream is first read; here 10 continuous frames are taken to obtain a numeric matrix composed of the pictures' BGR values, whose dimension is N × H × W × D,
where N is the frame number, W the picture width, H the picture height and D the picture channel (here BGR, where B is blue, G is green and R is red). This numeric matrix is defined as imgDatas; its dimensions are 10 × 1080 × 1920 × 3 and its data structure is as follows:
[[[[56 70 29][57 71 30][57 71 30]...[222 203 176][222 203 176][222 203 176]]
...
[[125 135 165][125 135 165][127 135 165]...[47 56 53][47 56 53][47 56 53]]]
...
[[[28 32 0][31 35 0][29 36 0]...[222 203 176][222 203 176][222 203 176]]
...
[[119 138 165][118 137 164][120 140 165]...[48 57 54][48 57 54][47 56 53]]]]
The preset outlets are traversed; the number of outlets is 2. Taking the first outlet as an example, its data is extracted from the drainBoxes and flowLines obtained above: drainBox: [59,542,344,929], flowLine: [196,612,202,878]. An ROI operation is then performed, extracting the data at the position of the first outlet from the imgDatas obtained in the previous step with the slicing operation [:, 542:929, 59:344, :], giving a 10 × 387 × 285 × 3 matrix, denoted detFrames, whose data structure is shown below:
[[[[143 164 179][162 183 198][139 161 173]...[119 134 103][111 129 90][110 130 83]]
...
[[186 179 212][191 184 217][197 190 223]...[111 137 191][113 139 193][118 145 196]]]
...
[[[126 146 163][138 158 175][121 142 157]...[127 138 122][90 105 77][95 115 73]]
...
[[162 159 175][163 161 181][156 152 177]...[139 153 202][127 138 188][117 128 178]]]]
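The ROI operation above is a plain array slice along the height and width dimensions. A minimal NumPy sketch, with random data standing in for the real camera frames (only the shapes are meaningful):

```python
import numpy as np

# Stand-in for the 10 camera frames (N x H x W x D); real data would come
# from the video stream rather than a random generator.
imgDatas = np.random.randint(0, 256, size=(10, 1080, 1920, 3), dtype=np.uint8)

# drainBox = [59, 542, 344, 929]: (x1, y1) upper-left, (x2, y2) lower-right.
x1, y1, x2, y2 = 59, 542, 344, 929

# ROI operation: slice rows y1:y2 and columns x1:x2 out of every frame.
detFrames = imgDatas[:, y1:y2, x1:x2, :]

print(detFrames.shape)  # (10, 387, 285, 3), matching the matrix in the text
```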
The obtained detFrames is used as input; iterating along the first dimension (of size 10), an optical flow method is applied to successively obtain the optical flow change values of the pixel points between each pair of continuous frames, which are spliced into a matrix to obtain outputFlows. The dimension of outputFlows is (9 × 387) × 285 × 2, i.e. 3483 × 285 × 2, and its data structure is shown below. For the data preprocessing and calculation process of the optical flow method, reference may be made to the conventional calcOpticalFlowFarneback method in OpenCV, whose default parameters may be adopted.
[[[-0.6494114 -0.32461116]...[-0.4463605 -0.16205281]]
...
[[0.8821222 0.00421591]...[0.33376044 -3.0070875]]]
A dimension change operation is performed on the obtained outputFlows, converting it into a two-dimensional matrix outputFlows_1 of dimension 992655 × 2, with the following data structure:
[[-0.6494114 -0.32461116]...[0.33376044 -3.0070875]]
The obtained outputFlows_1 data is then filtered. The logic is to filter out points that are relatively still between two continuous frames in the monitoring area, i.e. points whose optical flow change value is close to 0. The filtering method traverses each optical flow entry and compares the sum of the absolute values of its two components with a minimum value (0.00001 here); if the sum is greater than the minimum, the entry is retained, otherwise it is discarded. For example, for the first entry:
[-0.6494114,-0.32461116],|-0.6494114|+|-0.32461116|=0.97402256>0.00001
this piece of data is retained.
After the above operation, a 975139 × 2 numeric matrix is obtained, denoted outputFlows_2, and kept for later use.
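The near-zero filtering can be vectorized rather than traversed entry by entry. A sketch with a three-row toy matrix standing in for the full 992655 × 2 input:

```python
import numpy as np

# Toy optical-flow rows; the real input is the 992655 x 2 matrix outputFlows_1.
outputFlows_1 = np.array([[-0.6494114, -0.32461116],
                          [ 0.0,        0.0       ],
                          [ 0.33376044, -3.0070875]])

# Keep rows whose |dx| + |dy| exceeds the minimum (static pixels drop out).
minimum = 0.00001
mask = np.abs(outputFlows_1).sum(axis=1) > minimum
outputFlows_2 = outputFlows_1[mask]

print(outputFlows_2.shape)  # (2, 2): the all-zero row is discarded
```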
Drainage determination
The calculation of the drainage state is performed using the outputFlows_2 obtained in the previous step and the preset flowLine (here [196,612,202,878]) as inputs.
First, the flowLine data is converted into a unit vector, denoted baseVec; the result here is [0.02255065, 0.9997457].
Traversing outputFlows_2 and calculating the projection of each vector onto baseVec gives a one-dimensional array of length 975139, denoted vecPros, with the following data structure:
[-0.33917326 -0.32963609 -0.31685013 ... -3.12023619 -3.1601487 -2.99879626]
Traversing outputFlows_2 and calculating the norm of each vector gives a one-dimensional array of length 975139, denoted vecNorms, with the following data structure:
[0.7260217 0.7009283 0.64965796 ... 3.120616 3.162034 3.025553]
going through vecNorms, the index of the vecNorms element greater than 0.005, denoted as filterWhere, is calculated as a one-dimensional array of length 899619, and the data structure is shown below.
[0,1,2,...,975136,975137,975138]
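The unit-vector conversion, projection, norm and index-filtering steps can be sketched as below. The three toy flow vectors are illustrative, but the baseVec result and the first projection value reproduce the numbers quoted in the text:

```python
import numpy as np

# flowLine = [196, 612, 202, 878]: start and end of the preset flow arrow.
fx1, fy1, fx2, fy2 = 196, 612, 202, 878
vec = np.array([fx2 - fx1, fy2 - fy1], dtype=float)
baseVec = vec / np.linalg.norm(vec)   # unit vector, ~[0.02255065, 0.9997457]

# Toy flow vectors standing in for the 975139 x 2 matrix outputFlows_2.
outputFlows_2 = np.array([[-0.6494114, -0.32461116],
                          [ 0.001,      0.002     ],
                          [ 0.33376044, -3.0070875]])

vecPros = outputFlows_2 @ baseVec             # projection onto baseVec
vecNorms = np.linalg.norm(outputFlows_2, axis=1)
filterWhere = np.where(vecNorms > 0.005)[0]   # indices of long-enough vectors

print(np.round(baseVec, 7), filterWhere)  # middle row is filtered out
```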
According to the index values in filterWhere, the corresponding values are taken out of the corresponding positions in vecPros and outputFlows_2, and denoted vecPros_1 and outputFlows_3 respectively. vecPros_1 is a one-dimensional array of length 899619, with the following data structure:
[-0.33917326 -0.32963609 -0.31685013 ... -3.12023619 -3.1601487 -2.99879626]
outputFlows_3 is a two-dimensional array of 899619 × 2, with the following data structure:
[[-0.6494114 -0.32461116]...[0.33376044 -3.0070875]]
The mean along the first dimension of outputFlows_3 is calculated to obtain the mean vector, denoted outputFlowsMean; the result is [0.00773862, 0.12921983]. outputFlowsMean represents the overall water flow vector in the drainage monitoring area and is kept for later use.
The cosine similarity between each vector in outputFlows_3 and the unit vector (baseVec = [0.02255065, 0.9997457]) is calculated, giving a one-dimensional array of length 899619, denoted cosSims, with the following data structure:
[-0.46716683 -0.47028506 -0.48771838 ... -0.9998783 -0.99940376 -0.99115642]
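The mean vector and the per-row cosine similarity can be sketched as follows. The two toy rows are the first and last entries of the data structures quoted in the text, so the first cosine similarity reproduces -0.46716683:

```python
import numpy as np

baseVec = np.array([0.02255065, 0.9997457])

# Toy rows standing in for the 899619 x 2 matrix outputFlows_3.
outputFlows_3 = np.array([[-0.6494114, -0.32461116],
                          [ 0.33376044, -3.0070875]])

# Mean along the first dimension: the aggregate water-flow vector.
outputFlowsMean = outputFlows_3.mean(axis=0)

# Cosine similarity of every flow vector with the flow-direction unit vector.
cosSims = (outputFlows_3 @ baseVec) / np.linalg.norm(outputFlows_3, axis=1)

print(cosSims)  # values in [-1, 1]; -1 opposite to baseVec, 1 aligned with it
```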
A dimension expansion operation is performed on vecPros_1 and cosSims to obtain vecPros_2 and cosSims_1, which are recombined along the second dimension to obtain outputData. vecPros_2 is a two-dimensional array of 899619 × 1, with the following data structure:
[[-0.33917326][-0.32963609][-0.31685013]...[-3.12023619][-3.1601487][-2.99879626]]
cosSims _1 is a two-dimensional array of 899619 × 1, and its data structure is as follows:
[[-0.46716683][-0.47028506][-0.48771838]...[-0.9998783][-0.99940376][-0.99115642]]
the outputData is a two-dimensional array of 899619 × 2, the data structure of which is as follows:
[[-0.33917326 -0.46716683][-0.32963609 -0.47028506]...[-3.1601487 -0.99940376][-2.99879626 -0.99115642]]
the 899619 thus obtained was designated as dataNum.
outputData represents the relation between the series of optical flow data in the selected outlet's monitoring area and the preset direction vector (flowLine). Taking the first row [-0.33917326, -0.46716683] as an example, -0.33917326 is the projection of the corresponding optical flow vector onto the direction vector, and -0.46716683 is the cosine similarity between the optical flow vector and the direction vector. According to the earlier definition, the drainage state of an outlet is determined by 2 factors: whether water is flowing, and whether the flow direction is consistent with the direction set for the outlet. We therefore need to count the rows of outputData that are close to the flowLine (or baseVec) direction (measured by cosine similarity, ranging from -1 to 1, where -1 is the opposite direction and 1 coincides with the direction) and whose projection onto the direction vector is greater than a certain threshold. The proportion of the counted rows among all rows (i.e. the ratio of their number to dataNum) is then used as an important basis for deciding whether water is being drained.
According to the requirement of the previous step, the outputData is filtered with a vector projection threshold of 0.01 and a cosine similarity threshold of 0.5. After filtering, a two-dimensional array of 209853 × 2 is obtained, denoted outputData_1, with the following data structure:
[[0.0155829 0.77702952][0.03556021 0.51000698]...[0.01502273 0.96439865][0.0112475 0.9540855]]
note that 209853 obtained is denoted as dataNum _1, and the ratio of dataNum _1 to dataNum is defined as posDataRatio, which is 0.23326875043768527.
Using the outputFlowsMean obtained above, its projection value onto and cosine similarity with flowLine (or baseVec) are calculated, denoted vecProMean and cosSimMean respectively. Here vecProMean = 0.12936148057543792 and cosSimMean = 0.9993057716074317. These two values also serve as important indexes for measuring the drainage state of the outlet.
By this step, three indexes for measuring the drainage state have been obtained: vecProMean, cosSimMean and posDataRatio. vecProMean represents the water flow state of the whole outlet area: if it is larger than a set threshold, water is flowing (output para_1 = 1), otherwise no water flows (para_1 = 0). cosSimMean represents the water flow direction of the whole outlet area: if it is larger than a set threshold, the flow direction is judged consistent with the direction set for the outlet (para_2 = 1), otherwise it is not (para_2 = 0). posDataRatio represents the proportion, among all pixel points, of those whose flow direction and displacement magnitude conform to the drainage direction; if it is larger than a set threshold, drainage is indicated (para_3 = 1), otherwise not (para_3 = 0), which filters out interference from slight camera shake or surface turbulence. When all three conditions are satisfied (i.e. para_1 × para_2 × para_3 = 1), the final state of the outlet is judged to be draining (output drainCon = 1); otherwise the final state is judged to be not draining (drainCon = 0). Here the thresholds are set according to the statistics of the data as follows: vecProMean (0.01), cosSimMean (0.3), posDataRatio (0.1).
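The final three-condition decision can be sketched as a small function, using the thresholds (0.01, 0.3, 0.1) and the two outlets' index values quoted in the text:

```python
# Thresholds from the text: vecProMean 0.01, cosSimMean 0.3, posDataRatio 0.1.
def drain_state(vecProMean, cosSimMean, posDataRatio,
                th_pro=0.01, th_cos=0.3, th_ratio=0.1):
    para_1 = 1 if vecProMean > th_pro else 0      # water is moving
    para_2 = 1 if cosSimMean > th_cos else 0      # direction matches the preset
    para_3 = 1 if posDataRatio > th_ratio else 0  # enough pixels agree
    drainCon = para_1 * para_2 * para_3           # 1 only if all three hold
    return drainCon

# First outlet (values from the worked example): draining.
print(drain_state(0.12936148057543792, 0.9993057716074317, 0.23326875043768527))  # 1
# Second outlet: not draining.
print(drain_state(-0.009332695940481995, -0.33229440353309536, 0.11856037477102331))  # 0
```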
For this outlet:
vecProMean=0.12936148057543792;cosSimMean=0.9993057716074317;posDataRatio=0.23326875043768527;
The calculation gives: para_1 = 1, para_2 = 1, para_3 = 1, so drainCon = 1, i.e. the final state of the outlet is: draining.
The state of the second outlet can be calculated by the same steps as above, giving the related data:
vecProMean=-0.009332695940481995;cosSimMean=-0.33229440353309536;posDataRatio=0.11856037477102331;
according to the algorithm rule, calculating to obtain: para _1 ═ 0, para _2 ═ 0, para _3 ═ 1; the drainCon is substituted into para _1 × para _2 × para _3, and the drainCon is obtained to be 0 × 1 × 0, that is, the final state of the second discharge port is: and no water is drained.
Clearly, the result of the algorithm conforms to the real situation in the video picture: the left outlet is discharging water and the right outlet is not, as shown in fig. 3.
the method can judge whether the drainage outlet has a drainage stealing phenomenon, but cannot judge and monitor whether the water body under the condition of not drainage stealing is polluted, and for this reason, the water body monitoring method of the embodiment also adopts a water body pollution identification method based on vision, and specifically comprises the following steps:
firstly, establishing a training model
Pictures of polluted water bodies and pictures of unpolluted water bodies are collected and placed under different file paths, covering the two states as comprehensively as possible. The polluted pictures form the polluted water body picture group muddy, and the unpolluted pictures form the unpolluted water body picture group none_muddy.
The picture data under the two folders are read respectively to obtain the BGR data of each image (denoted imgData), a three-dimensional matrix of W × H × D, where W is the image width, H the image height and D the image channel (here BGR, where B is blue, G is green and R is red); its structure is as follows:
[[[82 128 99][82 128 99][82 128 99]…[80 130 100][78 130 100][78 130 100]]
[[96 102 77][126 135 109][179 190 164]…[79 129 99][79 129 99][79 129 99]]]
The BGR channels of the obtained picture data are converted into HSV channels, where H is hue, S is saturation and V is brightness (value), obtaining imgHsv, with the following data structure:
[[[48 94 130][48 95 129][48 95 129]…[48 94 130][48 94 130][48 94 130]]
[[101 39 245][101 39 245][101 39 245]…[48 95 129][48 95 129][48 95 129]]]
converting the three-dimensional matrix of W × H × D of the imgHsv into a two-dimensional matrix form of (W × H) × D to obtain the imgHsv1 with the data structure as follows:
[[48 94 130][48 95 129][48 95 129]…[48 95 129][48 95 129][48 95 129]]
Averaging the imgHsv1 data along the first dimension gives hsvMean, which in this embodiment is: [52.47053731, 80.39821747, 137.40183346].
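The flatten-and-average step for hsvMean can be sketched with a 2 × 2 toy HSV image (a stand-in for the real imgHsv; the value rows are taken from the data structures above):

```python
import numpy as np

# Toy HSV image standing in for imgHsv (W x H x D in the text's convention).
imgHsv = np.array([[[48, 94, 130], [48, 95, 129]],
                   [[101, 39, 245], [48, 95, 129]]], dtype=np.uint8)

# Flatten to (W*H) x D, then average along the first dimension.
imgHsv1 = imgHsv.reshape(-1, 3)
hsvMean = imgHsv1.mean(axis=0)

print(hsvMean)  # one mean value per H, S, V channel
```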
The hsvMean of every picture in muddy and none_muddy is calculated in turn, giving one two-dimensional matrix for each group; rounding the data in the matrices gives muddyHsv and nonMuddyHsv, as shown below.
muddyHsv
[[13 125 201][17 107 167][17 80 118]…[24 117 107][22 111 141][39 60 174]]
nonMuddyHsv
[[42 112 58][39 139 70][49 99 73]…[50 82 135][53 79 138][52 80 137]]
The label of muddyHsv is defined as 1 and the label of nonMuddyHsv as 0, giving the training data trainData and the corresponding labels trainLabel. The data structure is as follows:
trainData:
[[13 125 201][17 107 167][17 80 118]…[50 82 135][53 79 138][52 80 137]]
trainLabel:
[1 1 1…0 0 0]
The training data is shuffled to obtain shuffledData and its corresponding labels shuffledLabel. The data structure is as follows:
shuffledData:
[[55 84 88][18 104 149][49 114 58]…[50 95 84][26 50 147][79 70 116]]
shuffledLabel:
[0 1 0…0 1 0]
Binary classification training is performed on the data using an SVM (support vector machine), obtaining the model file svm.model.
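The patent does not name the SVM implementation behind svm.model; the sketch below uses scikit-learn's SVC as a stand-in, with a handful of toy HSV means in place of the real training set:

```python
import numpy as np
from sklearn.svm import SVC

# Toy per-image HSV means; the real trainData stacks muddyHsv (label 1)
# and nonMuddyHsv (label 0) rows.
trainData = np.array([[13, 125, 201], [17, 107, 167], [24, 117, 107],
                      [42, 112, 58], [50, 82, 135], [52, 80, 137]], dtype=float)
trainLabel = np.array([1, 1, 1, 0, 0, 0])

# Shuffle, then fit a binary SVM classifier.
rng = np.random.default_rng(0)
idx = rng.permutation(len(trainData))
shuffledData, shuffledLabel = trainData[idx], trainLabel[idx]

svm = SVC(kernel="linear")
svm.fit(shuffledData, shuffledLabel)

print(svm.predict([[89, 84, 115]]))  # predicts the contamination label (0 or 1)
```

In practice the trained model would be serialized to disk (the text's svm.model) and reloaded at recognition time.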
Water body pollution identification
As shown in fig. 4, for the photo of the water body area to be monitored, the BGR data of the picture to be recognized is read first to obtain imgData; the water surface area is cut out of the manually set observation area to obtain the mask data, where the white area is the water surface and the black part is the other background, as shown in fig. 5.
The mask data and the original image data imgData are combined with a bitwise AND to obtain maskedImg, as shown in fig. 6. The maskedImg data are then converted to HSV channels to obtain maskedHsv, as shown in fig. 7.
The average H, S and V values of the water-surface pixels are then computed as follows: the per-channel sum maskedHsvSum of maskedHsv is obtained; the number noneZeroNum of pixels whose H, S and V values are not all 0 is counted; and the average hsvMean = maskedHsvSum / noneZeroNum is truncated to integers. In this embodiment maskedHsvSum is [38561854, 36290556, 49701133], noneZeroNum is 429387, and the computed hsvMean is [89, 84, 115].
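The nonzero-pixel averaging can be sketched in pure NumPy (the 2×2 masked image below is an illustrative stand-in for the real maskedHsv; its values are chosen only so the arithmetic is easy to follow):

```python
import numpy as np

# Illustrative 2x2 HSV image after masking: one background pixel zeroed out.
maskedHsv = np.array([[[90, 85, 115], [88, 83, 116]],
                      [[ 0,  0,   0], [89, 84, 114]]])

pixels = maskedHsv.reshape(-1, 3)

# Per-channel sums over the whole masked image.
maskedHsvSum = pixels.sum(axis=0)

# Water-surface pixel count: rows whose H, S, V are not all zero.
noneZeroNum = np.count_nonzero(pixels.any(axis=1))

# Truncated per-channel average, as in the embodiment.
hsvMean = (maskedHsvSum // noneZeroNum).astype(int)
print(hsvMean)  # -> [89 84 115]
```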
The trained model svm.model is loaded and the hsvMean obtained above is used as input to compute the predicted value outputData. Here outputData is 0; according to the earlier definition, 0 is the label of nonMuddyHsv, so the water body is in an uncontaminated state.
The above description is only exemplary of the present invention and should not be taken as limiting, and any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (2)

1. A drainage monitoring method based on computer vision is characterized by comprising the following steps:
S1, firstly, obtaining a camera image, and dividing in the image a set of rectangular regions to be monitored drainBoxes and a set of preset water flow directions flowLines; recording the coordinates of each region to be monitored drainBox in the region set as [(x1, y1), (x2, y2)], wherein (x1, y1) and (x2, y2) are the coordinates of two diagonal corner points of the region to be monitored; recording the coordinates of each preset water flow direction flowLine in the set flowLines as [(fx1, fy1), (fx2, fy2)], wherein (fx1, fy1) and (fx2, fy2) are the coordinates of two points selected in order along the preset water flow direction flowLine;
S2, acquiring M continuous frames during monitoring and forming the BGR digital matrix imgDatas from the BGR values of the pictures, wherein the dimension of imgDatas is N × H × W × D, N being the number of frames, W the picture width, H the picture height, and D the number of picture channels;
extracting the data at the position of the region to be monitored drainBox from the BGR digital matrix imgDatas to obtain the region-to-be-monitored matrix detFrames, whose dimension is N × H1 × W1 × D, H1 being the picture height of the region drainBox and W1 its picture width;
applying an optical flow method along the first dimension of the region-to-be-monitored matrix detFrames to calculate the optical flow change value of each pixel between every two consecutive frames, obtaining the optical flow output matrix outputFlows of the region to be monitored, whose dimension is (N-1) × H1 × W1 × D, and converting outputFlows into the optical flow two-dimensional matrix outputFlows_1 of dimension ((N-1) × H1 × W1) × D;
traversing each optical flow datum of the optical flow two-dimensional matrix outputFlows_1: if the sum of the absolute values of its two components is larger than a set minimum value, the datum is retained, otherwise it is deleted; recording the two-dimensional matrix formed by the retained optical flow data as the optical flow filtering two-dimensional matrix outputFlows_2;
S3, judging the drainage state: converting the coordinates of the preset water flow direction flowLine into a unit vector, recorded as the flow direction unit vector baseVec; projecting each vector of the optical flow filtering two-dimensional matrix outputFlows_2 onto the flow direction unit vector baseVec to obtain the one-dimensional optical flow projection array vecPros; traversing outputFlows_2 and calculating the modulus of each vector to obtain the one-dimensional optical flow modulus array vecNorms;
forming the one-dimensional index array filterWhere from the indices of the elements of the optical flow modulus array vecNorms that are larger than a set length; according to the index values in filterWhere, taking the corresponding elements from the optical flow projection array vecPros and the optical flow filtering two-dimensional matrix outputFlows_2 to form the one-dimensional optical flow projection index array vecPros_1 and the two-dimensional optical flow filtering index matrix outputFlows_3;
averaging the optical flow filtering index matrix outputFlows_3 along its first dimension to obtain the optical flow mean vector outputFlowsMean; calculating the cosine similarity between each vector of outputFlows_3 and the flow direction unit vector baseVec to form the one-dimensional cosine similarity array cosSims;
expanding the dimensions of the optical flow projection index array vecPros_1 and the cosine similarity array cosSims to obtain the two-dimensional arrays vecPros_2 and cosSims_1, concatenating vecPros_2 and cosSims_1 along the second dimension to obtain the two-dimensional optical flow output array outputData, and recording the total number of rows dataNum of outputData;
retaining the rows of the optical flow output array outputData whose first value is larger than a set vector projection threshold and whose second value is larger than a set cosine similarity threshold, obtaining the two-dimensional optical flow output effective array outputData_1 and the corresponding effective total row number dataNum_1; calculating the ratio of dataNum_1 to dataNum and recording it as the effective data rate posDataRatio;
projecting the optical flow mean vector outputFlowsMean onto the flow direction unit vector baseVec to obtain the optical flow mean projection value vecProMean; calculating the cosine similarity between outputFlowsMean and baseVec to obtain the optical flow mean cosine similarity cosSimMean;
if the optical flow mean projection value vecProMean, the optical flow mean cosine similarity cosSimMean and the effective data rate posDataRatio are all larger than their corresponding preset thresholds, judging that the region to be monitored drainBox is draining; otherwise, judging that it is not draining.
2. A computer vision based water body monitoring method, comprising the computer vision based drainage monitoring method of claim 1.
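The filtering and decision logic of claim 1 can be sketched in NumPy as follows. This is a minimal sketch under stated assumptions: the flow field, thresholds, and line endpoints are illustrative, and a synthetic set of flow vectors stands in for the per-pixel optical flow that a dense optical flow method would produce.

```python
import numpy as np

# Flow direction unit vector baseVec from the preset flowLine endpoints.
fx1, fy1, fx2, fy2 = 0.0, 0.0, 1.0, 1.0   # illustrative endpoints
base = np.array([fx2 - fx1, fy2 - fy1])
baseVec = base / np.linalg.norm(base)

# Synthetic flow vectors standing in for outputFlows_2.
outputFlows_2 = np.array([[1.0, 1.1], [0.9, 1.0], [0.05, 0.0], [-1.0, 0.2]])

vecNorms = np.linalg.norm(outputFlows_2, axis=1)  # modulus of each vector
vecPros = outputFlows_2 @ baseVec                 # projection onto baseVec

# Keep vectors longer than a set minimum length.
filterWhere = np.where(vecNorms > 0.5)[0]
vecPros_1 = vecPros[filterWhere]
outputFlows_3 = outputFlows_2[filterWhere]

outputFlowsMean = outputFlows_3.mean(axis=0)
cosSims = (outputFlows_3 @ baseVec) / np.linalg.norm(outputFlows_3, axis=1)

# Rows passing both the projection and cosine-similarity thresholds.
valid = (vecPros_1 > 0.5) & (cosSims > 0.9)
posDataRatio = valid.sum() / len(filterWhere)

# Mean projection and mean cosine similarity (|baseVec| = 1).
vecProMean = outputFlowsMean @ baseVec
cosSimMean = vecProMean / np.linalg.norm(outputFlowsMean)

# Draining iff all three statistics exceed their (illustrative) thresholds.
draining = (vecProMean > 0.5) and (cosSimMean > 0.9) and (posDataRatio > 0.6)
```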
CN202110174609.0A 2021-02-05 2021-02-05 Drainage monitoring method based on computer vision and water body monitoring method Active CN112861856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110174609.0A CN112861856B (en) 2021-02-05 2021-02-05 Drainage monitoring method based on computer vision and water body monitoring method

Publications (2)

Publication Number Publication Date
CN112861856A true CN112861856A (en) 2021-05-28
CN112861856B CN112861856B (en) 2022-05-27

Family

ID=75989336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110174609.0A Active CN112861856B (en) 2021-02-05 2021-02-05 Drainage monitoring method based on computer vision and water body monitoring method

Country Status (1)

Country Link
CN (1) CN112861856B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004076972A1 (en) * 2003-02-27 2004-09-10 Mitsubishi Denki Kabushiki Kaisha Water level measuring system
US20150276445A1 (en) * 2014-04-01 2015-10-01 Saudi Arabian Oil Company Multiphase metering with ultrasonic tomography and vortex shedding
CN107085852A (en) * 2017-04-01 2017-08-22 南昌大学 A kind of river course surface flow field method of testing based on unmanned plane
CN111089625A (en) * 2019-12-13 2020-05-01 国网浙江省电力有限公司紧水滩水力发电厂 Binocular vision-simulated river flow real-time monitoring system and method
CN111798386A (en) * 2020-06-24 2020-10-20 武汉大学 River channel flow velocity measurement method based on edge identification and maximum sequence density estimation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
I-HUI CHEN等: ""Computer vision application programming for settlement monitoring in a drainage tunnel"", 《AUTOMATION IN CONSTRUCTION》 *
PARIVA DOBRIYAL等: ""A review of methods for monitoring streamflow for sustainable water resource management"", 《APPLIED WATER SCIENCE》 *
LI, XIAO: "Research and Application of Video Recognition in an HSE Monitoring Platform", China Master's Theses Full-text Database, Information Science & Technology *

Also Published As

Publication number Publication date
CN112861856B (en) 2022-05-27

Similar Documents

Publication Publication Date Title
DE60304785T2 (en) Method for detecting defective pixels in a digital image sensor
CN107730481B (en) Traffic signal lamp image processing method and traffic signal lamp image processing device
CN110415255B (en) Immunohistochemical pathological image CD3 positive cell nucleus segmentation method and system
CN101630360A (en) Method for identifying license plate in high-definition image
CN110866926B (en) Infrared remote sensing image rapid and fine sea-land segmentation method
CN109359593B (en) Rain and snow environment picture fuzzy monitoring and early warning method based on image local grid
CN105898111B (en) A kind of video defogging method based on spectral clustering
CN108009556B (en) River floating object detection method based on fixed-point image analysis
CN111476712B (en) Trolley grate image shooting and detecting method and system of sintering machine
CN112887693A (en) Image purple border elimination method, equipment and storage medium
CN107346547A (en) Real-time foreground extracting method and device based on monocular platform
CN111815556B (en) Vehicle-mounted fisheye camera self-diagnosis method based on texture extraction and wavelet transformation
CN112861856B (en) Drainage monitoring method based on computer vision and water body monitoring method
CN113326783A (en) Edge early warning method for water conservancy industry
CN115937269A (en) Image pyramid-based image difference region detection method
CN110443142B (en) Deep learning vehicle counting method based on road surface extraction and segmentation
CN112884039B (en) Water body pollution identification method based on computer vision
CN110430400B (en) Ground plane area detection method of binocular movable camera
Ahmed et al. Robust lane departure warning system for adas on highways
CN110533698B (en) Foundation pit construction pile detection control method based on visual detection
CN115620259A (en) Lane line detection method based on traffic off-site law enforcement scene
KR101521269B1 (en) Method for detecting snow or rain on video
Maeda et al. Rough and accurate segmentation of natural color images using fuzzy region-growing algorithm
CN112950484A (en) Method for removing color pollution of photographic image
CN115457485A (en) Drainage monitoring method and system based on 3D convolution and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant