CN116402817A - Sewage aeration quantity detection method based on video analysis - Google Patents


Info

Publication number
CN116402817A
Authority
CN
China
Prior art keywords
aeration
periods
sewage
adjacent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310671503.0A
Other languages
Chinese (zh)
Other versions
CN116402817B (en)
Inventor
黄青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Guoyuan Zhongchuang Electrical Automation Engineering Co ltd
Original Assignee
Qingdao Guoyuan Zhongchuang Electrical Automation Engineering Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Guoyuan Zhongchuang Electrical Automation Engineering Co ltd filed Critical Qingdao Guoyuan Zhongchuang Electrical Automation Engineering Co ltd
Priority to CN202310671503.0A priority Critical patent/CN116402817B/en
Publication of CN116402817A publication Critical patent/CN116402817A/en
Application granted granted Critical
Publication of CN116402817B publication Critical patent/CN116402817B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W 10/00 Technologies for wastewater treatment
    • Y02W 10/10 Biological treatment of water, waste water, or sewage

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The application relates to the field of image data processing, and in particular to a method for detecting the sewage aeration quantity based on video analysis. The detection of the sewage aeration quantity comprises the following steps: first, video data of the sewage aeration process are obtained, the video data comprising video data corresponding to at least two aeration periods; edge density difference amounts corresponding to two adjacent aeration periods are then calculated from the video data of the at least two aeration periods; an approximation degree value is calculated from the edge density difference amounts of the two adjacent aeration periods; and finally, whether the sewage aeration quantity is abnormal is confirmed according to the approximation degree value. Calculating the edge density differences from the video data of at least two aeration periods and deriving an approximation degree value from them to judge whether the sewage aeration quantity is abnormal further improves the detection accuracy and further reduces the detection cost.

Description

Sewage aeration quantity detection method based on video analysis
Technical Field
The application relates to the field of image data processing, in particular to a method for detecting sewage aeration quantity based on video analysis.
Background
The sewage aeration quantity refers to the amount of air introduced into the water body through the aeration pipelines during aerated sewage treatment. Aeration forcibly transfers oxygen from the air into the liquid in order to obtain sufficient dissolved oxygen, prevent the suspended matter in the tank from settling, strengthen the contact between the organic matter in the tank, the microorganisms and the dissolved oxygen, and ensure that the microorganisms in the tank oxidize and decompose the organic matter in the sewage under conditions of sufficient dissolved oxygen.
As the service life of the aeration equipment increases, the aeration quantity of some sewage aeration pipelines changes and no longer meets the expected aeration requirement. To ensure the filtration efficiency of the sewage tank, detecting the aeration quantity is therefore of great significance for improving treatment efficiency and reducing energy consumption in the aerated sewage treatment process.
In the traditional sewage aeration detection method, the aeration quantity is detected by manually observing changes in the water waves so as to find aeration pipelines that do not meet the expected aeration quantity of the sewage pool. However, because water is a fluid, the water waves differ even when the same aeration quantity is applied at the same place, so the detection accuracy is low and the corresponding detection cost is high.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, a device and a computer storage medium for detecting the sewage aeration quantity based on video analysis, so as to improve the accuracy of sewage aeration detection and reduce its cost.
The first aspect of the application provides a method for detecting the sewage aeration quantity based on video analysis, comprising the following steps: obtaining video data of the sewage aeration process, wherein the video data comprise video data corresponding to at least two aeration periods; calculating edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods; calculating an approximation degree value from the edge density difference amounts of the two adjacent aeration periods; and confirming whether the sewage aeration quantity is abnormal according to the approximation degree value.
According to the method, the device and the storage medium for detecting the sewage aeration quantity, video data of the sewage aeration process are obtained first, the video data comprising video data corresponding to at least two aeration periods; edge density difference amounts corresponding to two adjacent aeration periods are then calculated from the video data of the at least two aeration periods; an approximation degree value is calculated from those edge density difference amounts; and finally, whether the sewage aeration quantity is abnormal is confirmed according to the approximation degree value. Replacing the traditional detection method with video data of at least two aeration periods reduces the detection cost, while calculating the edge density differences from that video data and deriving an approximation degree value to judge whether the sewage aeration quantity is abnormal further improves the detection accuracy and further reduces the detection cost.
In one embodiment, calculating the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods specifically includes: acquiring the aeration images with the same time sequence number in the two adjacent aeration periods from the video data of the at least two aeration periods; dividing the aeration images with the same time sequence number into regions of equal area to determine a plurality of sub-aeration images; and calculating the edge density difference amounts corresponding to the two adjacent aeration periods from the sub-aeration images of the two adjacent aeration periods.
In one embodiment, calculating the edge density difference amounts corresponding to the two adjacent aeration periods from the sub-aeration images of the two adjacent aeration periods specifically includes: acquiring the edge density features of the corresponding sub-aeration images of the two adjacent aeration periods; and calculating the differences between the edge density features of the corresponding sub-aeration images of the two adjacent aeration periods to obtain the edge density difference amounts of the two adjacent aeration periods.
In one embodiment, calculating the approximation degree value from the edge density difference amounts of the two adjacent aeration periods specifically includes: calculating the distance weights of the sub-aeration images from the sub-aeration images of the two adjacent aeration periods; and calculating the approximation degree value from the edge density difference amounts and the distance weights.
In one of the embodiments, the two adjacent aeration periods are set to be the t-th aeration period and the (t+1)-th aeration period, the aeration images with the same time sequence number in these two aeration periods are the i-th frame images, the sub-aeration images of the two adjacent aeration periods are the j-region sub-images, and the distance weight of the j-region sub-image is w(j). Calculating the approximation degree value from the edge density difference amounts and the distance weights specifically comprises combining, over all J regions, the edge density differences weighted by the distance weights according to the formula of the filing (reproduced there only as an image), wherein S(t, i) is the approximation degree value between the i-th frame image of the t-th aeration period and the i-th frame image of the (t+1)-th aeration period, D(t, i, j) is the edge density feature of the j-region sub-image of the i-th frame image in the t-th aeration period, D(t+1, i, j) is the edge density feature of the j-region sub-image of the i-th frame image in the (t+1)-th aeration period, |D(t, i, j) − D(t+1, i, j)| is the edge density difference between these two edge density features, J is the number of all local regions, j traverses J, and w(j) is the distance weight of the j-region sub-image.
In one embodiment, confirming whether the sewage aeration quantity is abnormal according to the approximation degree value specifically includes: calculating a standard approximation degree value according to a preset standard calculation method; and comparing the approximation degree value with the standard approximation degree value to confirm whether the sewage aeration quantity is abnormal.
In one embodiment, comparing the approximation degree value with the standard approximation degree value to confirm whether the sewage aeration quantity is abnormal specifically includes: when the difference between the approximation degree value and the standard approximation degree value is larger than a preset threshold, confirming that the sewage aeration quantity is in an abnormal state; and when the difference between the approximation degree value and the standard approximation degree value is smaller than or equal to the preset threshold, confirming that the sewage aeration quantity is in a normal state.
In one embodiment, calculating the standard approximation degree value according to the preset standard calculation method includes: performing a linear fit on the approximation degree values using the least squares method to obtain a linear fitting function; and inputting the time sequence number of an aeration period into the fitting function to obtain the standard approximation degree value corresponding to that aeration period.
A second aspect of the present application provides a device for detecting the sewage aeration quantity, the device comprising: an acquisition module, configured to acquire video data of the sewage aeration process, wherein the video data comprise video data corresponding to at least two aeration periods; a first calculation module, configured to calculate the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods; a second calculation module, configured to calculate an approximation degree value from the edge density difference amounts of the two adjacent aeration periods; and a confirmation module, configured to confirm whether the sewage aeration quantity is abnormal according to the approximation degree value.
A third aspect of the present application provides a computer storage medium storing program instructions that, when executed on an electronic device, cause the electronic device to perform the above-described method for detecting aeration amount of sewage based on video analysis.
Drawings
FIG. 1 is a schematic flow chart of a method for detecting aeration quantity of sewage based on video analysis according to an embodiment of the present application;
FIG. 2 is a schematic illustration of a first sub-process of a method for detecting aeration rate of wastewater based on video analysis according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a second sub-flow of a method for detecting aeration rate of wastewater based on video analysis according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a third sub-flow of a method for detecting aeration rate of wastewater based on video analysis according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a fourth sub-flow of a method for detecting aeration rate of wastewater based on video analysis according to an embodiment of the present application;
FIG. 6 is a schematic block diagram of a sewage aeration rate detection device according to an embodiment of the present application;
Reference numerals: 1. acquisition module; 2. first calculation module; 3. second calculation module; 4. confirmation module.
Detailed Description
In describing embodiments of the present application, words such as "exemplary," "or," "for example," and the like are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to limit the application. It should be understood that, unless otherwise indicated herein, "/" means "or"; for example, A/B may represent A or B. The term "and/or" in this application merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist simultaneously, or B exists alone. "At least one" means one or more. "Plurality" means two or more. For example, "at least one of a, b or c" may represent any of seven cases: a alone; b alone; c alone; a and b; a and c; b and c; or a, b and c.
It should be further noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings are used to distinguish between similar objects and are not necessarily intended to describe a particular order or sequence. The methods disclosed in the embodiments of the present application, or the methods illustrated in the flowcharts, include one or more steps for implementing those methods; the execution order of the steps may be interchanged and some steps may be deleted without departing from the scope of the claims.
The embodiment of the application first provides a method for detecting the sewage aeration quantity based on video analysis. Referring to fig. 1, the method is applied in the field of sewage detection and comprises the following steps.
S101, acquiring video data of the sewage aeration process, wherein the video data comprise video data corresponding to at least two aeration periods.
The video data of the sewage aeration process are video sequence data obtained by filming the aeration process with a camera acquisition device. After the camera acquires the video data, the real-time video, i.e. a real-time RGB image sequence captured by the camera, is transmitted to a computer server over WiFi or a wired network for processing. An aeration period is the period of time during which the aeration equipment performs one aeration action on the sewage; since the start and stop times of the aeration equipment are fixed, every aeration period has the same duration, and complete aeration of the sewage requires several aeration periods.
Further, the start time of each aeration period is the moment at which the aeration equipment starts an aeration pass, and its end time is the moment at which the aeration equipment starts the next aeration pass.
That the video data include video data corresponding to at least two aeration periods means that, in this embodiment, the sewage is aerated for at least two periods; after the video data of the whole aeration process are obtained, the video data are identified and divided so that each aeration period has its own corresponding video data.
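As a concrete illustration of step S101, the sketch below (an assumption of this description, not part of the publication) groups the recorded frames into per-period frame lists. It assumes the recording starts at the beginning of the first aeration period, that the period duration and the number of periods are known, and that OpenCV is used to read the video.

```python
import cv2

def split_into_periods(video_path, period_seconds, num_periods):
    """Group the frames of an aeration video by aeration period.

    Assumes the recording starts at the beginning of the first aeration
    period and that every aeration period has the same fixed duration.
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    frames_per_period = int(round(fps * period_seconds))

    periods = []   # periods[t] is the list of frames of aeration period t
    current = []
    while len(periods) < num_periods:
        ok, frame = cap.read()
        if not ok:
            break
        current.append(frame)
        if len(current) == frames_per_period:
            periods.append(current)
            current = []
    cap.release()
    return periods
```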
S102, calculating the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods.
Here, two adjacent aeration periods are two aeration periods that are adjacent in time sequence among the at least two aeration periods applied to the sewage. For example, if ten aeration periods are performed during the aeration of the sewage, the first and second aeration periods are two adjacent aeration periods, the third and fourth aeration periods are two adjacent aeration periods, and so on.
The edge density difference amount characterizes the difference between the aeration conditions on the sewage surface in two adjacent aeration periods. When the aeration equipment aerates the sewage, bubbles and the water waves they cause appear on the sewage surface; the video data record these changes, and the edge density difference amount quantifies them.
S103, calculating an approximation degree value from the edge density difference amounts of the two adjacent aeration periods.
The approximation degree value is a value obtained by further processing the edge density difference amounts of two adjacent aeration periods and can be used to characterize the aeration condition. Since aerating the sewage involves several aeration periods and each approximation degree value corresponds to one pair of adjacent aeration periods, the complete aeration process yields several approximation degree values; in other words, the approximation degree values usually form a sequence containing several values. Understandably, the more aeration periods are performed, the more approximation degree values are obtained, i.e. the longer this sequence is, and the more accurately the sewage aeration result can be judged from it.
It should be understood that the edge density difference amount is not calculated for only one pair of adjacent aeration periods; it may be calculated for several pairs. In other words, the processing may be applied to a single pair of aeration periods or polled over all adjacent pairs among several aeration periods. Processing a single pair means, for example, calculating only the edge density difference amount between the first and second aeration periods. Polling means calculating the edge density difference amount between the current aeration period and the next one and then moving on, i.e. after the edge density difference amount of the first and second aeration periods is calculated, the calculation proceeds to the second and third aeration periods, and so on, until the edge density difference amounts of all adjacent pairs have been obtained.
S104, confirming whether the sewage aeration quantity is abnormal according to the approximation degree value.
After the approximation degree value is obtained, it is further judged and compared according to a preset judging rule to confirm whether the sewage aeration quantity is abnormal. For example, the comparison yields either a large result or a small result: a large result corresponds to an abnormal sewage aeration quantity, and a small result corresponds to a normal aeration quantity.
According to the method, the device and the storage medium for detecting the sewage aeration quantity, video data of the sewage aeration process are obtained first, the video data comprising video data corresponding to at least two aeration periods; edge density difference amounts corresponding to two adjacent aeration periods are then calculated from the video data of the at least two aeration periods; an approximation degree value is calculated from those edge density difference amounts; and finally, whether the sewage aeration quantity is abnormal is confirmed according to the approximation degree value. Replacing the traditional detection method with video data of at least two aeration periods reduces the detection cost, while calculating the edge density differences from that video data and deriving an approximation degree value to judge whether the sewage aeration quantity is abnormal further improves the detection accuracy and further reduces the detection cost.
In one embodiment of the present application, and referring to fig. 2, step S102, calculating the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods, specifically includes:
S201, acquiring the aeration images with the same time sequence number in the two adjacent aeration periods from the video data of the at least two aeration periods.
Each aeration period corresponds to its own video data, which comprise aeration images of several time sequence numbers, i.e. several frames of aeration images. Because every aeration period has the same duration, each aeration period contains the same number of time sequence numbers, i.e. the same number of frames of aeration images.
Further, aeration images "with the same time sequence number" in two adjacent aeration periods are aeration images whose time sequence numbers within their respective aeration periods are the same. For example, suppose there are two aeration periods, a first and a second, each containing ten aeration images: the first-time-sequence aeration image of the first aeration period corresponds to the first-time-sequence aeration image of the second aeration period, the second-time-sequence aeration image of the first period corresponds to the second-time-sequence aeration image of the second period, and so on.
S202, dividing each of the aeration images with the same time sequence number into regions of equal area to determine a plurality of sub-aeration images.
Dividing the aeration images with the same time sequence number into regions of equal area means partitioning each aeration image into equal-area regions, for example into a 10×10 grid, giving a total of 100 equal-area regions. Further, the plurality of sub-aeration images are the equal-area regions obtained by this division.
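A sketch of the equal-area division is given below; the 10×10 grid follows the example above, and truncating frames whose size is not divisible by the grid is an assumption of this sketch.

```python
def split_into_subimages(image, rows=10, cols=10):
    """Divide one aeration image into an equal-area grid of sub-aeration images."""
    h, w = image.shape[:2]
    sh, sw = h // rows, w // cols   # size of each equal-area region
    subimages = []
    for r in range(rows):
        for c in range(cols):
            subimages.append(image[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw])
    return subimages                # length rows * cols; j indexes this list
```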
S203, calculating the edge density difference amounts corresponding to the two adjacent aeration periods from the sub-aeration images of the two adjacent aeration periods.
The corresponding sub-aeration images of two adjacent aeration periods are the sub-aeration images that occupy the same region in the two adjacent aeration periods. For example, if the first sub-region image is selected in the first aeration period, its counterpart is the first sub-region image of the second aeration period. The corresponding sub-aeration images of the two adjacent aeration periods are selected, and the edge density difference amounts of the two adjacent aeration periods are calculated from them.
In this embodiment, in theory, if the aeration quantities of two adjacent aeration periods are close, the bubble quantity and bubble distribution generated by aeration in the two periods are close as well; but because the water keeps changing form as a fluid, the bubble quantity and distribution can never be absolutely identical. Detecting the sewage aeration quantity on local sub-aeration images with the same time sequence number reduces the inaccuracy caused by the fact that water flow prevents the bubble quantity and distribution from being completely consistent, and thus further improves the detection accuracy.
In one embodiment of the present application, and referring to fig. 3, step S203, calculating the edge density difference amounts corresponding to the two adjacent aeration periods from the sub-aeration images of the two adjacent aeration periods, specifically includes:
S301, acquiring the edge density features of the corresponding sub-aeration images of the two adjacent aeration periods.
The edge density feature is obtained by performing Canny edge detection on a sub-aeration image and taking the ratio of the number of non-zero pixels in the edge detection result to the total number of pixels of the sub-aeration image.
Specifically, Canny edge detection can be divided into four steps:
1. Gaussian blur:
Image denoising is the first step of edge detection; removing noise points prevents them from interfering with edge detection. In a typical denoising convolution, the value of the centre point is determined by the mean of the surrounding pixels; taking the 8-neighbourhood of a 3×3 kernel as an example, the centre value is the mean of the surrounding coordinates and the centre itself. Determining the centre by a plain mean, however, gives every surrounding point the same weight, which is clearly inappropriate once the kernel becomes large, because the points farthest from the centre should not weigh as much as the nearest ones. This is where a Gaussian kernel is needed: Gaussian blur assigns the weights of the points at different positions in the kernel according to a Gaussian (normal) distribution. First, a 3×3 Gaussian kernel is generated by substituting the kernel coordinates into the Gaussian formula, which yields weights with a Gaussian profile. Second, because the weights of all points of the kernel must sum to 1, each weight is divided by the sum of all weights. Finally, the completed Gaussian kernel is used to perform the denoising convolution on the image.
2. Image gradient calculation:
To perform edge detection, image gradient information is needed: edges are determined from the gradient magnitude and gradient direction of the image, which are usually computed with the Sobel operator.
3. Non-maximum suppression:
After the gradient magnitude and gradient direction have been obtained, non-maximum suppression is applied along the image edges using them; because the gradient direction is perpendicular to the edge direction, non-maximum suppression effectively removes a large proportion of non-edge points.
4. Double-threshold boundary tracking:
Double-threshold boundary tracking has two steps: 1) a strong threshold and a weak threshold are chosen; points whose gradient magnitude is below the weak threshold are set to 0, and points above the strong threshold are kept and marked as 255; 2) for points whose gradient magnitude is above the weak threshold but below the strong threshold, the 8-neighbourhood is examined: if any neighbour exceeds the strong threshold, the point is kept and set to 255; otherwise it is discarded and set to 0.
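A minimal sketch of the edge density feature of step S301 follows, using OpenCV's built-in Canny detector, which performs the four steps above internally; the blur kernel size and the two thresholds are illustrative assumptions, not values from the publication.

```python
import cv2
import numpy as np

def edge_density_feature(sub_image, low_thresh=50, high_thresh=150):
    """Edge density feature of one sub-aeration image: the ratio of edge pixels
    found by Canny detection to the total number of pixels in the region."""
    gray = cv2.cvtColor(sub_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (3, 3), 0)          # step 1: Gaussian blur
    edges = cv2.Canny(blurred, low_thresh, high_thresh)  # steps 2-4 inside cv2.Canny
    return np.count_nonzero(edges) / edges.size
```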
S302, calculating the differences between the edge density features of the corresponding sub-aeration images of the two adjacent aeration periods to obtain the edge density difference amounts of the two adjacent aeration periods.
After the edge density features of the corresponding sub-aeration images of the two adjacent aeration periods have been acquired, the difference between them is calculated to obtain the edge density difference amount of the two adjacent aeration periods.
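Continuing the sketch, the per-region edge density differences between the i-th frame images of two adjacent aeration periods could be assembled as follows; edge_density_feature and split_into_subimages are the hypothetical helpers sketched above.

```python
def edge_density_differences(frame_t, frame_t1, rows=10, cols=10):
    """Absolute edge density differences, region by region, between the i-th
    frame image of aeration period t and that of aeration period t+1."""
    feats_t = [edge_density_feature(s) for s in split_into_subimages(frame_t, rows, cols)]
    feats_t1 = [edge_density_feature(s) for s in split_into_subimages(frame_t1, rows, cols)]
    return [abs(a - b) for a, b in zip(feats_t, feats_t1)]
```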
In one embodiment of the present application, and referring to fig. 4, step S103, calculating the approximation degree value from the edge density difference amounts of the two adjacent aeration periods, specifically includes:
S401, calculating the distance weights of the sub-aeration images from the sub-aeration images of the two adjacent aeration periods;
Because the water flow is unstable at positions far from the aeration pipeline, larger differences can arise there even between two aeration periods with the same aeration quantity, so the edge density difference amounts are weighted; this weight, the distance weight, is obtained from the density feature changes within each aeration period. The density features of local regions closer to a pipeline matter more: the sewage pipelines are fixed, and when gas is fed into an aeration pipeline, bubbles are generated preferentially in the regions near the pipeline and change the water surface there first. Therefore, the amount of change of each pixel between adjacent frames within the same aeration period is acquired, the changes of each pixel are accumulated from the start to the stop of the aeration equipment within that period, the accumulated values are normalized, and the normalized accumulated change of each pixel is taken as that pixel's distance weight relative to the aeration pipeline. The mean of the accumulated change values of all pixels in a sub-aeration image is the distance weight of that sub-aeration image; the smaller the value, the farther the region is from the air outlet.
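A sketch of the distance weights as just described follows; the use of absolute grayscale frame differences and min-max normalization is an assumption of this sketch.

```python
import cv2
import numpy as np

def distance_weights(period_frames, rows=10, cols=10):
    """Distance weight of every grid region for one aeration period.

    Accumulates the per-pixel change between adjacent frames of the period,
    normalizes the accumulated map, and averages it per region; smaller
    weights correspond to regions farther from the aeration outlet.
    """
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float32) for f in period_frames]
    acc = np.zeros_like(grays[0])
    for prev, curr in zip(grays[:-1], grays[1:]):
        acc += np.abs(curr - prev)                  # change between adjacent frames
    acc = (acc - acc.min()) / (acc.max() - acc.min() + 1e-9)

    h, w = acc.shape
    sh, sw = h // rows, w // cols
    weights = []
    for r in range(rows):
        for c in range(cols):
            weights.append(float(acc[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].mean()))
    return weights                                  # weights[j] is w(j) for region j
```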
S402, calculating the approximation degree value from the edge density difference amounts and the distance weights.
Here, the two adjacent aeration periods are set to be the t-th aeration period and the (t+1)-th aeration period, the aeration images with the same time sequence number in these two aeration periods are the i-th frame images, the sub-aeration images of the two adjacent aeration periods are the j-region sub-images, and the distance weight of the j-region sub-image is w(j). Calculating the approximation degree value from the edge density difference amounts and the distance weights specifically comprises combining, over all J regions, the edge density differences weighted by the distance weights according to the formula of the filing (reproduced there only as an image), wherein S(t, i) is the approximation degree value between the i-th frame image of the t-th aeration period and the i-th frame image of the (t+1)-th aeration period, D(t, i, j) is the edge density feature of the j-region sub-image of the i-th frame image in the t-th aeration period, D(t+1, i, j) is the edge density feature of the j-region sub-image of the i-th frame image in the (t+1)-th aeration period, |D(t, i, j) − D(t+1, i, j)| is the edge density difference between these two edge density features, J is the number of all local regions, j traverses J, and w(j) is the distance weight of the j-region sub-image.
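The closed form of this combination is not recoverable from the text. Under the assumption that the approximation degree value is the distance-weighted sum of the per-region edge density differences, which is consistent with the explanation below where a larger value signals a possible abnormality, a minimal sketch is:

```python
def approximation_value(diffs, weights):
    """Assumed form: S(t, i) = sum over j of w(j) * |D(t, i, j) - D(t+1, i, j)|."""
    return sum(w * d for w, d in zip(weights, diffs))
```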
It will be appreciated that, within the t-th aeration period, the larger the edge density feature D(t, i, j) of the j-region sub-image of the i-th frame image is, the more edge information that sub-image contains, i.e. the more water waves there are and the larger the aeration quantity is. When the edge density difference |D(t, i, j) − D(t+1, i, j)| is larger, the edge density features of the j-region sub-images of the i-th frame images in the t-th and (t+1)-th aeration periods differ more, which shows that the water ripples produced by the aeration quantities of the two consecutive periods differ more.
It can be understood that J represents the number of all local regions and j traverses J, so each frame image corresponds to J per-region values; because the regions are divided at equal spacing, the J values of each frame image can be arranged as a square matrix. A total approximation degree value for whole frame images between the two aeration periods is then obtained with a DTW algorithm; the larger this total approximation degree value between the two periods, the more likely the sewage aeration quantity is abnormal. When the total approximation degree values of adjacent periods are obtained, the aeration periods are polled by iterative assignment: for example, the first pair of adjacent aeration periods consists of the first and second aeration periods, and the first period of the second pair is assigned as the second period of the first pair. In this way a sequence of approximation degree values between aeration periods that are adjacent in time is obtained for the current sewage pool, the sewage aeration quantity is detected according to this sequence, and the detection accuracy is further improved.
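The publication names the DTW algorithm but only outlines the alignment it performs. A sketch under the assumption that the frames of the two aeration periods are treated as two sequences of per-region edge density feature vectors, with the distance-weighted difference as the local cost, is:

```python
import numpy as np

def dtw_total_value(features_t, features_t1, weights):
    """Assumed aggregation: total approximation degree value between two aeration
    periods via DTW over their frame-wise region feature vectors (length J each);
    the local cost is the distance-weighted sum of absolute feature differences."""
    w = np.asarray(weights)
    a = [np.asarray(f) for f in features_t]
    b = [np.asarray(f) for f in features_t1]

    def cost(x, y):
        return float(np.sum(w * np.abs(x - y)))

    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = cost(a[i - 1], b[j - 1]) + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```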
In one embodiment of the present application, referring to fig. 5, step S104, confirming whether the sewage aeration quantity is abnormal according to the approximation degree value, specifically includes the following steps:
S501, calculating a standard approximation degree value according to a preset standard calculation method;
S502, comparing the approximation degree value with the standard approximation degree value to confirm whether the sewage aeration quantity is abnormal.
After the sequence of approximation degree values between time-adjacent aeration periods of the current sewage pool has been obtained, the current sewage aeration quantity is abnormal if the approximation degree value of a new aeration period deviates considerably from the other values in the sequence.
The approximation degree values are linearly fitted by the least squares method to obtain a linear fitting function;
and the time sequence number of an aeration period is input into the fitting function to obtain the standard approximation degree value corresponding to that aeration period.
During sewage treatment, factors such as the water density and the water volume change over time, so the values in the approximation degree value sequence are always changing. To decide whether the approximation degree value of a new aeration period is normal, this scheme performs a linear least squares fit on the approximation degree value sequence to obtain a linear fitting function. A linear fit is used because the change of factors such as water density and water volume over time is gradual, so the values in the sequence also evolve gradually. The time sequence number of an aeration period is its serial number in the approximation degree value sequence, numbered from 1 onward in chronological order.
After the fitting function is obtained, the time sequence number of the new aeration period is substituted into it to obtain the standard approximation degree value of the new aeration period, and the approximation degree value is then compared with the standard approximation degree value to confirm whether the sewage aeration quantity is abnormal. Specifically, comparing the approximation degree value with the standard approximation degree value to confirm whether the sewage aeration quantity is abnormal includes: when the difference between the approximation degree value and the standard approximation degree value is larger than a preset threshold, confirming that the sewage aeration quantity is in an abnormal state; and when the difference is smaller than or equal to the preset threshold, confirming that the sewage aeration quantity is in a normal state.
Specifically, the preset threshold is an empirical value and can be adjusted to the specific implementation scenario; it is not particularly limited here.
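A sketch of steps S501 and S502 as described above follows; numpy.polyfit supplies the least-squares linear fit, and the threshold shown is only an illustrative placeholder for the empirical preset threshold.

```python
import numpy as np

def is_aeration_abnormal(value_sequence, new_value, threshold=0.1):
    """Fit the historical approximation degree values with a least-squares line,
    predict the standard approximation degree value for the next time sequence
    number, and flag the new value as abnormal when it deviates from that
    prediction by more than the preset threshold."""
    t = np.arange(1, len(value_sequence) + 1)            # time sequence numbers 1, 2, ...
    slope, intercept = np.polyfit(t, value_sequence, 1)  # linear least-squares fit
    standard_value = slope * (len(value_sequence) + 1) + intercept
    return abs(new_value - standard_value) > threshold
```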
According to the method, the device and the storage medium for detecting the sewage aeration quantity, video data of the sewage aeration process are obtained first, the video data comprising video data corresponding to at least two aeration periods; edge density difference amounts corresponding to two adjacent aeration periods are then calculated from the video data of the at least two aeration periods; an approximation degree value is calculated from those edge density difference amounts; and finally, whether the sewage aeration quantity is abnormal is confirmed according to the approximation degree value. Replacing the traditional detection method with video data of at least two aeration periods reduces the detection cost, while calculating the edge density differences from that video data and deriving an approximation degree value to judge whether the sewage aeration quantity is abnormal further improves the detection accuracy and further reduces the detection cost.
Referring to fig. 6, the embodiment of the present application further provides a device for detecting the sewage aeration quantity. The device comprises a plurality of functional sub-modules consisting of program code segments; according to the functions it performs, the device can be divided into at least an acquisition module 1, a first calculation module 2, a second calculation module 3 and a confirmation module 4, whose functions are as follows:
the acquisition module 1 is configured to acquire video data of the sewage aeration process, wherein the video data comprise video data corresponding to at least two aeration periods;
the first calculation module 2 is configured to calculate the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods;
the second calculation module 3 is configured to calculate an approximation degree value from the edge density difference amounts of the two adjacent aeration periods;
and the confirmation module 4 is configured to confirm whether the sewage aeration quantity is abnormal according to the approximation degree value.
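A minimal sketch of how the four functional sub-modules could be wired together is given below; the class and function names are assumptions of this description rather than part of the publication.

```python
class SewageAerationDetector:
    """Hypothetical wiring of the four functional sub-modules."""

    def __init__(self, acquire, compute_differences, compute_approximation, confirm):
        self.acquire = acquire                              # acquisition module 1
        self.compute_differences = compute_differences      # first calculation module 2
        self.compute_approximation = compute_approximation  # second calculation module 3
        self.confirm = confirm                              # confirmation module 4

    def detect(self, video_path):
        periods = self.acquire(video_path)
        diffs = self.compute_differences(periods)
        values = self.compute_approximation(diffs)
        return self.confirm(values)
```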
The embodiment of the application also provides a computer storage medium. The computer storage medium stores program instructions that, when executed on the electronic device, cause the electronic device to perform the method for detecting aeration of sewage based on video analysis as described above.
The implementation principles of the sewage aeration rate detection device and the computer storage medium provided in the embodiments of the present application may refer to the related description in the above-mentioned sewage aeration rate detection method based on video analysis, and are not described herein again.
Embodiments of the present disclosure provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described method for detecting sewage aeration based on video analysis.
The disclosed embodiments provide a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above-described method for detecting sewage aeration based on video analysis.
The computer readable storage medium may be a transitory computer readable storage medium or a non-transitory computer readable storage medium.
Embodiments of the present disclosure may be embodied in a software product stored on a storage medium, including one or more instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of a method according to embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, including media capable of storing program code such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, or a transitory storage medium.
The above description and the drawings illustrate embodiments of the disclosure sufficiently to enable those skilled in the art to practice them. Other embodiments may involve structural, logical, electrical, process, and other changes. The embodiments represent only possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in, or substituted for, those of others. Moreover, the terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, when used in this application, the terms "comprises," "comprising," and/or "includes," and variations thereof, mean that the stated features, integers, steps, operations, elements, and/or components are present, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising one …" does not exclude the presence of other like elements in a process, method or apparatus comprising such an element. In this context, each embodiment may be described with emphasis on its differences from the other embodiments, and the same or similar parts of the various embodiments may be referred to one another. For the methods, products, etc. disclosed in the embodiments, where they correspond to the method sections disclosed in the embodiments, the description of those method sections may be referred to.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. The skilled artisan may use different methods for each particular application to achieve the described functionality, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the embodiments disclosed herein, the disclosed methods, articles of manufacture (including but not limited to devices, apparatuses, etc.) may be practiced in other ways. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the units may be merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form. The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to implement the present embodiment. In addition, each functional unit in the embodiments of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than that disclosed in the description, and sometimes no specific order exists between different operations or steps. For example, two consecutive operations or steps may actually be performed substantially in parallel, they may sometimes be performed in reverse order, which may be dependent on the functions involved. Each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The above-described embodiments of the application are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (8)

1. A method for detecting the sewage aeration quantity based on video analysis, applied to the field of sewage detection, characterized by comprising the following steps:
acquiring video data of the sewage aeration process, wherein the video data comprise video data corresponding to at least two aeration periods;
calculating edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods;
calculating an approximation degree value from the edge density difference amounts of the two adjacent aeration periods;
and confirming whether the sewage aeration quantity is abnormal according to the approximation degree value.
2. The method for detecting the sewage aeration quantity based on video analysis according to claim 1, wherein calculating the edge density difference amounts corresponding to two adjacent aeration periods from the video data of the at least two aeration periods specifically comprises:
acquiring the aeration images with the same time sequence number in the two adjacent aeration periods from the video data of the at least two aeration periods;
dividing the aeration images with the same time sequence number into regions of equal area to determine a plurality of sub-aeration images;
and calculating the edge density difference amounts corresponding to the two adjacent aeration periods from the sub-aeration images of the two adjacent aeration periods.
3. The method for detecting aeration rate of sewage based on video analysis according to claim 2, wherein the calculating the edge density difference corresponding to two adjacent aeration periods according to the sub-aeration images corresponding to two adjacent aeration periods specifically comprises:
acquiring edge density characteristics corresponding to the sub-aeration images corresponding to the two adjacent aeration periods;
and calculating the difference of the edge density characteristics corresponding to the sub-aeration images corresponding to the two adjacent aeration periods to obtain the edge density difference corresponding to the two adjacent aeration periods.
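One plausible reading of the edge density characteristic in claim 3 is the proportion of Canny edge pixels in each region sub-image; the sketch below follows that reading and takes the per-region absolute difference between the two adjacent aeration periods. The Canny thresholds (50, 150) are assumed values, and this is only one way to realise the feature.

```python
# Illustrative sketch only: edge density characteristic per region sub-image and
# the per-region edge density difference between two adjacent aeration periods.
import cv2
import numpy as np

def edge_density(sub_image, low=50, high=150):
    """Edge density characteristic of one region sub-image (expects 8-bit grayscale):
    fraction of pixels marked as edges by the Canny operator."""
    edges = cv2.Canny(sub_image, low, high)
    return np.count_nonzero(edges) / edges.size

def edge_density_differences(regions_t, regions_t1):
    """Per-region absolute edge density difference between same-time-sequence
    frames of the t-th and (t+1)-th aeration periods."""
    return [abs(edge_density(a) - edge_density(b)) for a, b in zip(regions_t, regions_t1)]
```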
4. The method for detecting aeration rate of sewage based on video analysis according to claim 3, wherein the calculating an approximation degree value according to the edge density difference corresponding to the two adjacent aeration periods specifically comprises:
calculating a distance weight corresponding to the sub-aeration images according to the sub-aeration images corresponding to the two adjacent aeration periods;
and calculating an approximation degree value according to the edge density difference and the distance weight.
5. The method for detecting aeration rate of sewage based on video analysis according to claim 4, wherein the two adjacent aeration periods are set to be the t-th aeration period and the (t+1)-th aeration period, the aeration images with the same time sequence corresponding to the t-th aeration period and the (t+1)-th aeration period comprise the i-th frame image, the sub-aeration images corresponding to the two adjacent aeration periods are the j-th region sub-images, and the distance weight corresponding to the j-th region sub-image is $w_j$; the calculating the approximation degree value according to the edge density difference and the distance weight specifically comprises:

$$S_i = \sum_{j=1}^{J} w_j \left| D_{t,i,j} - D_{t+1,i,j} \right|$$

wherein $S_i$ is the approximation degree value between the i-th frame image in the t-th aeration period and the i-th frame image in the (t+1)-th aeration period; $D_{t,i,j}$ is the edge density characteristic of the j-th region sub-image of the i-th frame image in the t-th aeration period; $D_{t+1,i,j}$ is the edge density characteristic of the j-th region sub-image of the i-th frame image in the (t+1)-th aeration period; $\left| D_{t,i,j} - D_{t+1,i,j} \right|$ is the edge density difference between the two edge density characteristics; $J$ represents the number of all local regions, and $j$ represents traversal over the $J$ regions; $w_j$ is the distance weight corresponding to the j-th region sub-image.
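Assuming the weighted-sum reading of the formula reconstructed above, the approximation degree value for one pair of same-time-sequence frames could be computed as in the following sketch. The distance weights are taken as a given input here; their derivation from the sub-aeration images (claim 4) is not reproduced, and the function name is an assumption of this sketch.

```python
# Illustrative sketch only, under the reconstructed weighted-sum form:
# S_i = sum_j w_j * |D_{t,i,j} - D_{t+1,i,j}|.
import numpy as np

def approximation_value(edge_density_diffs, distance_weights):
    """Approximation degree value for one pair of same-time-sequence frames."""
    d = np.asarray(edge_density_diffs, dtype=float)  # length J, per-region differences
    w = np.asarray(distance_weights, dtype=float)    # length J, e.g. normalized to sum to 1
    return float(np.sum(w * d))

# Example use with the earlier sketches (hypothetical equal weights):
# diffs = edge_density_differences(split_into_regions(frames_t[0]),
#                                  split_into_regions(frames_t1[0]))
# S_0 = approximation_value(diffs, [1.0 / len(diffs)] * len(diffs))
```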
6. The method for detecting aeration rate of sewage based on video analysis according to any one of claims 1 to 5, wherein the determining whether the aeration rate of sewage is abnormal according to the approximation value specifically comprises:
calculating a standard approximation degree value according to a preset standard calculation method;
comparing the approximation degree value with a standard approximation degree value, and confirming whether the sewage aeration quantity is abnormal or not.
7. The method for detecting aeration rate of sewage based on video analysis according to claim 6, wherein the comparing the approximation degree value with the standard approximation degree value and confirming whether the sewage aeration quantity is abnormal specifically comprises:
When the difference value between the approximation degree value and the standard approximation degree value is larger than a preset threshold value, confirming that the sewage aeration quantity is in an abnormal state;
and when the difference value between the approximation degree value and the standard approximation degree value is smaller than or equal to a preset threshold value, confirming that the sewage aeration quantity is in a normal state.
8. The method for detecting aeration rate of sewage based on video analysis according to claim 6, wherein the calculating a standard approximation value according to a preset standard calculating method specifically comprises:
performing linear fitting on the approximation degree values by using a least square method to obtain a linear fitting function;
and inputting the time sequence number of the aeration period into the linear fitting function to obtain the standard approximation degree value corresponding to the aeration period.
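A minimal sketch of claims 6 to 8, assuming the approximation degree values of earlier aeration periods are available: a least-squares line is fitted over the period time-sequence number, the fitted value for the current period serves as the standard approximation degree value, and the deviation is compared against a preset threshold. The threshold value, function name, and example numbers are assumptions of this sketch.

```python
# Illustrative sketch only of claims 6-8: least-squares fit of past approximation
# degree values over the aeration period number, then a threshold comparison.
import numpy as np

def is_aeration_abnormal(period_numbers, approx_values, current_number, current_value,
                         threshold=0.1):
    slope, intercept = np.polyfit(period_numbers, approx_values, deg=1)  # linear fit
    standard_value = slope * current_number + intercept  # standard approximation degree value
    return abs(current_value - standard_value) > threshold

# e.g. is_aeration_abnormal([1, 2, 3, 4], [0.42, 0.40, 0.41, 0.39], 5, 0.55) -> True
```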
CN202310671503.0A 2023-06-08 2023-06-08 Sewage aeration quantity detection method based on video analysis Active CN116402817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310671503.0A CN116402817B (en) 2023-06-08 2023-06-08 Sewage aeration quantity detection method based on video analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310671503.0A CN116402817B (en) 2023-06-08 2023-06-08 Sewage aeration quantity detection method based on video analysis

Publications (2)

Publication Number Publication Date
CN116402817A true CN116402817A (en) 2023-07-07
CN116402817B (en) 2023-08-15

Family

ID=87014573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310671503.0A Active CN116402817B (en) 2023-06-08 2023-06-08 Sewage aeration quantity detection method based on video analysis

Country Status (1)

Country Link
CN (1) CN116402817B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116693075A (en) * 2023-07-27 2023-09-05 杭州回水科技股份有限公司 Aeration device of activated carbon biological filter

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014206309A1 (en) * 2013-04-05 2014-10-09 Mitutoyo Corporation System and method for obtaining offset images for use for improved edge resolution
CN105776770A (en) * 2016-05-06 2016-07-20 云南大学 Sewage deep purifying device with high adaptivity and method thereof
EP3364342A1 (en) * 2017-02-17 2018-08-22 Cogisen SRL Method for image processing and video compression
WO2018181618A1 (en) * 2017-03-28 2018-10-04 東レ株式会社 Effluent treatment method for membrane separation activated sludge, effluent treatment apparatus, and effluent treatment system management program
US20200397297A1 (en) * 2019-06-20 2020-12-24 Ethicon Llc Noise aware edge enhancement in a pulsed fluorescence imaging system
CN115980050A (en) * 2022-12-15 2023-04-18 深圳市万物云科技有限公司 Water quality detection method and device for water outlet, computer equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014206309A1 (en) * 2013-04-05 2014-10-09 Mitutoyo Corporation System and method for obtaining offset images for use for improved edge resolution
CN105776770A (en) * 2016-05-06 2016-07-20 云南大学 Sewage deep purifying device with high adaptivity and method thereof
EP3364342A1 (en) * 2017-02-17 2018-08-22 Cogisen SRL Method for image processing and video compression
WO2018181618A1 (en) * 2017-03-28 2018-10-04 東レ株式会社 Effluent treatment method for membrane separation activated sludge, effluent treatment apparatus, and effluent treatment system management program
US20200397297A1 (en) * 2019-06-20 2020-12-24 Ethicon Llc Noise aware edge enhancement in a pulsed fluorescence imaging system
CN115980050A (en) * 2022-12-15 2023-04-18 深圳市万物云科技有限公司 Water quality detection method and device for water outlet, computer equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SUN Deyong; WEI Xu: "Cause Analysis and Solutions for Floating Sludge in the CASS Tank of a Wastewater Treatment Plant", Science and Technology Innovation Herald (科技创新导报), no. 18 *
DU Xiangrun; SUN Nan; WANG Meng: "Study of the Gas-Liquid Two-Phase Flow Velocity Field under Varying Aeration Rates Based on PIV Measurement", Journal of Hydraulic Engineering (水利学报), no. 11 *
WANG Jiajun; DUAN Xianhua: "Research on an Improved Canny Operator for Edge Detection of Water-Surface Targets", Computer Era (计算机时代), no. 01 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116693075A (en) * 2023-07-27 2023-09-05 杭州回水科技股份有限公司 Aeration device of activated carbon biological filter
CN116693075B (en) * 2023-07-27 2023-11-21 杭州回水科技股份有限公司 Aeration device of activated carbon biological filter

Also Published As

Publication number Publication date
CN116402817B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
CN108550101B (en) Image processing method, device and storage medium
US9230148B2 (en) Method and system for binarization of two dimensional code image
CN113989313B (en) Edge detection method and system based on image multidimensional analysis
CN116402817B (en) Sewage aeration quantity detection method based on video analysis
JP5308062B2 (en) Method and apparatus for detecting and removing false contours
CN102547365B (en) Black edge detection method and device for video image
CN113109368B (en) Glass crack detection method, device, equipment and medium
Xu et al. A switching weighted vector median filter based on edge detection
EP2383701B1 (en) Image processing method and apparatus
CN104680483B (en) The noise estimation method of image, video image denoising method and device
CN106919883B (en) QR code positioning method and device
CN113362238A (en) Test image processing method and device, electronic equipment and storage medium
CN110555863A (en) moving object detection method and device and computer readable storage medium
CN111582032A (en) Pedestrian detection method and device, terminal equipment and storage medium
CN110637227B (en) Detection parameter determining method and detection device
CN107230212B (en) Vision-based mobile phone size measuring method and system
EP3462409B1 (en) A method for filtering spurious pixels in a depth-map
CN115658410B (en) Method and system for testing fluency of touch screen of electronic equipment and storage medium
Fouad et al. Developing a new methodology for de-noising and gridding cDNA microarray images
Zhu et al. Efficient perceptual-based spatially varying out-of-focus blur detection
CN105894457A (en) Image noise removing method and device
CN111353991A (en) Target detection method and device, electronic equipment and storage medium
Fouad¹ et al. A fully automated method for noisy cDNA microarray image quantification
JP6042313B2 (en) Deterioration detection apparatus for concrete structure, deterioration detection method and program thereof
CN111768357A (en) Image detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant