CN110533626B - All-weather water quality identification method - Google Patents

All-weather water quality identification method

Info

Publication number
CN110533626B
CN110533626B (application number CN201910531661.XA)
Authority
CN
China
Prior art keywords
image
component
water quality
night
daytime
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910531661.XA
Other languages
Chinese (zh)
Other versions
CN110533626A (en)
Inventor
林峰
徐韬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201910531661.XA priority Critical patent/CN110533626B/en
Publication of CN110533626A publication Critical patent/CN110533626A/en
Application granted granted Critical
Publication of CN110533626B publication Critical patent/CN110533626B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01N 21/84: Investigating or analysing materials by the use of optical means; systems specially adapted for particular applications
    • G06T 5/10: Image enhancement or restoration using non-spatial domain filtering
    • G06T 5/70: Image enhancement or restoration; denoising; smoothing
    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/11: Image analysis; region-based segmentation
    • G06T 7/136: Image analysis; segmentation or edge detection involving thresholding
    • G06T 7/90: Image analysis; determination of colour characteristics
    • H04N 23/667: Control of cameras or camera modules; camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/71: Control of cameras or camera modules; circuitry for evaluating the brightness variation
    • H04N 23/80: Camera processing pipelines; components thereof
    • G06T 2207/10016: Image acquisition modality; video; image sequence
    • G06T 2207/20064: Transform domain processing; wavelet transform [DWT]
    • G06T 2207/30181: Subject of image; Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an all-weather water quality identification method, belonging to the field of water quality identification, which comprises the following steps: 1) acquiring video data with a video device and judging whether it is daytime or night; if daytime, switching to the daytime mode, and if night, switching to the night mode; 2) processing the video data into single-frame images; 3) processing the single-frame images shot in the night mode; 4) identifying the single-frame images shot in the night mode after the processing of step 3), together with the single-frame images shot in the daytime mode, and outputting the identification result. The acquired video is processed frame by frame according to the daytime or night mode, and water quality is identified from the processing result. The technique works around the clock: it provides early warning during the day and timely water quality information at night, and therefore has strong practical value.

Description

All-weather water quality identification method
Technical Field
The invention relates to the field of water quality identification, in particular to an all-weather water quality identification method.
Background
The protection of the water environment is very important for economic and social development. Traditional water quality detection methods mainly set sampling points at fixed locations and cross-sections of a water area and analyse the components of water samples, or place an underwater lens to collect water colour and turbidity for analysis. Some water quality identification methods based on intelligent vision also exist, but images captured at night are often too dark to be judged effectively.
There are many methods for detecting water quality. For example, Chinese patent publication No. CN104568797A discloses an on-line monitoring system for sewage chromaticity, which includes a clean water absorption tank, a sample water absorption tank, an optical fiber probe a, an optical fiber probe b, a CCD array detector, and a data acquisition and processing device; the clean water absorption tank is connected to the CCD array detector through optical fiber probe a, the sample water absorption tank is connected to the CCD array detector through optical fiber probe b, and the CCD array detector is connected to the data acquisition and processing device. Chinese patent publication No. CN108051442A discloses a water quality identification method and system based on an intelligent terminal. The method comprises: collecting a current image, containing a graphic mark, of the water resource to be detected; detecting the current chromaticity of the water resource from the colour of the image; extracting the current turbidity from the definition of the graphic mark; calculating the content of suspended matter in the image; and judging the current quality of the water resource from the current chromaticity, turbidity and suspended-matter content. The system comprises an acquisition module, a detection module, an extraction module, a calculation module and a processing module; the method and system help users conveniently detect the quality of local water resources in real time. Chinese patent publication No. CN109118548A discloses a comprehensive intelligent water quality identification method that analyses water quality images with intelligent image recognition technology to identify the water quality.
The above techniques often require special equipment, which limits their range of use. For example, abnormal water quality in remote regions cannot be discovered and addressed in time; chemical detection methods suffer from unstable reagent concentrations and easily cause secondary pollution; and intelligent-image monitoring and recording typically requires bright daytime light, which restricts its application.
Disclosure of Invention
The invention aims to provide an all-weather water quality identification method that uses a conventional camera equipped with an ordinary flash lamp. It only requires a data connection to the existing cameras around a water area: it judges whether the daytime or night mode should be used, then obtains the video images from the camera, and performs all-weather, real-time water quality identification of the water area. It can also be used for water quality monitoring in remote regions.
In order to achieve the purpose, the all-weather water quality identification method provided by the invention comprises the following steps:
1) acquiring video data with a video device and judging whether it is daytime or night; if daytime, switching to the daytime mode; if night, switching to the night mode;
2) processing the video data into a single frame image;
3) processing the single-frame image shot in the night mode to obtain an enhanced night image;
4) identifying the single-frame images shot in the night mode after the processing of step 3), together with the single-frame images shot in the daytime mode, and outputting an identification result.
In the above technical scheme, a high-definition camera records the water surface of the monitored area from a fixed angle and distance. Note that the camera must be equipped with a flash lamp, and parameters such as the angle and distance must remain fixed and not be changed arbitrarily. The acquired video is then processed frame by frame: the video images are processed according to the daytime or night mode, and water quality is identified from the result. The technique works around the clock: it provides early warning during the day and timely water quality information at night, giving it strong practical value.
Preferably, in step 1), the step of determining whether the day or night is:
1-1) converting a color space, converting a video image into an HSV color space, and independently extracting a brightness V component for image enhancement;
1-2) taking the brightness mean value of the V component after image enhancement processing as a judgment standard.
The V luminance mean in the HSV color space is selected as the judgment criterion. Preferably, the threshold is set to 0.1: when the V luminance mean in the HSV color space is lower than 0.1, the current image is judged to be a night image and night-mode processing is performed; otherwise, it is judged to be a daytime image and daytime-mode image processing is performed.
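As an illustration only, a minimal Python sketch of this day/night decision is given below, assuming OpenCV and NumPy; the function name is illustrative, and the image enhancement applied to the V component in step 1-1) is omitted for brevity.

```python
# Minimal sketch of the day/night decision, assuming OpenCV and NumPy.
# The patent enhances the V component before taking its mean; that step is omitted here.
import cv2
import numpy as np

def is_night(frame_bgr, threshold=0.1):
    """Return True when the mean V (brightness) of the HSV image falls below the threshold."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    v = hsv[:, :, 2].astype(np.float32) / 255.0  # normalize V to [0, 1]
    return float(v.mean()) < threshold
```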
Preferably, step 3) comprises:
3-1) image preprocessing, namely removing light spots, impulse noise and salt and pepper noise areas by adopting percentage segmentation;
3-2) converting the color space of the image into HSV color space, and independently extracting a brightness V component for image enhancement;
3-3) performing wavelet transformation on the V component subjected to image enhancement processing, dividing low frequency and high frequency, performing nonlinear transformation on a low frequency band, and filtering a high frequency coefficient by Gaussian filtering on a high frequency band to remove image noise;
and 3-4) re-synthesizing the reconstructed V component, the H component and the S component to obtain an enhanced night image.
Preferably, in step 3-3), the method for performing the nonlinear transformation in the low frequency band is as follows:
drawing a brightness histogram of the V component, and performing self-adaptive adjustment on parameters of the histogram by adopting a logarithmic mapping equation with self-adjusting parameters, wherein the logarithmic mapping equation is as follows:
V_o = a + b × ln(V_i + 1)
where V_i is the input luminance component and V_o is the output luminance component; a and b are two adjustable parameters: a is the intercept, which translates the gray level, and b is the contrast-increase rate, which adjusts how fast the contrast grows.
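For illustration, a minimal Python sketch of this logarithmic mapping is given below, assuming NumPy and a V component scaled to [0, 1]; the default values of a and b follow the initial parameters given later in the text.

```python
# Sketch of the self-adjusting logarithmic mapping V_o = a + b * ln(V_i + 1),
# assuming NumPy and a V component scaled to [0, 1].
import numpy as np

def log_mapping(v, a=0.1, b=2.0):
    """Nonlinear brightness transform applied to the low-frequency V band."""
    return a + b * np.log(v + 1.0)
```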
Preferably, the histogram parameter adaptive adjustment process is as follows:
(1) setting the initial parameters to a0 = 0.1 and b0 = 2;
(2) The V component is subjected to nonlinear transformation according to the initial parameters;
(3) taking the brightness histogram of the transformed result, processing it with the percentage segmentation function, and obtaining the upper and lower limit gray values of the histogram;
For an image whose gray histogram is roughly normally distributed, it is sometimes necessary to keep only the main portion of the histogram and remove the sparsely populated regions on both sides; segmenting in this percentage manner is called percentage segmentation. This segmentation is mainly used when processing gray-scale images: after translation, stretching and similar operations, the main part of the histogram can be obtained by percentage segmentation while the influence of the small-proportion regions is ignored. Percentage segmentation deletes the G1 and G2 regions and keeps only the main region G. The main purpose of this operation is to find suitable upper and lower limits for the gray level.
(4) Converting the upper and lower limit gray values of the histogram into a standard interval through adaptive adjustment. Because the conditions of live images differ, a uniform standard brightness interval must be set so that images under different illumination conditions can be compared uniformly; all night images are converted into this standard brightness interval during processing, so the subsequent color difference calculation and color feature extraction are performed at the same brightness, which greatly reduces errors.
The proportional feedback adjustment of the parameters is: a = a - da × d1; b = b - db × d2;
the differences from the standard interval are calculated as: da = gvl - 0.5; db = gvh - 0.9;
where a and b are the two parameters to be adjusted, gvl is the lower limit of the main brightness interval obtained after percentage segmentation, gvh is the corresponding upper limit, da is the difference between the lower limit and the standard lower limit, and db is the difference between the upper limit and the standard upper limit. d1 is the scaling factor in the feedback adjustment of the a parameter, and d2 is the scaling factor in the feedback adjustment of the b parameter.
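A possible Python sketch of this feedback adjustment follows. It assumes a V component scaled to [0, 1], the standard interval [0.5, 0.9] implied by the formulas above, and a percentile-based stand-in for the percentage segmentation function; the percentile values, tolerance and iteration limit are assumptions rather than values from the patent.

```python
# Sketch of the adaptive parameter adjustment loop; percentile limits, tolerance
# and iteration cap are assumptions. Following the text, a is adjusted first
# (lower limit), then b (upper limit).
import numpy as np

def percentage_limits(v, low_pct=5, high_pct=95):
    """Assumed stand-in for percentage segmentation: keep the central histogram part."""
    return np.percentile(v, low_pct), np.percentile(v, high_pct)

def adapt_parameters(v, a=0.1, b=2.0, d1=0.5, d2=5.0, tol=0.02, max_iter=50):
    for _ in range(max_iter):
        vo = np.clip(a + b * np.log(v + 1.0), 0.0, 1.0)   # nonlinear transform
        gvl, gvh = percentage_limits(vo)                   # main gray-level interval
        da, db = gvl - 0.5, gvh - 0.9                      # differences from the standard interval
        if abs(da) <= tol and abs(db) <= tol:              # both limits within the standard interval
            break
        if abs(da) > tol:
            a = a - da * d1                                # adjust a first (lower limit)
        else:
            b = b - db * d2                                # then adjust b (upper limit)
    return a, b
```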
Preferably, step 4) comprises:
4-1) filtering the image by adopting a median filtering method;
4-2) image segmentation, and global threshold segmentation is adopted.
4-3) calculating chromatic aberration, namely, calculating the chromatic aberration after converting the color space of the image into a Lab color space;
4-4) extracting color features, wherein the color difference mean value is used as the color features;
4-5) water quality identification, when the monitoring value exceeds a preset early warning value, the system judges that the water quality is sewage, and the formula is as follows:
[Formula presented as an image in the original document.]
In the formula, Y is the monitoring value, ave is the mean color difference of the water surface area, and imave is the mean color difference between a standard image and the origin image, where the origin image is defined as an image whose a and b components in the Lab color space are both -110.
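To make steps 4-1) to 4-5) concrete, a hedged Python sketch using OpenCV is given below. The exact formula for the monitoring value Y appears only as an image in the original document, so the ratio Y = ave / imave used here, the Otsu choice of global threshold, the median-filter kernel size and the default early-warning value are all assumptions.

```python
# Sketch of the daytime pipeline (median filter, global threshold segmentation,
# Lab color difference, color feature, water quality judgment), assuming OpenCV.
# imave is supplied as the precomputed mean color difference between a standard
# image and the origin image (a = b = -110 in Lab); warning is a placeholder value.
import cv2
import numpy as np

def monitor_value(frame_bgr, imave, warning=0.8):
    denoised = cv2.medianBlur(frame_bgr, 5)                          # 4-1) median filtering
    gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)     # 4-2) global threshold (Otsu assumed)
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB).astype(np.float32)
    a_ch = lab[:, :, 1] - 128.0                                      # OpenCV stores a and b with a +128 offset
    b_ch = lab[:, :, 2] - 128.0
    diff = np.sqrt((a_ch + 110.0) ** 2 + (b_ch + 110.0) ** 2)        # 4-3) color difference to the origin image
    ave = float(diff[mask > 0].mean())                               # 4-4) mean over the (assumed) water region
    y = ave / imave                                                  # 4-5) assumed form of the monitoring value Y
    return y, y > warning                                            # polluted when Y exceeds the early-warning value
```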
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention uses intelligent vision to process, analyse and identify water quality images that are difficult to recognise with the naked eye at night, and is highly practical.
(2) The invention enables long-term online and dynamic monitoring, greatly reduces manual workload, and is of great significance for overcoming the after-the-fact nature of current water quality monitoring and for building a more effective, lower-cost water quality safety early-warning system.
Drawings
FIG. 1 is a flow chart of an all-weather water quality identification method in an embodiment of the invention;
FIG. 2 is a flow chart of diurnal mode image processing and identification in accordance with an embodiment of the present invention;
FIG. 3 is a flowchart illustrating night mode image processing and recognition according to an embodiment of the present invention;
FIG. 4 is a night time raw image of an embodiment of the present invention;
FIG. 5 is a histogram of the V component of a night image according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating adaptive adjustment of histogram parameters according to an embodiment of the present invention;
FIG. 7 is a histogram with percentage segmentation according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described with reference to the following embodiments and accompanying drawings.
Examples
Referring to fig. 1 to 7, the all-weather water quality identification method of the embodiment includes the following steps:
S1 acquires a video image.
S2 judges whether it is daytime or night. The specific steps are as follows:
(1) performing color space conversion; this embodiment converts the image into the HSV color space and independently extracts the brightness V component for image enhancement.
(2) Selecting the V brightness mean in the HSV color space as the judgment criterion; in this example the threshold is set to 0.1, that is, when the V luminance mean in the HSV color space is lower than 0.1, the current image is judged to be a night image and night-mode processing is performed; otherwise, it is judged to be a daytime image and daytime-mode image processing is performed.
S3 daytime mode shooting and image processing recognition is shown in fig. 2, and includes the following steps:
S301, filtering the image; this example uses a median filtering method.
S302, image segmentation; this example employs global threshold segmentation.
S303, calculating chromatic aberration; and converting into Lab color space and then performing color difference calculation.
S304, extracting color features; this example uses the color difference mean as the color feature.
S305, identifying water quality; when the monitoring value exceeds the early warning value, the system judges that the water quality is sewage, and the specific formula is as follows:
[Formula presented as an image in the original document.]
In the formula, Y is the monitoring value, ave is the mean color difference of the water surface area, and imave is the mean color difference between the standard image and the origin image. This example defines the origin image as an image whose a and b components in the Lab color space are both -110.
S4 night mode shooting and image processing recognition is shown in fig. 3, and includes the following steps:
S401, preprocessing the image; this embodiment uses percentage segmentation to remove light spots, part of the impulse noise, salt-and-pepper noise and other such areas.
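One plausible reading of this preprocessing step is sketched below: the extreme tails of the gray-level histogram are clipped so that light spots and isolated impulse or salt-and-pepper pixels no longer dominate later statistics. The percentile values are assumptions.

```python
# Assumed interpretation of the percentage-segmentation preprocessing: clip the
# darkest and brightest tails of the gray-level histogram so that light spots and
# impulse / salt-and-pepper outliers are suppressed. Percentages are illustrative.
import numpy as np

def clip_histogram_tails(gray, drop_low=0.01, drop_high=0.01):
    lo = np.quantile(gray, drop_low)
    hi = np.quantile(gray, 1.0 - drop_high)
    return np.clip(gray, lo, hi)
```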
S402, converting the color space; this embodiment converts the image into the HSV color space and independently extracts the brightness V component for image enhancement.
S403, performing wavelet transformation on the V component; in this example, wavelet transform is used to separate low and high frequencies, nonlinear transform is performed in the low frequency band to prevent amplification of noise, and Gaussian filtering is used in the high frequency band to filter high frequency coefficients to remove image noise. Wherein the low-frequency band nonlinear transformation is described as follows:
the histogram of the V component of the night image is shown in fig. 5, and the image is excessively concentrated in the low gray level region, which is not beneficial to the subsequent analysis process. This example uses a logarithmic mapping equation with self-adjusting parameters as shown in the following equation:
V_o = a + b × ln(V_i + 1)
where V_i is the input luminance component and V_o is the output luminance component; a and b are two adjustable parameters: a is the intercept, which translates the gray level, and b is the contrast-increase rate, which adjusts how fast the contrast grows. The +1 in the formula prevents taking the logarithm of zero. The adaptive histogram parameter adjustment process is shown in FIG. 6 and implemented as follows:
(1) setting initial parameters; the initial parameters of this example are a0 = 0.1 and b0 = 2.
(2) Carrying out nonlinear transformation according to the parameters;
(3) taking the brightness histogram of the transformed result, processing it with the percentage segmentation function, and obtaining the upper and lower limit gray values of the histogram;
(4) converting the upper and lower limit gray values of the histogram into a standard interval through adaptive adjustment. Because the conditions of live images differ, a uniform standard brightness interval must be set so that images under different illumination conditions can be compared uniformly; all night images are converted into this standard brightness interval during processing, so the subsequent color difference calculation and color feature extraction are performed at the same brightness, which greatly reduces errors. The adaptive histogram parameter adjustment process is shown in FIG. 6 and comprises the following specific steps:
For an image whose gray histogram is roughly normally distributed, it is sometimes necessary to keep only the main portion of the histogram and remove the sparsely populated regions on both sides; segmenting in this percentage manner is called percentage segmentation. This segmentation is mainly used when processing gray-scale images: after translation, stretching and similar operations, the main part of the histogram can be obtained by percentage segmentation while the influence of the small-proportion regions is ignored. The percentage segmentation is illustrated in FIG. 7: the G1 and G2 regions are deleted and only the main region G is kept. The main purpose of this operation is to find suitable upper and lower limits for the gray level.
S404 sets initial values of two parameters a and b.
S405, carrying out the nonlinear transformation, with the formula: V_o = a + b × ln(V_i + 1).
S406, percentage segmentation is carried out to obtain upper and lower limit values of the histogram.
S407 determines whether the upper and lower limit values are within the standard interval.
S408, calculating the differences from the standard interval, with the formulas: da = gvl - 0.5; db = gvh - 0.9.
S409, calculating the proportional feedback adjustment of the parameters, with the formulas: a = a - da × d1; b = b - db × d2.
a and b are two parameters to be adjusted, gvl is the lower limit of the main brightness interval obtained after percentage division, gvh is the corresponding upper limit, da is the difference between the lower limit and the standard lower limit (0.5 in this example), and db is the difference between the upper limit and the standard upper limit (0.9 in this example). d1 is the scaling factor for feedback adjustment of the a parameter, 0.5 in this example, and d2 is the scaling factor for feedback adjustment of the b parameter, 5 in this example.
The self-adjusting process is as follows: first apply the proportional feedback adjustment of a until the transformed lower limit meets the condition, then apply the proportional feedback adjustment of b until the transformed upper limit meets the condition.
S410 obtains the adjusted values of the a and b parameters.
S411, re-synthesizing the reconstructed V component, the H component and the S component to obtain an enhanced night image; the night image becomes a high-contrast and high-brightness image under the same brightness standard after wavelet transformation and self-adaptive nonlinear transformation.
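Putting steps S402 to S411 together, a hedged Python sketch of the night-mode enhancement could look as follows, assuming PyWavelets for the single-level 2-D wavelet transform and SciPy for Gaussian filtering of the high-frequency bands; the wavelet basis, the Gaussian sigma and the reuse of the adapted parameters (a, b) are assumptions.

```python
# Sketch of the night-mode enhancement: HSV split, wavelet decomposition of V,
# logarithmic transform of the low-frequency band, Gaussian filtering of the
# high-frequency bands, reconstruction, and recombination with H and S.
import cv2
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter

def enhance_night(frame_bgr, a=0.1, b=2.0):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v = v.astype(np.float32) / 255.0
    # S403: single-level wavelet decomposition into low- and high-frequency bands
    cA, (cH, cV, cD) = pywt.dwt2(v, 'haar')
    cA = a + b * np.log(cA + 1.0)                                   # nonlinear transform of the low-frequency band
    cH, cV, cD = [gaussian_filter(c, sigma=1.0) for c in (cH, cV, cD)]  # Gaussian filtering of high frequencies
    v_rec = np.clip(pywt.idwt2((cA, (cH, cV, cD)), 'haar'), 0.0, 1.0)
    # S411: recombine the reconstructed V with the original H and S components
    v_out = (v_rec[:v.shape[0], :v.shape[1]] * 255.0).astype(np.uint8)
    return cv2.cvtColor(cv2.merge([h, s, v_out]), cv2.COLOR_HSV2BGR)
```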
S412 further processes the image in the daytime mode, and the processes from S301 to S305 are executed.
S5 outputs and displays the recognition result.

Claims (4)

1. An all-weather water quality identification method is characterized by comprising the following steps:
1) acquiring video data by using video equipment, judging whether the video data is in the daytime or at night, and if the video data is in the daytime, switching to a daytime mode; if the night is judged, the night mode is switched to;
2) processing the video data into a single frame image;
3) processing the single-frame image shot in the night mode to obtain an enhanced night image;
4) identifying the single frame image shot in the night mode and the single frame image shot in the daytime mode after being processed in the step 3), and outputting an identification result;
the step 3) comprises the following steps:
3-1) image preprocessing, namely removing light spots, impulse noise and salt and pepper noise areas by adopting percentage segmentation;
3-2) converting the color space of the image into HSV color space, and independently extracting a brightness V component for image enhancement;
3-3) performing wavelet transformation on the V component subjected to image enhancement processing, dividing low frequency and high frequency, performing nonlinear transformation on a low frequency band, and filtering a high frequency coefficient by Gaussian filtering on a high frequency band to remove image noise;
3-4) re-synthesizing the reconstructed V component, the H component and the S component to obtain an enhanced night image;
in step 3-3), the method for performing nonlinear transformation in the low frequency band is as follows:
drawing a brightness histogram of the V component, and performing self-adaptive adjustment on parameters of the histogram by adopting a logarithmic mapping equation with self-adjusting parameters, wherein the logarithmic mapping equation is as follows:
V_o = a + b × ln(V_i + 1)
wherein V_i represents the input luminance component and V_o represents the output luminance component; a and b represent two adjustable parameters, a representing the intercept, which translates the gray level, and b representing the contrast-increase rate, which adjusts how fast the contrast grows;
the histogram parameter adaptive adjustment process is as follows:
(1) setting the initial parameters to a0 = 0.1 and b0 = 2;
(2) The V component is subjected to nonlinear transformation according to the initial parameters;
(3) acquiring a brightness histogram processed and transformed by a percentage segmentation function to obtain upper and lower limit gray values of the histogram;
(4) converting the upper and lower limit gray values of the histogram into a standard interval through self-adaptive adjustment;
the proportional feedback adjustment of the parameters is: a = a - da × d1; b = b - db × d2;
the differences from the standard interval are calculated as: da = gvl - 0.5; db = gvh - 0.9;
wherein a and b are the two parameters to be adjusted, gvl is the lower limit of the main brightness interval obtained after percentage segmentation, gvh is the corresponding upper limit, da is the difference between the lower limit and the standard lower limit, and db is the difference between the upper limit and the standard upper limit; d1 is the scaling factor in the feedback adjustment of the a parameter, and d2 is the scaling factor in the feedback adjustment of the b parameter.
2. The all-weather water quality recognition method according to claim 1, wherein the step of determining whether it is daytime or nighttime in step 1) comprises:
1-1) converting a color space, converting a video image into an HSV color space, and independently extracting a brightness V component for image enhancement;
1-2) taking the brightness mean value of the V component after image enhancement processing as a judgment standard.
3. The all-weather water quality identification method according to claim 2, wherein in step 1-2), the judgment threshold of the brightness mean value of the V component is set to be 0.1, and when the V brightness mean value in the HSV color space is lower than 0.1, the judgment is made to be night; otherwise, judging the day.
4. The all-weather water quality identification method according to claim 1, wherein the step 4) comprises:
4-1) filtering the image by adopting a median filtering method;
4-2) image segmentation, wherein global threshold segmentation is adopted;
4-3) calculating chromatic aberration, namely, calculating the chromatic aberration after converting the color space of the image into a Lab color space;
4-4) extracting color features, wherein the color difference mean value is used as the color features;
4-5) water quality identification, when the monitoring value exceeds a preset early warning value, the system judges that the water quality is sewage, and the formula is as follows:
[Formula presented as an image in the original document.]
In the formula, Y is the monitoring value, ave is the mean color difference of the water surface area, and imave is the mean color difference between a standard image and the origin image, where the origin image is defined as an image whose a and b components in the Lab color space are both -110.
CN201910531661.XA 2019-06-19 2019-06-19 All-weather water quality identification method Active CN110533626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910531661.XA CN110533626B (en) 2019-06-19 2019-06-19 All-weather water quality identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910531661.XA CN110533626B (en) 2019-06-19 2019-06-19 All-weather water quality identification method

Publications (2)

Publication Number Publication Date
CN110533626A CN110533626A (en) 2019-12-03
CN110533626B true CN110533626B (en) 2021-11-09

Family

ID=68659429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910531661.XA Active CN110533626B (en) 2019-06-19 2019-06-19 All-weather water quality identification method

Country Status (1)

Country Link
CN (1) CN110533626B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111639656A (en) * 2020-05-28 2020-09-08 东软睿驰汽车技术(沈阳)有限公司 Traffic signal lamp identification method and device
CN112067517A (en) * 2020-09-11 2020-12-11 杭州市地下管道开发有限公司 Intelligent monitoring method, equipment and system for river and lake water body and readable storage medium
CN112461762B (en) * 2020-11-26 2023-05-12 中国科学院苏州生物医学工程技术研究所 Solution turbidity detection method, medium and image processing system based on HSV model

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202565375U (en) * 2012-04-19 2012-11-28 合肥博微安全电子科技有限公司 All-weather intelligent video surveillance camera
CN107917919A (en) * 2017-12-18 2018-04-17 广东技术师范学院 Urban waterway water quality monitoring early warning system and method
CN109087363A (en) * 2018-06-28 2018-12-25 江南大学 A kind of sewage discharge detection method based on hsv color space
CN109118548A (en) * 2018-07-17 2019-01-01 浙江大学 A kind of comprehensive intelligent water quality recognition methods

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202565375U (en) * 2012-04-19 2012-11-28 合肥博微安全电子科技有限公司 All-weather intelligent video surveillance camera
CN107917919A (en) * 2017-12-18 2018-04-17 广东技术师范学院 Urban waterway water quality monitoring early warning system and method
CN109087363A (en) * 2018-06-28 2018-12-25 江南大学 A kind of sewage discharge detection method based on hsv color space
CN109118548A (en) * 2018-07-17 2019-01-01 浙江大学 A kind of comprehensive intelligent water quality recognition methods

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Low-light image enhancement based on Retinex theory and dual-tree complex wavelet transform; Yang Maoxiang et al.; Optoelectronics Letters; 30 Nov. 2018; vol. 14, no. 6; pp. 471-473, Fig. 1 *
Research on key technologies of on-line intelligent monitoring of urban sewage discharge; Wang Xiajun et al.; Science & Technology Information; 5 Apr. 2012; sections 2-3 *
Research on water quality detection technology based on a color difference model; Liang Xiuli; China Master's Theses Full-text Database (Engineering Science and Technology I); 15 Jun. 2018; pp. 8-9 *

Also Published As

Publication number Publication date
CN110533626A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN110414334B (en) Intelligent water quality identification method based on unmanned aerial vehicle inspection
CN110533626B (en) All-weather water quality identification method
CN107507173B (en) No-reference definition evaluation method and system for full-slice image
CN113592861B (en) Bridge crack detection method based on dynamic threshold
CN110895806A (en) Method and system for detecting screen display defects
CN111294589B (en) Camera module lens surface detection method
KR101448164B1 (en) Method for Image Haze Removal Using Parameter Optimization
US9811746B2 (en) Method and system for detecting traffic lights
CN111161170B (en) Underwater image comprehensive enhancement method for target recognition
CN112149543B (en) Building dust recognition system and method based on computer vision
CN115619793B (en) Power adapter appearance quality detection method based on computer vision
CN111598791B (en) Image defogging method based on improved dynamic atmospheric scattering coefficient function
CN111611907B (en) Image-enhanced infrared target detection method
CN109472788B (en) Method for detecting flaw on surface of airplane rivet
CN114004834B (en) Method, equipment and device for analyzing foggy weather condition in image processing
CN111353968B (en) Infrared image quality evaluation method based on blind pixel detection and analysis
CN113313677A (en) Quality detection method for X-ray image of wound lithium battery
CN115731493A (en) Rainfall micro physical characteristic parameter extraction and analysis method based on video image recognition
Karnawat et al. Turbidity detection using image processing
CN115601379A (en) Surface crack accurate detection technology based on digital image processing
CN112750089B (en) Optical remote sensing image defogging method based on local block maximum and minimum pixel prior
CN110223253B (en) Defogging method based on image enhancement
CN117036259A (en) Metal plate surface defect detection method based on deep learning
CN115861304A (en) Method and system for detecting steel strip-shaped structure based on image processing
CN109448012A (en) A kind of method for detecting image edge and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant