CN112528968B - Raise dust detection method and system applied to urban management - Google Patents

Raise dust detection method and system applied to urban management Download PDF

Info

Publication number
CN112528968B
Authority
CN
China
Prior art keywords
dust
data stream
image
video data
raise dust
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110173982.4A
Other languages
Chinese (zh)
Other versions
CN112528968A (en)
Inventor
王强 (Wang Qiang)
王静宇 (Wang Jingyu)
梅一多 (Mei Yiduo)
谷雨明 (Gu Yuming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongguancun Smart City Co Ltd
Original Assignee
Zhongguancun Smart City Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongguancun Smart City Co Ltd filed Critical Zhongguancun Smart City Co Ltd
Priority to CN202110173982.4A priority Critical patent/CN112528968B/en
Publication of CN112528968A publication Critical patent/CN112528968A/en
Application granted granted Critical
Publication of CN112528968B publication Critical patent/CN112528968B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/758Involving statistics of pixels or of feature values, e.g. histogram matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/46Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/48Matching video sequences
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Tourism & Hospitality (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Computational Linguistics (AREA)
  • Databases & Information Systems (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a fugitive dust detection method and system applied to urban management. The method comprises the steps of: acquiring a video data stream and a sensor data stream, and preprocessing the video data stream and the sensor data stream; acquiring a dust image, comparing the dust image with the frames in the video data stream, and identifying dust images; acquiring a plurality of sensor data streams and comparing each of them with a preset dust threshold, wherein a data stream exceeding its dust threshold is a dust data stream; and inputting the dust images and the dust data streams as samples into a pre-trained classifier, detecting whether fugitive dust is present, transmitting the detection result to a server, and displaying it visually on an upper computer. The invention collects real-time video streams and sensor data in a fixed scene and distinguishes the dust and non-dust states through data comparison, thereby accurately identifying fugitive dust.

Description

Raise dust detection method and system applied to urban management
Technical Field
The invention relates to the technical field of dust emission detection, in particular to a dust emission detection method and system applied to urban management.
Background
Fugitive dust is an open pollution source that enters the atmosphere when dust on the ground is lifted by wind, human activity and other forces, and it is an important component of the total suspended particulates in ambient air. Dust particles are generated during human activities such as house construction, road and pipeline construction, building demolition, material transport, material stockpiling, road cleaning, and plant planting and maintenance, and they readily pollute the atmosphere.
A traditional fugitive dust detection method proceeds as follows: an image to be identified is obtained from a fixed dust-source collection point; a decomposition-domain transform is applied in the spatial domain so that the image information is decomposed into several scale spaces; edge detection is performed on the image to be identified using a gradient operator; and the feature vector obtained from the edge detection is extracted and pattern recognition is performed, thereby identifying the image. Because external environmental factors (illumination, interference from other objects, and the like) place high demands on picture quality and strongly affect feature extraction, noise can only be removed through repeated image preprocessing, and the added uncertainty results in low image-recognition accuracy.
Disclosure of Invention
In order to accurately identify fugitive dust in the air and provide an instructive solution for subsequent urban management and environmental-protection monitoring, the embodiments of the invention provide a fugitive dust detection method and system applied to urban management. The specific technical solution is as follows:
To achieve the above object, a first aspect of the embodiments of the present invention provides a fugitive dust detection method applied to urban management, comprising the steps of:
acquiring a video data stream and a sensor data stream of a fixed scene, and preprocessing the video data stream and the sensor data stream;
acquiring a dust image, comparing the dust image with the frames in the video data stream, and identifying dust images;
acquiring a plurality of sensor data streams and comparing each of them with a preset dust threshold, wherein a data stream exceeding its dust threshold is a dust data stream;
and inputting the dust image and the dust data stream as samples into a pre-trained classifier, detecting whether fugitive dust is present, transmitting the detection result to a server, and displaying it visually on an upper computer.
Further, comparing the dust image with the frames in the video data stream to identify dust images specifically comprises the following steps:
comparing the histogram of the dust image with the histogram of an image in the video data stream using grayscale-histogram comparison, and judging whether the image in the video data stream is a dust image according to the similarity of the two histograms;
or, matching the feature points of the dust image with the feature points of images in the video data stream using a feature-point matching method, and judging whether an image in the video data stream is a dust image according to the matching degree.
Further, a chi-square similarity calculation is used to compute the distance between the histogram of the dust image and the histogram of an image in the video data stream; if the distance is smaller than a preset similarity threshold, the image in the video data stream is a dust image. The chi-square similarity formula is:
d(H1, H2) = Σ_I [(H1(I) − H2(I))² / H1(I)]
where d represents the distance, H1 represents the grayscale histogram of the dust image, and H2 represents the grayscale histogram of the image in the video data stream.
Further, the method also comprises performing target tracking on the fugitive dust in the video data stream using an optical flow method.
Further, the target tracking of the fugitive dust in the video data stream using the optical flow method specifically comprises the steps of:
processing a sequence of consecutive video frames of the video data stream;
for each video frame in the sequence, detecting whether a dust foreground target appears using a target detection method;
if a dust foreground target appears in a frame, finding representative key feature points of the dust foreground target;
for any two subsequent adjacent video frames, searching for the optimal position in the current frame of the key feature points that appeared in the previous frame, thereby obtaining the position coordinates of the foreground target in the current frame;
and iterating in this way to realize tracking of the dust target.
Further, the classifier is an SVM binary classifier.
Further, matching the feature points of the dust image with the feature points of images in the video data stream using the feature-point matching method, and judging whether an image in the video data stream is a dust image according to the matching degree, specifically comprises:
acquiring an image from the video data stream every 5 frames and comparing it with the dust image;
dividing the image and the dust image each into ten equal parts, randomly extracting five parts of the image, copying them to preset positions in the dust image, and then performing feature extraction;
and fusing the boundaries, matching the feature points, and judging whether the image in the video data stream is a dust image according to the matching degree.
Further, the plurality of sensors includes: an MQ-7 model CO concentration sensor, a YW-51GJ model PM2.5 concentration sensor and a DHT11 digital temperature and humidity sensor.
A second aspect of the embodiments of the present invention provides a dust detection system applied to urban management, including:
the data stream acquisition module is used for acquiring a video data stream and a sensor data stream of a fixed scene and preprocessing the video data stream and the sensor data stream;
the dust image acquisition module is used for acquiring a dust image, comparing the dust image with the image in the video data stream and identifying the dust image;
the sensor data flow acquisition module is used for acquiring various sensor data flows, comparing the sensor data flows with a preset raise dust threshold value respectively, and taking the data flow higher than the raise dust threshold value as the raise dust data flow;
and the detection module is used for inputting the raise dust image and the raise dust data stream into a pre-trained classifier as samples, detecting whether raise dust exists or not, transmitting a detection result to the server, and performing visual display on the upper computer.
A third aspect of the embodiments of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the fugitive dust detection method applied to urban management described above.
A fourth aspect of the present invention provides an electronic apparatus comprising:
a processor; and
a memory arranged to store computer-executable instructions that, when executed, cause the processor to perform the fugitive dust detection method applied to urban management described above.
The embodiments of the invention provide a fugitive dust detection method, device and computer-readable storage medium applied to urban management. The method comprises acquiring a video data stream and a sensor data stream of a fixed scene, and preprocessing the video data stream and the sensor data stream; acquiring a dust image, comparing the dust image with the frames in the video data stream, and identifying dust images; acquiring a plurality of sensor data streams and comparing each of them with a preset dust threshold, wherein a data stream exceeding its dust threshold is a dust data stream; and inputting the dust image and the dust data stream as samples into a pre-trained classifier, detecting whether fugitive dust is present, transmitting the detection result to a server, and displaying it visually on an upper computer. Real-time video streams and sensor data are collected in a fixed scene, and the dust and non-dust states are distinguished through data comparison; a classifier is trained using the identified dust and non-dust data as samples; finally, the collected real-time video stream and sensor data stream are input into the trained classifier to detect whether fugitive dust exists, thereby achieving accurate identification of fugitive dust.
Drawings
Fig. 1 is a flowchart of an implementation of a raise dust detection method applied to urban management according to embodiment 1 of the present invention;
FIG. 2 is a schematic view of a process of an optical flow method target tracking algorithm according to an embodiment of the present invention;
FIG. 3 shows a schematic structural diagram of an electronic device according to one embodiment of the invention;
FIG. 4 shows a schematic structural diagram of a computer-readable storage medium according to one embodiment of the invention;
in the figure: 31-a processor; 32-a memory; 33-storage space; 34-program code; 41-program code.
Detailed Description
In order to clearly and thoroughly show the technical solution of the present invention, the following description is made with reference to the accompanying drawings, but the scope of the present invention is not limited thereto.
Referring to Fig. 1, which is a flowchart of an implementation of the fugitive dust detection method applied to urban management provided in Embodiment 1 of the present invention, the method comprises the steps of:
acquiring a video data stream and a sensor data stream of a fixed scene, and preprocessing the video data stream and the sensor data stream;
acquiring a dust image, comparing the dust image with the frames in the video data stream, and identifying dust images;
acquiring a plurality of sensor data streams and comparing each of them with a preset dust threshold, wherein a data stream exceeding its dust threshold is a dust data stream;
and inputting the dust image and the dust data stream as samples into a pre-trained classifier, detecting whether fugitive dust is present, transmitting the detection result to a server, and displaying it visually on an upper computer.
The video data stream refers to video data captured by a camera device in a specific scene; the sensor data stream refers to data about the specific scene obtained from sensors. In the embodiment of the invention, a plurality of sensors are used to measure the CO concentration, the PM2.5 concentration and the temperature and humidity of the environment: the CO concentration is measured with an MQ-7 sensor; the PM2.5 concentration is measured with a YW-51GJ dust sensor operating in serial-port mode; and the temperature and humidity are measured with a DHT11 digital sensor, a composite temperature-and-humidity sensor with calibrated digital signal output.
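To make the threshold-comparison step concrete, the following is a minimal Python sketch that flags a sensor data stream as a dust data stream when its reading exceeds its preset dust threshold; the threshold values and the reading format are illustrative assumptions, not values specified by this embodiment.

```python
# Minimal sketch of the sensor-threshold step: a reading that exceeds its
# preset dust threshold marks that sensor stream as a "dust data stream".
# The threshold values below are hypothetical placeholders.

DUST_THRESHOLDS = {
    "co_ppm": 9.0,           # MQ-7 CO concentration
    "pm25_ugm3": 75.0,       # YW-51GJ PM2.5 concentration
    "temperature_c": 35.0,   # DHT11 temperature
    "humidity_pct": 80.0,    # DHT11 humidity
}

def flag_dust_streams(readings: dict) -> dict:
    """Return the subset of sensor readings that exceed their dust threshold."""
    return {
        name: value
        for name, value in readings.items()
        if name in DUST_THRESHOLDS and value > DUST_THRESHOLDS[name]
    }

if __name__ == "__main__":
    sample = {"co_ppm": 12.3, "pm25_ugm3": 180.0, "temperature_c": 28.0, "humidity_pct": 40.0}
    print(flag_dust_streams(sample))  # -> {'co_ppm': 12.3, 'pm25_ugm3': 180.0}
```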
The preprocessing comprises screening the data by the histogram similarity between the dust image and non-dust images, or screening the data by feature matching between the dust image and non-dust images.
In an optional embodiment of the present invention, dust images are identified by comparing the dust image with the frames in the video data stream, specifically comprising the following steps:
comparing the histogram of the dust image with the histogram of an image in the video data stream using grayscale-histogram comparison, and judging whether the image in the video data stream is a dust image according to the similarity of the two histograms;
or, matching the feature points of the dust image with the feature points of images in the video data stream using a feature-point matching method, and judging whether an image in the video data stream is a dust image according to the matching degree.
Matching the feature points of the dust image with the feature points of images in the video data stream using the feature-point matching method, and judging whether an image in the video data stream is a dust image according to the matching degree, specifically comprises:
acquiring an image from the video data stream every 5 frames and comparing it with the dust image;
dividing the image and the dust image each into ten equal parts, randomly extracting five parts of the image, copying them to preset positions in the dust image, and then performing feature extraction;
and fusing the boundaries, matching the feature points, and judging whether the image in the video data stream is a dust image according to the matching degree.
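As an illustration of the feature-point matching step, the sketch below uses OpenCV ORB features and brute-force matching as stand-ins; the embodiment only specifies feature-point matching and a matching-degree judgment, so the detector choice, the ratio test and the MATCH_THRESHOLD value are assumptions, and the ten-part splitting and boundary fusion described above are not reproduced here.

```python
import cv2

MATCH_THRESHOLD = 0.15  # hypothetical matching-degree threshold

def dust_match_score(frame_gray, dust_gray, ratio=0.75):
    """Match ORB feature points between a sampled video frame and the
    reference dust image; return a rough matching degree (the fraction of
    frame keypoints with a good match in the dust image)."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(frame_gray, None)
    kp2, des2 = orb.detectAndCompute(dust_gray, None)
    if des1 is None or des2 is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good) / max(len(kp1), 1)

def is_dust_frame(frame_bgr, dust_bgr):
    """Judge a frame (sampled every 5 frames) by its matching degree."""
    frame_gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    dust_gray = cv2.cvtColor(dust_bgr, cv2.COLOR_BGR2GRAY)
    return dust_match_score(frame_gray, dust_gray) >= MATCH_THRESHOLD
```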
In an optional embodiment of the present invention, the histogram data of the two images are calculated and the similarity of the two sets of data is compared, thereby obtaining the degree of similarity between the two images. Four standard distance measures are available; preferably, in the embodiment of the present invention, a chi-square similarity calculation is used to compute the distance between the histogram of the dust image and the histogram of an image in the video data stream. If the distance is smaller than a preset similarity threshold, the image in the video data stream is a dust image. The chi-square similarity formula is:
d(H1, H2) = Σ_I [(H1(I) − H2(I))² / H1(I)]
where d represents the distance, H1 represents the grayscale histogram of the dust image, and H2 represents the grayscale histogram of the image in the video data stream.
The classifier is an SVM binary classifier, and its training process is as follows:
Given a dust training dataset on the feature space, T = {(x1, y1), (x2, y2), …, (xN, yN)}, where xi ∈ X = R^n and yi ∈ Y = {+1, −1} for i = 1, 2, …, N; xi is the i-th feature vector, also referred to as an instance, and yi is the class label of xi. When yi = +1, xi is called a positive example; when yi = −1, xi is called a negative example. The pair (xi, yi) is referred to as a sample point.
Generally, when the training dataset is linearly separable, the optimal separating hyperplane is obtained by maximizing the margin of the support vector machine; when the training data are not linearly separable, a nonlinear support vector machine is trained using the kernel trick and soft-margin maximization.
The trained SVM classifier is used to detect whether dust pollutants exist; the result is transmitted to the server and displayed on the upper computer.
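The following is a minimal sketch of the binary SVM classifier described above, using scikit-learn as a stand-in implementation; the feature layout (a flattened grayscale histogram concatenated with the sensor readings) and the RBF kernel (corresponding to the kernel-trick case for non-linearly-separable data) are illustrative assumptions rather than requirements of this embodiment.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def build_feature_vector(frame_hist, sensor_readings):
    """Concatenate a flattened grayscale histogram with the sensor readings
    (e.g. CO, PM2.5, temperature, humidity) into a single sample vector."""
    return np.concatenate([np.ravel(frame_hist), np.asarray(sensor_readings, dtype=float)])

def train_dust_svm(X, y):
    """Train the binary classifier; y uses +1 for dust samples, -1 for non-dust."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)
    return clf

def detect_dust(clf, frame_hist, sensor_readings):
    """Return True when the classifier reports fugitive dust for one sample."""
    x = build_feature_vector(frame_hist, sensor_readings).reshape(1, -1)
    return int(clf.predict(x)[0]) == 1
```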
In an optional implementation of the embodiment of the present invention, the method further comprises performing target tracking on the fugitive dust in the video data stream using an optical flow method, preferably the DIS (Dense Inverse Search) optical flow method.
Fig. 2 is a schematic diagram of the optical flow target-tracking algorithm provided in an embodiment of the present invention. Target tracking of the fugitive dust in the video data stream using the optical flow method specifically comprises the following steps (a minimal code sketch is given after the steps):
processing a sequence of consecutive video frames of the video data stream;
for each video frame in the sequence, detecting whether a dust foreground target appears using a target detection method;
if a dust foreground target appears in a frame, finding representative key feature points (in the embodiment of the invention, the dust density feature is preferably used);
for any two subsequent adjacent video frames, searching for the optimal position in the current frame of the key feature points that appeared in the previous frame, thereby obtaining the position coordinates of the foreground target in the current frame;
and iterating in this way to realize tracking of the dust target.
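Below is a minimal sketch of this tracking loop, using OpenCV's pyramidal Lucas-Kanade optical flow (cv2.calcOpticalFlowPyrLK) for the per-point search; the embodiment prefers the DIS optical flow (available as cv2.DISOpticalFlow_create), which produces a dense flow field that could replace the sparse step below, and the detect_dust_foreground() interface is an assumed stub standing in for the target-detection step.

```python
import cv2

def track_dust_target(frames, detect_dust_foreground):
    """Yield the (x, y) position of the dust foreground target frame by frame.

    frames: iterable of BGR video frames.
    detect_dust_foreground(gray) -> binary mask of the dust region, or None.
    """
    prev_gray, prev_pts = None, None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_pts is None:
            mask = detect_dust_foreground(gray)
            if mask is not None:
                # Representative key feature points inside the dust region.
                prev_pts = cv2.goodFeaturesToTrack(
                    gray, maxCorners=100, qualityLevel=0.01, minDistance=7, mask=mask)
                prev_gray = gray
            continue
        # Search for the optimal position of the previous frame's key points.
        next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
        good = next_pts[status.ravel() == 1]
        if len(good) == 0:
            prev_gray, prev_pts = None, None  # target lost: fall back to re-detection
            continue
        cx, cy = good.reshape(-1, 2).mean(axis=0)  # position of the foreground target
        yield (float(cx), float(cy))
        prev_gray, prev_pts = gray, good.reshape(-1, 1, 2)
```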
The embodiments of the invention provide a fugitive dust detection method, device and computer-readable storage medium applied to urban management. The method comprises acquiring a video data stream and a sensor data stream of a fixed scene, and preprocessing the video data stream and the sensor data stream; acquiring a dust image, comparing the dust image with the frames in the video data stream, and identifying dust images; acquiring a plurality of sensor data streams and comparing each of them with a preset dust threshold, wherein a data stream exceeding its dust threshold is a dust data stream; and inputting the dust image and the dust data stream as samples into a pre-trained classifier, detecting whether fugitive dust is present, transmitting the detection result to a server, and displaying it visually on an upper computer. Real-time video streams and sensor data are collected in a fixed scene, and the dust and non-dust states are distinguished through data comparison; a classifier is trained using the identified dust and non-dust data as samples; finally, the collected real-time video stream and sensor data stream are input into the trained classifier to detect whether fugitive dust exists, thereby achieving accurate identification of fugitive dust.
A second aspect of the embodiments of the present invention provides a dust detection system applied to urban management, including:
the data stream acquisition module is used for acquiring a video data stream and a sensor data stream of a fixed scene and preprocessing the video data stream and the sensor data stream;
the dust image acquisition module is used for acquiring a dust image, comparing the dust image with the image in the video data stream and identifying the dust image;
the sensor data flow acquisition module is used for acquiring various sensor data flows, comparing the sensor data flows with a preset raise dust threshold value respectively, and taking the data flow higher than the raise dust threshold value as the raise dust data flow;
and the detection module is used for inputting the raise dust image and the raise dust data stream into a pre-trained classifier as samples, detecting whether raise dust exists or not, transmitting a detection result to the server, and performing visual display on the upper computer.
A third aspect of the embodiments of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the fugitive dust detection method applied to urban management described above.
A fourth aspect of the present invention provides an electronic apparatus comprising:
a processor; and
a memory arranged to store computer-executable instructions that, when executed, cause the processor to perform the fugitive dust detection method applied to urban management described above.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing such a device will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the fugitive dust detection apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website, or provided on a carrier signal, or in any other form.
For example, fig. 3 shows a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device conventionally comprises a processor 31 and a memory 32 arranged to store computer-executable instructions (program code). The memory 32 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. The memory 32 has a storage space 33 storing program code 34 for performing the method steps shown in fig. 1 and in any of the embodiments. For example, the storage space 33 for storing the program code may comprise respective program codes 34 for implementing the various steps in the above method, respectively. The program code can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card or a floppy disk. Such a computer program product is typically a computer readable storage medium such as described in fig. 4. The computer readable storage medium may have memory segments, memory spaces, etc. arranged similarly to the memory 32 in the electronic device of fig. 3. The program code may be compressed, for example, in a suitable form. In general, the memory space stores program code 41 for performing the steps of the method according to the invention, i.e. there may be program code, such as read by the processor 31, which, when run by the electronic device, causes the electronic device to perform the steps of the method described above.
Although the invention has been described in detail above with reference to a general description and specific examples, it will be apparent to one skilled in the art that modifications or improvements may be made thereto based on the invention. Accordingly, such modifications and improvements are intended to be within the scope of the invention as claimed.

Claims (6)

1. A fugitive dust detection method applied to urban management, characterized by comprising the following steps:
acquiring a video data stream and a sensor data stream of a fixed scene, and preprocessing the video data stream and the sensor data stream;
acquiring a dust raising image, comparing the dust raising image with the picture in the video data stream, and identifying the dust raising image;
acquiring various sensor data streams, and comparing the sensor data streams with preset dust raising thresholds respectively, wherein the data streams higher than the dust raising thresholds are dust raising data streams;
inputting the raise dust map and the raise dust data stream as samples into a pre-trained classifier, detecting whether raise dust exists or not, transmitting a detection result to a server, and performing visual display on an upper computer;
the acquiring of the raise dust map, comparing the raise dust map with the picture in the video data stream, and identifying the raise dust map comprises the following steps:
matching the characteristic points of the raise dust image with the characteristic points of the images in the video data stream by adopting a characteristic point matching method, and judging whether the images in the video data stream are the raise dust images or not according to the matching degree;
the method for matching the characteristic points of the raise dust map with the characteristic points of the images in the video data stream by adopting the characteristic point matching method and judging whether the images in the video data stream are the raise dust map or not according to the matching degree specifically comprises the following steps:
acquiring images in a video data stream every 5 frames, and comparing the images with the raise dust image;
dividing the image and the raise dust image into ten equal parts, randomly extracting five parts of the image, copying the image to a preset position in the raise dust image, and then performing feature extraction;
fusing boundaries and matching feature points, and judging whether the image in the video data stream is a raise dust map or not according to the matching degree;
carrying out target tracking on the raised dust in the video data stream by adopting an optical flow method; and extracting the dust density characteristic of the flying dust of the image in the video data stream, and carrying out target tracking on the flying dust through the dust density characteristic of the flying dust.
2. The method for detecting fugitive dust applied to urban management according to claim 1, wherein the step of performing target tracking on fugitive dust in the video data stream by using an optical flow method specifically comprises the steps of:
processing a sequence of consecutive video frames of said video data stream;
aiming at each video frame sequence, detecting whether a raise dust foreground target appears by using a target detection method;
if a raise dust foreground target appears in a certain frame, finding out representative key feature points of the raise dust foreground target;
for any two subsequent adjacent video frames, the optimal position of the key feature point appearing in the previous frame in the current frame is searched, so that the position coordinate of the foreground target in the current frame is obtained;
and (4) iterating in the above way to realize the tracking of the raise dust target.
3. The fugitive dust detection method applied to urban management according to claim 1, wherein the classifier is an SVM binary classifier.
4. A fugitive dust detection system applied to urban management, characterized by comprising:
the data stream acquisition module is used for acquiring a video data stream and a sensor data stream of a fixed scene and preprocessing the video data stream and the sensor data stream;
the dust image acquisition module is used for acquiring a dust image, comparing the dust image with the image in the video data stream and identifying the dust image;
the sensor data flow acquisition module is used for acquiring various sensor data flows, comparing the sensor data flows with a preset raise dust threshold value respectively, and taking the data flow higher than the raise dust threshold value as the raise dust data flow;
the detection module is used for inputting the dust map and the dust data stream into a pre-trained classifier by taking the dust map and the dust data stream as samples, detecting whether dust exists or not, transmitting a detection result to the server, and performing visual display on the upper computer;
the dust map acquisition module is used for matching the characteristic points of the dust map with the characteristic points of the images in the video data stream by adopting a characteristic point matching method and judging whether the images in the video data stream are the dust maps or not according to the matching degree;
the method is used for matching the characteristic points of the raise dust map with the characteristic points of the images in the video data stream by adopting a characteristic point matching method, and judging whether the images in the video data stream are the raise dust map according to the matching degree, and specifically comprises the following steps:
acquiring images in a video data stream every 5 frames, and comparing the images with the raise dust image;
dividing the image and the raise dust image into ten equal parts, randomly extracting five parts of the image, copying the image to a preset position in the raise dust image, and then performing feature extraction;
fusing boundaries and matching feature points, and judging whether the image in the video data stream is a raise dust map or not according to the matching degree;
the system is also used for carrying out target tracking on the raised dust in the video data stream by adopting an optical flow method; and extracting the dust density characteristic of the flying dust of the image in the video data stream, and carrying out target tracking on the flying dust through the dust density characteristic of the flying dust.
5. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform the method of any one of claims 1-3.
6. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method of any one of claims 1-3.
CN202110173982.4A 2021-02-09 2021-02-09 Raise dust detection method and system applied to urban management Active CN112528968B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110173982.4A CN112528968B (en) 2021-02-09 2021-02-09 Raise dust detection method and system applied to urban management

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110173982.4A CN112528968B (en) 2021-02-09 2021-02-09 Raise dust detection method and system applied to urban management

Publications (2)

Publication Number Publication Date
CN112528968A CN112528968A (en) 2021-03-19
CN112528968B true CN112528968B (en) 2021-06-08

Family

ID=74975613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110173982.4A Active CN112528968B (en) 2021-02-09 2021-02-09 Raise dust detection method and system applied to urban management

Country Status (1)

Country Link
CN (1) CN112528968B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108760594A (en) * 2018-06-12 2018-11-06 中国环境科学研究院 Airborne dust based on unmanned plane monitors system
CN108845536A (en) * 2018-04-20 2018-11-20 燕山大学 A kind of stockyard fugitive dust real-time detection and intelligent water sprinkling device for reducing dust and method based on video technique
CN110458047A (en) * 2019-07-23 2019-11-15 北京理工大学 A kind of country scene recognition method and system based on deep learning
CN110926532A (en) * 2019-11-29 2020-03-27 四川省生态环境科学研究院 Digital monitoring system of city raise dust based on big data
CN111860531A (en) * 2020-07-28 2020-10-30 西安建筑科技大学 Raise dust pollution identification method based on image processing
CN112052744A (en) * 2020-08-12 2020-12-08 成都佳华物链云科技有限公司 Environment detection model training method, environment detection method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2002242204B2 (en) * 2001-02-23 2006-11-09 Ge Betz, Inc. Automated dust control method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845536A (en) * 2018-04-20 2018-11-20 燕山大学 A kind of stockyard fugitive dust real-time detection and intelligent water sprinkling device for reducing dust and method based on video technique
CN108760594A (en) * 2018-06-12 2018-11-06 中国环境科学研究院 Airborne dust based on unmanned plane monitors system
CN110458047A (en) * 2019-07-23 2019-11-15 北京理工大学 A kind of country scene recognition method and system based on deep learning
CN110926532A (en) * 2019-11-29 2020-03-27 四川省生态环境科学研究院 Digital monitoring system of city raise dust based on big data
CN111860531A (en) * 2020-07-28 2020-10-30 西安建筑科技大学 Raise dust pollution identification method based on image processing
CN112052744A (en) * 2020-08-12 2020-12-08 成都佳华物链云科技有限公司 Environment detection model training method, environment detection method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of an Automatic Monitoring System for Construction Fugitive Dust Pollution Sources Based on UAV; Ma Guoxin et al.; 《中国环境监测》 (China Environmental Monitoring); 2018-02-28; Vol. 34, No. 1; pp. 151-156 *

Also Published As

Publication number Publication date
CN112528968A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN104424466B (en) Method for checking object, body detection device and image pick up equipment
Christlein et al. An evaluation of popular copy-move forgery detection approaches
CN112070135B (en) Power equipment image detection method and device, power equipment and storage medium
CN112200081A (en) Abnormal behavior identification method and device, electronic equipment and storage medium
CN111723786A (en) Method and device for detecting wearing of safety helmet based on single model prediction
CN106133756A (en) For filtering, split and identify the system without the object in constraint environment
CN111611970A (en) Urban management monitoring video-based disposable garbage behavior detection method
CN109857878B (en) Article labeling method and device, electronic equipment and storage medium
CN114462469B (en) Training method of target detection model, target detection method and related device
CN113361643A (en) Deep learning-based universal mark identification method, system, equipment and storage medium
CN116704490B (en) License plate recognition method, license plate recognition device and computer equipment
CN116824135A (en) Atmospheric natural environment test industrial product identification and segmentation method based on machine vision
CN114429577B (en) Flag detection method, system and equipment based on high confidence labeling strategy
Abujayyab et al. Integrating object-based and pixel-based segmentation for building footprint extraction from satellite images
Soetedjo et al. Plant leaf detection and counting in a greenhouse during day and nighttime using a Raspberry Pi NoIR camera
CN112907138B (en) Power grid scene early warning classification method and system from local to whole perception
CN105469099B (en) Pavement crack detection and identification method based on sparse representation classification
CN114140663A (en) Multi-scale attention and learning network-based pest identification method and system
CN106529455A (en) Fast human posture recognition method based on SoC FPGA
CN112528968B (en) Raise dust detection method and system applied to urban management
Hara et al. An initial study of automatic curb ramp detection with crowdsourced verification using google street view images
CN115131826B (en) Article detection and identification method, and network model training method and device
CN114821978B (en) Method, device and medium for eliminating false alarm
CN115601684A (en) Emergency early warning method and device, electronic equipment and storage medium
CN114494355A (en) Trajectory analysis method and device based on artificial intelligence, terminal equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CB03 Change of inventor or designer information

Inventor after: Wang Qiang

Inventor after: Sun Xinglei

Inventor after: Wang Jingyu

Inventor after: Mei Yiduo

Inventor after: Gu Yuming

Inventor before: Wang Qiang

Inventor before: Wang Jingyu

Inventor before: Mei Yiduo

Inventor before: Gu Yuming

CB03 Change of inventor or designer information