CN112649900A - Visibility monitoring method, device, equipment, system and medium - Google Patents


Info

Publication number
CN112649900A
CN112649900A
Authority
CN
China
Prior art keywords
visibility
data
meteorological
time period
features
Prior art date
Legal status
Pending
Application number
CN202011364730.1A
Other languages
Chinese (zh)
Inventor
陈舜东
Current Assignee
Shanghai Eye Control Technology Co Ltd
Original Assignee
Shanghai Eye Control Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Eye Control Technology Co Ltd filed Critical Shanghai Eye Control Technology Co Ltd
Priority to CN202011364730.1A
Publication of CN112649900A

Classifications

    • G01W 1/10 — Meteorology; devices for predicting weather conditions
    • G01W 1/02 — Instruments for indicating weather conditions by measuring two or more variables, e.g. humidity, pressure, temperature, cloud cover or wind speed
    • G06F 18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06N 20/20 — Machine learning; ensemble learning
    • Y02A 90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a visibility monitoring method, device, equipment, system, and medium. The method comprises the following steps: acquiring meteorological data and image data of a monitoring area at the current moment; respectively extracting features from the meteorological data and the image data to obtain meteorological features and image features; inputting the meteorological features into a first visibility recognition model to obtain a first visibility; if the first visibility is smaller than a set threshold, acquiring the time period of the current moment; if the time period is a first preset time period, inputting the image features into a second visibility recognition model to obtain the target visibility; and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility. With this method, visibility can be monitored in real time around the clock, and for conditions where visibility is below 4000 meters, different visibility recognition models are used for daytime and nighttime recognition, effectively improving monitoring accuracy.

Description

Visibility monitoring method, device, equipment, system and medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to a visibility monitoring method, a visibility monitoring device, visibility monitoring equipment, a visibility monitoring system and a visibility monitoring medium.
Background
Visibility is a meteorological element that has a significant impact on aviation, navigation, land transportation, and military activities. In the field of transportation, road traffic is affected when visibility falls below 500 meters, and highways are closed when it falls below 100 meters. In civil aviation, aircraft stop taking off and landing when visibility is below 800 meters. In air force meteorology, weather with visibility below 4 km is called complex weather. In visibility grading, visibility below 4000 meters is already classified as poor visibility, so low-visibility weather is a concern for many departments. Low-visibility conditions, however, occur infrequently: analysis of two years of visibility observation data at Shanghai Pudong Airport shows visibility below 4000 meters only about 10 percent of the time.
In the related art, visibility detection is generally performed based on meteorological data or image data alone. However, because low-visibility samples are so scarce (generally only about 10 percent of observations), a machine learning model has difficulty learning low-visibility characteristics, so the detection accuracy for low visibility in the related art is poor, even though accurate detection of low visibility is of great significance to a number of departments.
Disclosure of Invention
The embodiment of the invention provides a visibility monitoring method, a device, equipment, a system and a medium, which can monitor visibility in real time all day long and effectively improve monitoring accuracy.
In a first aspect, an embodiment of the present invention provides a visibility monitoring method, including:
acquiring meteorological data and image data of a monitoring area at the current moment;
respectively extracting features of the meteorological data and the image data to obtain meteorological features and image features;
inputting the meteorological features into a first visibility recognition model to obtain first visibility;
if the first visibility is smaller than a set threshold value, acquiring a time period of the current moment;
if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility;
and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility.
In a second aspect, an embodiment of the present invention further provides a visibility monitoring device, including:
the data acquisition module is used for acquiring meteorological data and image data of the monitoring area at the current moment;
the characteristic acquisition module is used for respectively extracting the characteristics of the meteorological data and the image data to obtain meteorological characteristics and image characteristics;
the first visibility obtaining module is used for inputting the meteorological features into a first visibility recognition model to obtain first visibility;
the time period acquisition module is used for acquiring the time period of the current moment if the first visibility is smaller than a set threshold;
the first target visibility obtaining module is used for inputting the image characteristics into a second visibility recognition model to obtain target visibility if the time period is a first preset time period;
and the second target visibility obtaining module is used for inputting the meteorological features into a third visibility recognition model to obtain the target visibility if the time period is a second preset time period.
In a third aspect, an embodiment of the present invention further provides a visibility monitoring system, including:
a visibility monitoring device, at least one camera, and various sensors, wherein the visibility monitoring device is respectively connected with the at least one camera and the various sensors;
the at least one camera is arranged in different directions and is used for acquiring image data in different directions;
the various sensors are used for collecting meteorological data; the meteorological data includes: temperature data, humidity data, rainfall data, barometric pressure data, PM data, light sensitive data, wind speed data and wind direction data;
and the visibility monitoring device is used for determining visibility according to the image data and the meteorological data.
In a fourth aspect, an embodiment of the present invention further provides a computer device, including:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the visibility monitoring method according to any embodiment of the present invention.
In a fifth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the visibility monitoring method according to any embodiment of the present invention.
The embodiment of the invention provides a visibility monitoring method, a visibility monitoring device, visibility monitoring equipment, a visibility monitoring system and a visibility monitoring medium, wherein meteorological data and image data of a monitoring area at the current moment are firstly acquired; then, respectively extracting features of the meteorological data and the image data to obtain meteorological features and image features; finally, inputting the meteorological features into a first visibility recognition model to obtain first visibility; if the first visibility is smaller than a set threshold value, acquiring a time period of the current moment; if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility; and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility. By the aid of the scheme, visibility can be monitored in all weather in real time, different visibility recognition models can be used for visibility recognition in consideration of different daytime visibility and night visibility when the visibility is lower than a certain value, and monitoring accuracy is effectively improved.
Drawings
Fig. 1 is a schematic flow chart of a visibility monitoring method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart illustrating a visibility monitoring method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a visibility monitoring apparatus according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a visibility monitoring system according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like. In addition, the embodiments and features of the embodiments in the present invention may be combined with each other without conflict.
The term "include" and variations thereof as used herein are intended to be open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment".
Example one
Visibility is the maximum horizontal distance at which an observer with normal vision can identify a target object against the background of the sky under the weather conditions at the time. Visibility can be divided into daytime visibility and night visibility. Night visibility refers to the maximum distance at which the luminous point of a target lamp can be clearly identified, so its definition differs from that of daytime visibility. The factors influencing visibility mainly include atmospheric transparency and light intensity, and are closely related to the weather conditions at the time.
At present, visibility monitoring methods mainly comprise manual observation, instrument measurement and traditional machine learning detection methods.
The manual observation method is mainly based on visual observation, using the human eye to determine the farthest distance at which a target object can be distinguished from the background. Manual observation is highly subjective and error-prone; the visual method involves cumbersome procedures and a heavy workload; it cannot achieve all-weather observation; and its labor cost is high.
The instrument measurement method mainly uses a transmission type or scattering type instrument, and can realize all-weather real-time detection of visibility. However, the measuring instrument is high in price, the equipment layout is very sparse, and the visibility measurement is easily influenced by a local microenvironment of the instrument. Furthermore, since the definition of night visibility is different from day visibility, the measured visibility of the instrument often needs to be further modified for application.
Existing visibility monitoring technology based on machine learning can, to a certain extent, solve the high-cost problem of manual observation and instrument measurement, but it is generally based on meteorological data or image data alone and does not consider the difference between daytime and nighttime visibility. More importantly, because low-visibility conditions account for only about 10 percent of observations, machine learning models have difficulty learning low-visibility characteristics, so monitoring accuracy for low visibility is poor, even though low-visibility conditions are precisely what the various departments care about.
In view of the above problems, an embodiment of the present invention provides a visibility monitoring method that can realize all-weather real-time monitoring with low-cost meteorological sensors and a camera. The method first uses the first visibility recognition model to judge whether visibility is below a set threshold, and then, for the sub-4000-meter conditions of concern to meteorological departments, gives a specific visibility value with a regression fitting model. When giving the visibility value, the difference between the first preset time period (daytime) and the second preset time period (nighttime) is fully considered by using the image-based visibility recognition model in the daytime and the meteorological-data-based visibility recognition model at night, so that visibility monitoring is more accurate.
Fig. 1 is a schematic flowchart of a visibility monitoring method according to an embodiment of the present invention, where the method is applicable to monitoring visibility of an environment, and the method may be implemented by a visibility monitoring apparatus, where the apparatus may be implemented by software and/or hardware, and is generally integrated on a computer device.
As shown in fig. 1, a visibility monitoring method according to a first embodiment of the present invention includes the following steps:
and S110, acquiring meteorological data and image data of the monitoring area at the current moment.
The monitoring area may be any area where visibility needs to be monitored, and may include, for example, airports, highways, and the like.
In this embodiment, the meteorological data may include temperature data, humidity data, rainfall data, barometric pressure data, PM2.5 data, light sensitive data, wind speed and direction, and the like, within the monitored area. Meteorological data may be acquired by various sensors, for example, temperature sensors may measure temperature data within a monitored area.
In this embodiment, the image data may be obtained by processing the image in the monitoring area obtained by shooting. For example, the image may be an airport image captured on an airport tower, and the image data may be obtained by uploading the image and performing data processing on the image.
And S120, respectively extracting the features of the meteorological data and the image data to obtain meteorological features and image features.
In this embodiment, meteorological features may be extracted from the meteorological data, and the meteorological features may include temperature, humidity, rainfall, barometric pressure, PM2.5, illumination intensity, wind speed, wind direction, and the like. The meteorological feature extraction process is the prior art, and is not described herein.
The image feature may be a feature that characterizes a property of the image, and in this embodiment, the image feature may be a feature that characterizes a transmittance of the image in the monitored area.
Specifically, extracting the features of the image data to obtain the image features may include: preprocessing the image data and extracting features based on a dark channel prior algorithm to obtain the image features.
The dark channel prior algorithm can be used to extract image features from the image data. The dark channel can be understood as the minimum value over the three color channels of each pixel within a local window. The dark channel prior theory holds that in an outdoor fog-free image, in most non-sky local areas, at least one color channel of some pixels always has a value close to 0. For any image data J, its dark channel can be represented by:
J^dark(x) = min_{y ∈ Ω(x)} ( min_{c ∈ {r,g,b}} J^c(y) )
wherein J^c represents a color channel of the image data J, J^dark represents the dark channel image, and Ω(x) represents a window area centered on pixel x.
On the basis, the dark channel calculation process of the image data comprises two steps: firstly, solving the minimum value of each pixel RGB component of image data, and storing the minimum value into a gray-scale image with the same size as the original image; the grey map is then minimum filtered.
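The two-step calculation above can be sketched in plain Python (a minimal illustration on a nested-list image; the function name, window size, and value range are our own choices, not from the patent):

```python
def dark_channel(img, patch=3):
    """Dark channel of an RGB image given as nested lists,
    img[y][x] = (r, g, b) with values in [0, 1].
    Step 1: per-pixel minimum over the three color channels, stored in
    a gray map of the same size as the original image.
    Step 2: minimum filter of the gray map over a patch x patch window."""
    h, w = len(img), len(img[0])
    gray = [[min(px) for px in row] for row in img]        # step 1
    r = patch // 2
    dark = [[0.0] * w for _ in range(h)]
    for y in range(h):                                     # step 2
        for x in range(w):
            dark[y][x] = min(
                gray[j][i]
                for j in range(max(0, y - r), min(h, y + r + 1))
                for i in range(max(0, x - r), min(w, x + r + 1)))
    return dark
```

In a hazy region all three channels are lifted by airlight, so the dark channel value rises, which is what the transmittance estimate below exploits.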
The fog map formation model of the image data is expressed by the following equation:
I(x)=J(x)t(x)+A(1-t(x))
wherein I(x) represents a pixel of the foggy image in the image data, J(x) represents a pixel of the fog-free image in the image data, t(x) represents the transmittance, and A represents the global atmospheric light value.
Assume the transmittance t(x) is constant within a window Ω(x) centered on x and denote it t̃(x). Dividing both sides of the fog map equation by A^c and taking the dark channel (the double minimum over the window and the channels) of both sides yields:
min_{y ∈ Ω(x)} min_c ( I^c(y) / A^c ) = t̃(x) · min_{y ∈ Ω(x)} min_c ( J^c(y) / A^c ) + (1 − t̃(x))
According to the dark channel prior theory, the dark channel of the fog-free image tends to 0, so the dark channel prior of the fog-free image gives:
min_{y ∈ Ω(x)} min_c ( J^c(y) / A^c ) → 0
Combining the above two equations, the transmittance estimation equation can be obtained as:
t̃(x) = 1 − ω · min_{y ∈ Ω(x)} min_c ( I^c(y) / A^c )
where ω (0 < ω ≦ 1) represents a correction parameter that may characterize the effect of particles present in the air in the image data, and ω is typically 0.95.
For the estimation of the global atmospheric light value a, the pixel intensities of the dark channel images in the image data may be arranged in order from high to low, and the first 0.1% brightest pixel value may be taken as the global atmospheric light value a of the current image data.
In this embodiment, any picture may include a sky region and a non-sky region, and the non-sky region is divided into a plurality of regions, for example, a plurality of strip-shaped regions. Any strip region may include a plurality of window regions of any size, for example, 10 × 10 window regions. A transmittance value is calculated for each window region according to the dark channel formula above, and the average transmittance in each strip region of the non-sky region is then computed. These averages may be represented as [t1, t2, t3, …, tn], where t1 represents the average transmittance in the first strip region and tn that in the n-th strip region.
In the present embodiment, [ t1, t2, t3, …, tn ] obtained by the above calculation may be used as an image feature of one image.
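The whole image-feature computation, dark channel, atmospheric light A, per-window transmittance, and per-strip averaging, can be sketched end to end. This is a toy stdlib-only illustration: all names, the defaults, and the simplification of treating the entire frame as the non-sky region are ours, not the patent's (it also averages dark-channel intensities for A rather than looking up the corresponding image pixels):

```python
def window_min(gray, y, x, r):
    """Minimum of a 2-D map over a (2r+1) x (2r+1) window centered at (y, x)."""
    h, w = len(gray), len(gray[0])
    return min(gray[j][i]
               for j in range(max(0, y - r), min(h, y + r + 1))
               for i in range(max(0, x - r), min(w, x + r + 1)))

def strip_transmittances(img, n_strips=2, omega=0.95, r=1):
    """img[y][x] = (r, g, b) in [0, 1]; returns the feature vector [t1..tn]."""
    h, w = len(img), len(img[0])
    gray = [[min(px) for px in row] for row in img]   # per-pixel channel min
    dark = [[window_min(gray, y, x, r) for x in range(w)] for y in range(h)]
    # Atmospheric light A: mean of the brightest 0.1% (at least one)
    # of the dark-channel values -- a simplification of the text's rule.
    flat = sorted((v for row in dark for v in row), reverse=True)
    k = max(1, len(flat) // 1000)
    A = sum(flat[:k]) / k or 1e-6
    norm = [[v / A for v in row] for row in gray]
    # t(x) = 1 - omega * min over the window of the normalized dark channel.
    t = [[1.0 - omega * window_min(norm, y, x, r) for x in range(w)]
         for y in range(h)]
    # Average transmittance per horizontal strip (here the whole frame
    # stands in for the non-sky region, split into n_strips bands).
    feats = []
    band = max(1, h // n_strips)
    for s in range(n_strips):
        vals = [v for row in t[s * band:(s + 1) * band] for v in row]
        feats.append(sum(vals) / len(vals))
    return feats
```

For a uniform gray frame every window has the same dark channel, so all strip averages coincide; on a real frame the strips near the horizon typically show lower transmittance in haze.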
S130, inputting the meteorological features into a first visibility recognition model to obtain first visibility.
The first visibility recognition model may be a trained binary classification model, which outputs the corresponding real-visibility category according to the input meteorological features. In this embodiment, 4000 meters is used as the boundary: if the visibility is greater than 4000 meters, the output category is 0; if the visibility is less than or equal to 4000 meters, the output category is 1.
Specifically, the training process of the first visibility recognition model may be: collecting meteorological data and real visibility in a set time period; performing feature extraction on the meteorological data to obtain meteorological features; determining a category label according to the real visibility; constructing a first data set from the meteorological features and the category tags; and training a set neural network based on the first data set to obtain the first visibility recognition model.
The real visibility can be the real visibility of a monitoring area, and the real visibility can be collected from an artificial observation station or a visibility observation instrument.
The meteorological features are derived features, and the derived features can be new features obtained by feature learning of original meteorological features. For example, if the raw meteorological features are temperatures, the derived features may be average temperatures over a period of time. The number of derived features may be one or more, and the specific numerical values are not limited herein.
The category label may be a label representing a category to which real visibility belongs, and illustratively, the real visibility is divided into two categories by taking 4000 meters as a threshold, specifically, a category 1 is set to an actual visibility of less than or equal to 4000 meters, and a category 0 is set to an actual visibility of more than 4000 meters.
Wherein the first data set may be a data set consisting of weather features and category tags, for example, the data set may be represented as [ a1, a2, a3, …, an, L ], n represents the number of weather features, L represents the real visibility, a1 to an represent different weather features, for example, a1 may represent the average temperature, and a2 represents the average humidity for a certain period of time.
Specifically, the first data set is input into the binary classification model for training, so that the first visibility recognition model can be obtained.
The first data set may be divided into training data and test data, and for example, 80% of data in the data set may be divided into training data and 20% of data may be divided into test data.
It should be noted that the binary classification model may be a Logistic Regression algorithm, or may be other classification algorithms such as a support vector machine SVM.
It should be further noted that the real visibility of the collected meteorological data in the set time period should be distributed as uniformly as possible in the two categories [0,1], and the meteorological data can be sampled at each iteration in the training process, so that the data amount of the two categories [0,1] is balanced.
As an example, the training process of the first visibility recognition model may be: collecting meteorological data and real visibility over a period of time, keeping the data amount of each visibility grade below 4000 meters as balanced as possible; generating derived features from the meteorological features under low visibility; dividing the real visibility into two categories with 4000 meters as the threshold, setting category 1 when visibility is less than or equal to 4000 meters and category 0 when it is greater than 4000 meters; constructing the first data set with the derived features obtained from the meteorological data as input data and the category label [0,1] as the label value; and performing model training on the binary classification model based on the first data set to obtain the first visibility recognition model.
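The per-iteration balanced sampling of the [0,1] classes can be illustrated with a tiny hand-rolled logistic-regression trainer. This is a sketch with made-up hyperparameters and function names; an actual implementation would more likely use a library such as scikit-learn:

```python
import math
import random

def train_balanced_logreg(X, y, iters=300, lr=0.5, batch_per_class=8, seed=0):
    """Logistic regression by SGD that rebalances the two classes at every
    iteration by drawing the same number of samples from each, as the text
    suggests for the scarce low-visibility class."""
    rng = random.Random(seed)
    pos = [i for i, t in enumerate(y) if t == 1]
    neg = [i for i, t in enumerate(y) if t == 0]
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(iters):
        # Balanced mini-batch: equal draws from each class, with replacement.
        batch = rng.choices(pos, k=batch_per_class) + rng.choices(neg, k=batch_per_class)
        for i in batch:
            z = sum(wj * xj for wj, xj in zip(w, X[i])) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y[i]                        # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, X[i])]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Category 1 (visibility <= 4000 m) iff the linear score is non-negative."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else 0
```

Without the rebalancing step, a 90/10 class split would let the model minimize loss by nearly always predicting the majority class.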
And S140, if the first visibility is smaller than a set threshold, acquiring a time period of the current moment.
In this embodiment, the set threshold may be a preset real visibility value, for example, the set threshold may be 4000 meters, and the real visibility is a low visibility below 4000 meters.
Specifically, the obtaining of the time period of the current time may include: and determining a time period corresponding to the current moment according to the photosensitive data, wherein the time period comprises a day period and a night period. When the photosensitive data is smaller than a set value, determining that the time period corresponding to the current moment is a day stage; and when the photosensitive data is greater than or equal to a set numerical value, determining the time period corresponding to the current moment as the night stage.
Wherein the meteorological data may include light sensitive data, which may be measured by a light sensitive resistance sensor. The working principle of the photoresistor sensor is that the stronger the light intensity, the lower its measured value. Alternatively, the measurement range of the photo-sensor can be (0, 1023), and the photo-sensor measurement value is very low and close to 0 when the sun is directly illuminated in the daytime and close to 1023 when the light is completely dark at night.
The set value may be a preset value of the light-sensitive data: when the light-sensitive data is less than the set value, the current time period is determined to be the daytime phase, and when it is greater than or equal to the set value, the nighttime phase. For example, if the set value is 200, a reading below 200 indicates the daytime phase and a reading of 200 or above indicates the nighttime phase.
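The thresholding rule above is a one-liner (the function name and the 200 default are illustrative; 200 is only the example value from the text):

```python
def time_period(light_reading, set_value=200):
    """Classify the current moment from a photoresistor reading in (0, 1023):
    stronger light gives a lower reading, so below the set value is daytime."""
    return "day" if light_reading < set_value else "night"
```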
S150, if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility.
In the present embodiment, the first preset time period may be understood as a daytime period. The second visibility recognition model can be a regression model obtained through training, and the second visibility recognition model can output a corresponding real visibility value according to the input image characteristics.
Specifically, the training process of the second visibility recognition model may be: collecting image data and a real visibility label when the actual visibility is smaller than a set threshold value in the daytime; performing feature extraction on the image data to obtain image features; constructing a second data set according to the image characteristics and the visibility labels; and training a set neural network based on the second data set to obtain the second visibility recognition model.
In an exemplary case, if the real visibility of one piece of image data is 1000 meters, the 1000 meters is used as the real visibility label of the piece of image data.
The image data is subjected to feature extraction, and the obtained image features may be the same as the image feature extraction process in step S120, which is not described herein again.
Where the second data set may be obtained by combining image features and real visibility tags, for example, one image data may correspond to one piece of training data [ t1, t2, t3, …, tn, L ], and L represents a real visibility tag.
The second data set may be divided into training data and test data, for example, 80% of the data in the data set may be divided into training data, and 20% of the data in the data set may be divided into test data.
Specifically, training data and test data are input into a polynomial regression model for model training, parameters are optimized by a least square method during training, and the regression model with the highest visibility recognition accuracy on the test data is selected to be determined as the second visibility recognition model.
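The polynomial-regression-with-least-squares step can be sketched without external libraries by solving the normal equations directly. This is a stdlib-only stand-in (one scalar feature, e.g. an average transmittance; names are ours); in practice a library routine such as numpy.polyfit would be used:

```python
def polyfit_ls(xs, ys, degree=2):
    """Least-squares polynomial fit via the normal equations, solved by
    Gaussian elimination with partial pivoting.  Returns coefficients
    [c0, c1, ..., cd] for the model c0 + c1*x + ... + cd*x**d."""
    n = degree + 1
    # Normal equations: (X^T X) c = X^T y for the Vandermonde design matrix.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    bvec = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        bvec[col], bvec[piv] = bvec[piv], bvec[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            bvec[r] -= f * bvec[col]
    coef = [0.0] * n                          # back substitution
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * coef[c] for c in range(r + 1, n))
        coef[r] = (bvec[r] - s) / A[r][r]
    return coef
```

Fitting exact quadratic data recovers the generating coefficients, which is the least-squares optimum in the noise-free case.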
And S160, if the time period is a second preset time period, inputting the meteorological features into a third visibility identification model to obtain the target visibility.
In the present embodiment, the second preset time period may be understood as a night period. The third visibility recognition model can be a regression model obtained through training, and the third visibility recognition model can output a corresponding real visibility value according to the input meteorological features.
Specifically, the training process of the third visibility recognition model is as follows: collecting meteorological data and a real visibility label when the actual visibility is smaller than a set threshold value in the night period; performing feature extraction on the meteorological data to obtain meteorological features; constructing a third data set according to the meteorological features and the real visibility labels; and training a set neural network based on the third data set to obtain the third visibility recognition model.
Illustratively, meteorological data with visibility at night lower than a set threshold value are screened from the meteorological data, and meteorological features, namely derived features and real visibility labels, are extracted from the screened meteorological data; dividing the third data set, and taking 80% of data as training data and 20% of data as test data; respectively using training data and test data to carry out model training on the regression model, and optimizing parameters by adopting a least square method during training; and finally, selecting a regression model with the best visibility recognition effect on the test data to determine the regression model as a third visibility recognition model.
It should be noted that the regression model is not limited to polynomial regression, and may also be other regression algorithms including linear regression, random forest regression, and the like, and the regression model is not specifically limited herein and may be selected according to actual situations.
The visibility monitoring method provided by the embodiment of the invention first acquires meteorological data and image data of a monitoring area at the current moment; then respectively performs feature extraction on the meteorological data and the image data to obtain meteorological features and image features; then inputs the meteorological features into a first visibility recognition model to obtain first visibility; if the first visibility is smaller than a set threshold value, acquires the time period of the current moment; if the time period is a first preset time period, inputs the image features into a second visibility recognition model to obtain the target visibility; and if the time period is a second preset time period, inputs the meteorological features into a third visibility recognition model to obtain the target visibility. By this scheme, visibility can be monitored in all weather in real time; considering that daytime visibility and night visibility differ when the visibility is lower than 4000 meters, different visibility recognition models are used for visibility recognition, which effectively improves the monitoring accuracy.
An implementation process of the visibility monitoring method according to the first embodiment is exemplarily explained below, and fig. 2 is an exemplary flowchart of the visibility monitoring method according to the first embodiment of the present invention.
As shown in fig. 2, meteorological data and image data of the monitored area are collected once every minute. Derived features are extracted from the meteorological data and input into a binary classification model, namely the first visibility recognition model, and whether the visibility is less than 4000 meters is determined from the model output. If the visibility is greater than 4000 meters, the process ends. If the visibility is less than 4000 meters, whether the monitored area is in the daytime is further judged according to the photosensitive data in the meteorological data. If it is daytime, image features are extracted from the image data and input into the daytime image-based visibility fitting model, namely the second visibility recognition model, to obtain the visibility value, namely the target visibility. If it is not daytime, the monitored area is at night; the derived features extracted from the collected meteorological data are input into the night-time visibility fitting model based on meteorological data, namely the third visibility recognition model, to obtain the visibility value, namely the target visibility.
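The decision flow of fig. 2 can be sketched as the following function. The three trained models are replaced by hypothetical stand-in callables, and the photosensitive cut-off value is an assumed placeholder; only the 4000 m threshold and the day/night branching follow the description above.

```python
# Assumed placeholder: the patent sets no concrete photoresistor cut-off.
DAYTIME_LIGHT_THRESHOLD = 500.0

def monitor_visibility(weather_features, image_features, light_reading,
                       classifier, day_model, night_model):
    """Return a visibility estimate in metres, or None when visibility >= 4000 m."""
    # Step 1: binary classifier decides whether visibility is below 4000 m.
    if not classifier(weather_features):
        return None                     # visibility >= 4000 m, process ends
    # Step 2: photosensitive data decides day vs night (low reading = daytime).
    if light_reading < DAYTIME_LIGHT_THRESHOLD:
        # Daytime: image-based visibility fitting model.
        return day_model(image_features)
    # Night: meteorological-data-based visibility fitting model.
    return night_model(weather_features)

# Usage with trivial stand-in models:
estimate = monitor_visibility(
    weather_features=[0.93, 1012.0], image_features=[0.41],
    light_reading=120.0,
    classifier=lambda f: True,
    day_model=lambda f: 1800.0,
    night_model=lambda f: 2200.0,
)
```

Note the comparison direction: as in the text, photosensitive data *below* the set value indicates daytime, consistent with a photoresistor whose reading drops under strong light.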
Example two
Fig. 3 is a schematic structural diagram of a visibility monitoring apparatus according to a second embodiment of the present invention, which can be adapted to monitor visibility of an environment, wherein the apparatus can be implemented by software and/or hardware and is generally integrated on a computer device.
As shown in fig. 3, the apparatus includes:
the data acquisition module 310 is configured to acquire meteorological data and image data of a monitoring area at a current moment;
a feature obtaining module 320, configured to perform feature extraction on the meteorological data and the image data, respectively, to obtain meteorological features and image features;
the first visibility obtaining module 330 is configured to input the meteorological features into a first visibility identification model to obtain first visibility;
a time period obtaining module 340, configured to obtain a time period of the current time if the first visibility is smaller than a set threshold;
a target visibility first obtaining module 350, configured to input the image features into a second visibility recognition model to obtain target visibility if the time period is a first preset time period;
and a second target visibility obtaining module 360, configured to input the meteorological features into a third visibility recognition model to obtain target visibility if the time period is a second preset time period.
In this embodiment, the device first acquires meteorological data and image data of the monitoring area at the current moment through the data acquisition module; then respectively performs feature extraction on the meteorological data and the image data through the feature acquisition module to obtain meteorological features and image features; and then inputs the meteorological features into the first visibility recognition model through the first visibility obtaining module to obtain the first visibility. If the first visibility is smaller than the set threshold, the time period acquisition module acquires the time period of the current moment; if the time period is the first preset time period, the target visibility first obtaining module inputs the image features into the second visibility recognition model to obtain the target visibility; and if the time period is the second preset time period, the target visibility second obtaining module inputs the meteorological features into the third visibility recognition model to obtain the target visibility.
This embodiment provides a visibility monitoring device that can monitor visibility in all weather in real time. For the case where the visibility is lower than a certain value, the difference in visibility between the first preset time period and the second preset time period is taken into account, and different visibility recognition models are used for visibility recognition, effectively improving the monitoring accuracy.
Further, the feature obtaining module 320 is further configured to perform feature extraction on the image data based on a dark channel prior algorithm to obtain the image features.
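The dark-channel-prior feature extraction named above can be sketched as follows. The patent names the algorithm but fixes none of its parameters, so the patch size, the synthetic image, and the mean-intensity summary feature are all illustrative assumptions. The dark channel of an RGB image is the per-pixel minimum over colour channels followed by a local minimum filter; its statistics correlate with haze density.

```python
import numpy as np

def dark_channel(image: np.ndarray, patch: int = 15) -> np.ndarray:
    """image: H x W x 3 array with values in [0, 1]; returns the H x W dark channel."""
    min_rgb = image.min(axis=2)          # per-pixel minimum over R, G, B
    h, w = min_rgb.shape
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode="edge")
    out = np.empty_like(min_rgb)
    # Local minimum filter over a patch x patch neighbourhood.
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

# A simple scalar image feature: mean dark-channel intensity (higher = hazier).
rng = np.random.default_rng(2)
img = rng.uniform(0.0, 1.0, size=(32, 32, 3))
feature = float(dark_channel(img).mean())
```

In practice the minimum filter would be implemented with an erosion operation rather than Python loops; the loop form is kept here for clarity.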
Further, the time period obtaining module 340 is further configured to determine that the time period corresponding to the current moment is the daytime stage when the photosensitive data is smaller than a set value, and to determine that the time period corresponding to the current moment is the night stage when the photosensitive data is greater than or equal to the set value.
Further, the first visibility recognition model training module is used for acquiring meteorological data and real visibility in a set time period; performing feature extraction on the meteorological data to obtain meteorological features; determining a category label according to the real visibility; constructing a first data set from the meteorological features and the category tags; and training a set neural network based on the first data set to obtain the first visibility recognition model.
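The construction of the first data set described above, with category labels derived from the real visibility, can be sketched as below. The feature values are hypothetical; only the idea of thresholding the real visibility (4000 metres per fig. 2) to produce binary labels follows the description.

```python
import numpy as np

# Threshold below which visibility is considered "low" (from fig. 2).
VISIBILITY_THRESHOLD = 4000.0

def build_first_dataset(features: np.ndarray, real_visibility: np.ndarray):
    """features: N x D matrix of derived meteorological features;
    real_visibility: N ground-truth visibility values in metres.
    Returns (features, binary category labels)."""
    labels = (real_visibility < VISIBILITY_THRESHOLD).astype(int)
    return features, labels

# Hypothetical samples: [relative humidity, barometric pressure] per row.
feats = np.array([[0.91, 1011.2], [0.55, 1019.8], [0.97, 1008.3]])
vis = np.array([1200.0, 9500.0, 600.0])
X, y = build_first_dataset(feats, vis)
```

The resulting (X, y) pairs would then be split and used to train the binary classification model that serves as the first visibility recognition model.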
Further, the second visibility recognition model training module is used for acquiring image data and real visibility labels when the actual visibility is smaller than a set threshold value in the daytime; performing feature extraction on the image data to obtain image features; constructing a second data set according to the image characteristics and the real visibility label; and training a set neural network based on the second data set to obtain the second visibility recognition model.
Further, the third visibility recognition model training module is used for acquiring meteorological data and a real visibility label when the actual visibility is smaller than a set threshold value in the night period; performing feature extraction on the meteorological data to obtain meteorological features; constructing a third data set according to the meteorological features and the real visibility labels; and training a set neural network based on the third data set to obtain the third visibility recognition model.
The visibility monitoring device can execute the visibility monitoring method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE III
Fig. 4 is a schematic structural diagram of a visibility monitoring system according to a third embodiment of the present invention, which can be applied to monitoring visibility of an environment, where the system can be implemented by software and/or hardware.
As shown in fig. 4, the system includes: a visibility monitoring device 410, at least one camera 420 and various sensors 430, which can perform the visibility monitoring method described in the above embodiments;
the visibility monitoring device 410 is respectively connected with at least one camera 420 and various sensors 430;
the at least one camera 420 is arranged in different directions and used for acquiring image data in different directions;
the various sensors 430 are used to collect meteorological data;
and the visibility monitoring device 410 is used for determining visibility according to the image data and the meteorological data.
The at least one camera 420 may be arranged horizontally so that its field of view is not blocked and as much information as possible can be obtained from the images it captures. If a plurality of cameras are provided, different cameras may face different directions so as to capture images from as many angles as possible.
The camera 420 may be connected to the visibility monitoring device 410 and transmit the collected images to the visibility monitoring device 410.
The various sensors 430 may include: a temperature sensor, a humidity sensor, a rainfall sensor, an atmospheric pressure sensor, a PM2.5 sensor, a photoresistor sensor and a wind speed and direction sensor. Correspondingly, the meteorological data may include: temperature data, humidity data, rainfall data, barometric pressure data, PM2.5 data, photosensitive data, wind speed data and wind direction data.
The various sensors 430 may be connected to the visibility monitoring device 410 and transmit the measured meteorological data to the visibility monitoring device 410.
For example, temperature data of the monitored area may be measured by a temperature sensor; the humidity sensor can be used for measuring humidity data of the monitoring area; the rainfall sensor can be used for measuring rainfall data of a monitoring area; the barometric pressure sensor may be used to measure barometric pressure data for the monitored area; the PM2.5 sensor may be used to measure PM2.5 data for the monitored area; the photoresistor sensor can be used for measuring photosensitive data of the monitoring area; the wind speed and direction sensor can be used for measuring wind speed data and wind direction data of a monitoring area.
The visibility monitoring device 410 may be connected to at least one camera 420 and various sensors 430, respectively, for acquiring image data and weather data.
Specifically, the visibility monitoring device 410 may be configured to acquire meteorological data and image data of the monitored area at the current moment;
the visibility monitoring device 410 may be configured to perform feature extraction on the meteorological data and the image data, respectively, to obtain meteorological features and image features;
the visibility monitoring device 410 may be configured to input the meteorological features into a first visibility recognition model to obtain a first visibility;
the visibility monitoring device 410 may be further configured to obtain a time period of the current time if the first visibility is smaller than a set threshold;
the visibility monitoring device 410 may be further configured to, if the time period is a first preset time period, input the image features into a second visibility recognition model to obtain target visibility;
the visibility monitoring device 410 may be further configured to, if the time period is a second preset time period, input the meteorological features into a third visibility recognition model to obtain the target visibility.
In this embodiment, the visibility monitoring system includes a visibility monitoring device, at least one camera, and various sensors, and the visibility monitoring device is configured to determine visibility according to the image data and the meteorological data. The visibility monitoring system can monitor the visibility in all weather and in real time, and for the situation that the visibility is lower than 4000 meters, different visibility recognition models are used for recognizing the visibility in the daytime and at night, so that the monitoring accuracy is effectively improved.
Example four
Fig. 5 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention. As shown in fig. 5, a computer device provided in the fourth embodiment of the present invention includes: one or more processors 51 and storage 52; the processor 51 in the computer device may be one or more, and fig. 5 illustrates one processor 51 as an example; storage 52 is used to store one or more programs; the one or more programs are executed by the one or more processors 51, so that the one or more processors 51 implement the visibility monitoring method according to any one of the embodiments of the present invention.
The computer device may further include: an input device 53 and an output device 54.
The processor 51, the storage means 52, the input means 53 and the output means 54 in the computer apparatus may be connected by a bus or other means, which is exemplified in fig. 5.
The storage device 52 in the computer device is used as a computer readable storage medium, and can be used for storing one or more programs, which may be software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the visibility monitoring method provided in the embodiment of the present invention (for example, the modules in the visibility monitoring apparatus shown in fig. 3 may include a first visibility obtaining module 330, a target visibility first obtaining module 350, a target visibility second obtaining module 360, and the like). The processor 51 executes various functional applications and data processing of the computer device by running software programs, instructions and modules stored in the storage device 52, so as to implement the visibility monitoring method in the above method embodiment.
The storage device 52 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the computer device, and the like. Further, the storage 52 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the storage 52 may further include memory located remotely from the processor 51, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 53 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function controls of the computer apparatus. The output device 54 may include a display device such as a display screen.
And, when one or more programs included in the above-mentioned computer apparatus are executed by the one or more processors 51, the programs perform the following operations:
acquiring meteorological data and image data of a monitoring area at the current moment;
respectively extracting features of the meteorological data and the image data to obtain meteorological features and image features;
inputting the meteorological features into a first visibility recognition model to obtain first visibility;
if the first visibility is smaller than a set threshold value, acquiring a time period of the current moment;
if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility;
and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility.
EXAMPLE five
An embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, performs a visibility monitoring method, the method including:
acquiring meteorological data and image data of a monitoring area at the current moment;
respectively extracting features of the meteorological data and the image data to obtain meteorological features and image features;
inputting the meteorological features into a first visibility recognition model to obtain first visibility;
if the first visibility is smaller than a set threshold value, acquiring a time period of the current moment;
if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility;
and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility.
Optionally, the program, when executed by the processor, may be further configured to perform the visibility monitoring method according to any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take a variety of forms, including, but not limited to: an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method of monitoring visibility, comprising:
acquiring meteorological data and image data of a monitoring area at the current moment;
respectively extracting features of the meteorological data and the image data to obtain meteorological features and image features;
inputting the meteorological features into a first visibility recognition model to obtain first visibility;
if the first visibility is smaller than a set threshold value, acquiring a time period of the current moment;
if the time period is a first preset time period, inputting the image characteristics into a second visibility recognition model to obtain the target visibility;
and if the time period is a second preset time period, inputting the meteorological features into a third visibility recognition model to obtain the target visibility.
2. The method of claim 1, wherein performing feature extraction on the image data to obtain image features comprises:
and carrying out feature extraction on the image data based on a dark channel pre-inspection algorithm to obtain image features.
3. The method of claim 1, wherein the meteorological data comprises light-sensitive data, and the obtaining the time period for the current time comprises:
when the photosensitive data is smaller than a set value, determining that the time period corresponding to the current moment is the daytime stage; and when the photosensitive data is greater than or equal to the set value, determining that the time period corresponding to the current moment is the night stage.
4. The method of claim 3, wherein the training process of the first visibility recognition model is:
collecting meteorological data and real visibility in a set time period;
performing feature extraction on the meteorological data to obtain meteorological features;
determining a category label according to the real visibility;
constructing a first data set from the meteorological features and the category tags;
and training a set neural network based on the first data set to obtain the first visibility recognition model.
5. The method of claim 3, wherein the second visibility recognition model is trained by:
acquiring image data and a real visibility label when the real visibility is smaller than a set threshold value in the daytime;
performing feature extraction on the image data to obtain image features;
constructing a second data set according to the image characteristics and the real visibility label;
and training a set neural network based on the second data set to obtain the second visibility recognition model.
6. The method of claim 3, wherein the third visibility recognition model is trained by:
collecting meteorological data and a real visibility label when the actual visibility is smaller than a set threshold value in the night period;
performing feature extraction on the meteorological data to obtain meteorological features;
constructing a third data set according to the meteorological features and the real visibility labels;
and training a set neural network based on the third data set to obtain the third visibility recognition model.
7. A visibility monitoring device, comprising:
the data acquisition module is used for acquiring meteorological data and image data of the monitoring area at the current moment;
the characteristic acquisition module is used for respectively extracting the characteristics of the meteorological data and the image data to obtain meteorological characteristics and image characteristics;
the first visibility obtaining module is used for inputting the meteorological features into a first visibility recognition model to obtain first visibility;
the time period acquisition module is used for acquiring the time period of the current moment if the first visibility is smaller than a set threshold;
the first target visibility obtaining module is used for inputting the image characteristics into a second visibility recognition model to obtain target visibility if the time period is a first preset time period;
and the second target visibility obtaining module is used for inputting the meteorological features into a third visibility recognition model to obtain the target visibility if the time period is a second preset time period.
8. A visibility monitoring system, comprising: the visibility monitoring device as claimed in claim 7, at least one camera and a plurality of sensors, said visibility monitoring device being connected to said at least one camera and said plurality of sensors, respectively;
the at least one camera is arranged in different directions and is used for acquiring image data in different directions;
the various sensors are used for collecting meteorological data;
and the visibility monitoring device is used for determining visibility according to the image data and the meteorological data.
9. A computer device, comprising:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the visibility monitoring method as claimed in any one of claims 1-6.
10. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out a visibility monitoring method as defined in any one of claims 1 to 6.
CN202011364730.1A 2020-11-27 2020-11-27 Visibility monitoring method, device, equipment, system and medium Pending CN112649900A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011364730.1A CN112649900A (en) 2020-11-27 2020-11-27 Visibility monitoring method, device, equipment, system and medium


Publications (1)

Publication Number Publication Date
CN112649900A true CN112649900A (en) 2021-04-13

Family

ID=75349559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011364730.1A Pending CN112649900A (en) 2020-11-27 2020-11-27 Visibility monitoring method, device, equipment, system and medium

Country Status (1)

Country Link
CN (1) CN112649900A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060271297A1 (en) * 2005-05-28 2006-11-30 Carlos Repelli Method and apparatus for providing environmental element prediction data for a point location
CN104297176A (en) * 2014-09-17 2015-01-21 武汉理工大学 Device, system and method for monitoring visibility of channel segments of Yangtze River in mountainous area in all-weather manner
CN106556579A (en) * 2016-11-07 2017-04-05 南京理工大学 Group's mist image all-weather self-adapting detection method based on laser
CN208335208U (en) * 2018-03-07 2019-01-04 中国科学院西安光学精密机械研究所 Image co-registration acquisition system comprising meteorologic parameter
CN110188586A (en) * 2018-04-13 2019-08-30 山东百世通大数据科技有限公司 System and application method based on meteorological observation, road camera shooting visibility identification


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064220A (en) * 2021-06-03 2021-07-02 四川九通智路科技有限公司 Visibility measuring system and measuring method based on nonlinear autoregressive neural network
CN113807454A (en) * 2021-09-18 2021-12-17 航天新气象科技有限公司 Visibility determination method and device, computer equipment and readable storage medium
CN114202542A (en) * 2022-02-18 2022-03-18 象辑科技(武汉)股份有限公司 Visibility inversion method and device, computer equipment and storage medium
CN114202542B (en) * 2022-02-18 2022-04-19 象辑科技(武汉)股份有限公司 Visibility inversion method and device, computer equipment and storage medium
CN114720425A (en) * 2022-04-24 2022-07-08 安徽气象信息有限公司 Visibility monitoring system and method based on image recognition
CN114720425B (en) * 2022-04-24 2023-02-21 安徽气象信息有限公司 Visibility monitoring system and method based on image recognition
CN117576490A (en) * 2024-01-16 2024-02-20 口碑(上海)信息技术有限公司 Kitchen environment detection method and device, storage medium and electronic equipment
CN117576490B (en) * 2024-01-16 2024-04-05 口碑(上海)信息技术有限公司 Kitchen environment detection method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN112649900A (en) Visibility monitoring method, device, equipment, system and medium
CN112422783B (en) Unmanned aerial vehicle intelligent patrol system based on parking apron cluster
CN108037770B (en) Unmanned aerial vehicle power transmission line inspection system and method based on artificial intelligence
CN112435207B (en) Forest fire monitoring and early warning method based on sky-ground integration
Li et al. Meteorological visibility evaluation on webcam weather image using deep learning features
CN111931565A (en) Photovoltaic power station UAV-based autonomous inspection and hot spot identification method and system
CN109375290B (en) Cross-sea bridge fog monitoring system based on machine learning and application method thereof
Li et al. A novel evaluation method for pavement distress based on impact of ride comfort
CN112365467A (en) Foggy image visibility estimation method based on single image depth estimation
CN116846059A (en) Edge detection system for power grid inspection and monitoring
CN208335208U (en) Image fusion acquisition system comprising meteorological parameters
Liu et al. Manhole cover detection from natural scene based on imaging environment perception
CN111914933A (en) Snowfall detection method and device, computer equipment and readable storage medium
CN114998771B (en) Display method and system for enhancing visual field of aircraft, aircraft and storage medium
US10489923B2 (en) Estimating conditions from observations of one instrument based on training from observations of another instrument
CN105469115A (en) Statistical feature-based day and night image recognition method
CN112629881B (en) Method for extracting automatic driving simulation test element
Zhao et al. Visibility video detection with dark channel prior on highway
KR20210111578A (en) Distortion method of total cloud cover at night using ground-based whole sky image data
Li Urban Remote Sensing Using Ground‐Based Street View Images
Mengxuan et al. Development of cloud recognition system for ground-based cloud images based on machine vision
Petkova Deploying drones for autonomous detection of pavement distress
CN116597404B (en) Sustainable road abnormality detection method and system based on multi-source sensor fusion
CN116109712B (en) Automatic measuring method and measuring equipment for background sky light of telescope-pointing sky area
KR102596080B1 (en) Method of calculating day and night total cloud cover using ground camera-based photographed images and a support vector machine algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2021-04-13