CN117893909A - Weather identification method and system based on image data, computing equipment and Internet of things equipment
- Publication number: CN117893909A (application CN202410079143.XA)
- Authority: CN (China)
- Prior art keywords: target area, weather type, predicted weather, image, prediction model
- Prior art date: 2024-01-19
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications

- G06V20/10—Scenes; Scene-specific elements: terrestrial scenes
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
- G06V10/40—Extraction of image or video features
- G06V10/82—Image or video recognition or understanding using pattern recognition or machine learning, using neural networks
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
Embodiments of this specification provide a weather identification method, system, computing device, and Internet of Things device based on image data. The method comprises the following steps: acquiring an image of a target area; obtaining a first predicted weather type and a second predicted weather type of the target area by using a first prediction model and a second prediction model based on the image of the target area; and determining the predicted weather type of the target area based on the first predicted weather type and the second predicted weather type. The method uses AI vision technology for intelligent weather identification, which reduces equipment deployment and maintenance costs and makes the method suitable for large-scale deployment. In addition, the dual-model inference approach enables scientific and accurate weather identification.
Description
Technical Field
The present disclosure relates to the field of weather monitoring, and in particular, to a weather identification method, system, computing device and internet of things device based on image data.
Background
Weather identification provides reference and assurance for people's activities such as travel and production. Conventional weather identification methods require large numbers of sensors to collect various weather data, such as wind speed, wind direction, soil temperature, soil moisture, air temperature, air humidity, carbon dioxide, barometric pressure, light, ultraviolet light, oxygen, PM2.5, and so on. The deployment and maintenance costs of these sensors are high, making them unsuitable for large-scale deployment.
In recent years, computer vision technology based on artificial intelligence (AI vision) has been increasingly widely applied. It is therefore desirable to provide a weather identification method based on AI vision that reduces equipment deployment and maintenance costs.
Disclosure of Invention
One purpose of the application is to provide a weather identification method that is low in cost and suitable for large-scale deployment.
A first aspect of embodiments of the present specification provides a weather identification method based on image data. The method comprises the following steps: acquiring an image of a target area photographed by a camera; obtaining a first predicted weather type and a second predicted weather type of the target area by using a first prediction model and a second prediction model based on the image of the target area, wherein the first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag; and determining a predicted weather type for the target area based on the first predicted weather type and the second predicted weather type.
In some embodiments, the camera is a wide-angle camera, and obtaining the first predicted weather type and the second predicted weather type of the target area using the first prediction model and the second prediction model based on the image of the target area includes: dividing the image of the target area into a first image portion and a second image portion, the first image portion corresponding to the sky of the target area and the second image portion corresponding to the ground of the target area; obtaining the first predicted weather type using the first prediction model based on the first image portion; and obtaining the second predicted weather type using the second prediction model based on the second image portion.
In some embodiments, determining the predicted weather type for the target area based on the first predicted weather type and the second predicted weather type includes: when the second predicted weather type is cloudy/sunny, determining the first predicted weather type as the predicted weather type of the target area; and when the second predicted weather type is not cloudy/sunny, determining the second predicted weather type as the predicted weather type of the target area.
In some embodiments, the at least one other tag besides cloudy/sunny includes rainy, snowy, and foggy.
In some embodiments, the method further comprises: acquiring PM2.5 of the target area measured by a PM2.5 sensor and/or air humidity of the target area measured by a humidity sensor; and correcting the predicted weather type of the target area according to PM2.5 and/or air humidity of the target area.
In some embodiments, the at least one other tag besides cloudy/sunny includes foggy, and the correction includes: when the predicted weather type of the target area is cloudy or foggy, judging whether the PM2.5 of the target area exceeds a preset concentration threshold and whether the humidity of the target area exceeds a preset humidity threshold; and if the PM2.5 of the target area exceeds the concentration threshold and the humidity of the target area does not exceed the humidity threshold, correcting the predicted weather type of the target area to haze.
A second aspect of embodiments of the present specification provides a weather identification system based on image data. The system comprises an acquisition module, a prediction module, and a determination module. The acquisition module is used to acquire an image of a target area captured by a camera. The prediction module is used to obtain a first predicted weather type and a second predicted weather type of the target area using a first prediction model and a second prediction model based on the image of the target area, where the first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag. The determination module is used to determine a predicted weather type for the target area based on the first predicted weather type and the second predicted weather type.
A third aspect of embodiments of the present description provides a computing device. The computing device includes a processor and a memory for storing instructions that, when executed by the processor, implement an image data based weather identification method as described in any of the embodiments herein.
A fourth aspect of embodiments of the present specification provides an Internet of Things device. The Internet of Things device comprises a camera, a processor, and a memory; the camera captures an image of a target area, and the memory stores instructions that, when executed by the processor, implement the weather identification method based on image data described above. In some embodiments, the Internet of Things device further comprises a PM2.5 sensor and/or a humidity sensor, where the PM2.5 sensor measures the PM2.5 of the target area and the humidity sensor measures the humidity of the target area.
Embodiments of this specification thus provide a weather identification method, system, computing device, and Internet of Things device based on image data. The method uses AI vision technology for intelligent weather identification, which reduces equipment deployment and maintenance costs and makes the method suitable for large-scale deployment. In addition, the dual-model inference approach enables scientific and accurate weather identification.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a weather monitoring scenario according to some embodiments of the present description;
FIG. 2 is an exemplary block diagram of a computing device according to some embodiments of the present description;
FIG. 3 is an exemplary block diagram of an Internet of things device according to some embodiments of the present description;
FIG. 4 is a schematic structural diagram of an exemplary Internet of things device according to some embodiments of the present description;
FIG. 5 is an exemplary block diagram of an image-data-based weather identification system according to some embodiments of the present description;
FIG. 6 is an exemplary flow chart of a weather identification method based on image data according to some embodiments of the present description.
Detailed Description
In order that the above objects, features and advantages of the present application may be more clearly understood, embodiments of the present application will be further described below. It should be noted that, in the case of no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, but the present application may be practiced otherwise than as described herein. It will be apparent that the embodiments in the specification are only some, but not all, embodiments of the application.
FIG. 1 is a schematic illustration of a scenario of meteorological monitoring shown according to some embodiments of the present description. As shown in fig. 1, scene 100 includes a camera 110, a computing device 120, and a network 130.
The camera 110 is used to capture an image of a target area (e.g., an intersection). In some embodiments, a single camera 110 (e.g., a wide-angle camera) may be used to capture a single image of the target area. In other embodiments, multiple cameras 110 may be used to capture multiple images of the target area; for example, when the image of the target area includes a ground image and a sky image, a first camera facing the ground of the target area captures the ground image and a second camera facing the sky of the target area captures the sky image.
The computing device 120 is configured to acquire an image of a target area captured by the camera 110 and determine a predicted weather type for the target area based on the image of the target area. More details regarding determining the predicted weather type for the target area may be found elsewhere herein, e.g., steps 620-630 and their associated descriptions.
FIG. 2 is an exemplary block diagram of a computing device, shown according to some embodiments of the present description. As shown in fig. 2, computing device 120 includes a processor 122 and a memory 124. The memory 124 is used to store instructions that, when executed by the processor 122, implement the weather identification method based on image data as described in any of the embodiments of the present specification. For specific details of this method, reference may be made to fig. 6 and its associated description.
In some embodiments, the scenario 100 further comprises a PM2.5 sensor 140 and/or a humidity sensor 150. The PM2.5 sensor 140 is used to measure the PM2.5 of the target area, and the humidity sensor 150 is used to measure the humidity of the target area. On this basis, the computing device 120 may correct the predicted weather type of the target area according to the PM2.5 and/or air humidity of the target area. Further details regarding the correction may be found elsewhere herein, for example, in step 650 and its associated description.
The network 130 is responsible for communication between the different devices. For example, the camera 110 may provide an image of the target area to the computing device 120 over the network 130. As another example, the PM2.5 sensor 140 and/or the humidity sensor 150 may provide the PM2.5 and/or humidity of the target area to the computing device 120 over the network 130. The network 130 may include a wired network and/or a wireless network. For example only, the computing device 120 and the sensors (camera 110 / PM2.5 sensor 140 / humidity sensor 150) may communicate remotely over the network 130.
It should be noted that in some scenarios, the network 130 is not necessary. Specifically, the camera 110 and the computing device 120 may be integrated into a single internet of things device, for example, a smart camera with data processing capability. Fig. 3 is an exemplary block diagram of an internet of things device according to some embodiments of the present description. Referring to fig. 3, the internet of things device 200 includes the camera 110, the processor 122, and the memory 124. In some embodiments, the internet of things device 200 further includes a PM2.5 sensor 140 and/or a humidity sensor 150. Further details regarding the PM2.5 sensor 140 and the humidity sensor 150 may be found elsewhere herein, e.g., step 640 and its associated description. The internet of things device 200 may be installed at any suitable location, for example, at the edge of the target area. In particular, it may be mounted on an exterior wall of a building at the edge of the target area, or on a supporting structure (e.g., a pole) there.
Fig. 4 is a schematic structural diagram of an exemplary internet of things device according to some embodiments of the present description. As shown in fig. 4, the internet of things device 200 includes an optical glass window 210, the camera 110, a circuit board 220, the PM2.5 sensor 140, the humidity sensor 150, a power plug 230, and a communication plug 240. The optical glass window 210 is disposed in front of the camera 110 to protect it. The camera 110, PM2.5 sensor 140, humidity sensor 150, power plug 230, and communication plug 240 are all connected to the circuit board 220 through interfaces. The circuit board 220 integrates the processor 122, the memory 124, and so on. Both the power plug 230 and the communication plug 240 may use aviation-style (circular) connectors to ensure connection reliability.
In one possible implementation, the circuit board 220 includes an MCU (Microcontroller Unit) that integrates the processor 122 (e.g., a central processing unit), the memory 124, the interfaces, etc. on a single chip. The camera 110 may be connected to the circuit board 220 through a CSI (Camera Serial Interface) interface, the PM2.5 sensor 140 and the humidity sensor 150 may be connected through a serial interface (such as RS485), and the power line and network port (such as an RJ45 interface) connect externally through the power plug 230 and the communication plug 240, respectively.
FIG. 5 is an exemplary block diagram of an image data based weather identification system according to some embodiments of the present description. The weather identification system 500 (system 500 for short) may be implemented in the computing device 120 or the internet of things device 200. As shown in fig. 5, the system 500 includes an acquisition module 510, a prediction module 520, and a determination module 530.
The acquiring module 510 is configured to acquire an image of a target area, where the image of the target area is captured by the camera 110.
The prediction module 520 is configured to obtain a first predicted weather type and a second predicted weather type of the target area using a first prediction model and a second prediction model based on the image of the target area. The first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag.
In some embodiments, the prediction module 520 is further configured to preprocess the image of the target area. On this basis, the prediction module 520 may derive the first predicted weather type and the second predicted weather type of the target area using the first prediction model and the second prediction model based on the preprocessed image of the target area.
The determining module 530 is configured to: a predicted weather type for the target area is determined based on the first predicted weather type and the second predicted weather type.
In some embodiments, the acquisition module 510 is further configured to acquire the PM2.5 and/or air humidity of the target area, where the PM2.5 of the target area may be measured by the PM2.5 sensor 140 and the air humidity by the humidity sensor 150. Accordingly, the system 500 further comprises a correction module 540, configured to correct the predicted weather type of the target area according to the PM2.5 and/or air humidity of the target area.
For more details on system 500 and its modules, reference may be made to fig. 6 and its associated description.
It should be understood that the system shown in fig. 5 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of the two. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or special-purpose hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or processor control code, provided for example on a carrier medium such as a magnetic disk, CD or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system of the present specification and its modules may be implemented not only with hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also with software executed by various types of processors, or with a combination of such hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system and its modules is for convenience of description only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. For example, in some embodiments, the prediction module 520 and the determination module 530 may be two modules, or may be combined into one module. Such variations are within the scope of the present description.
FIG. 6 is an exemplary flow chart of a method of weather identification based on image data, according to some embodiments of the present description. The flow 600 may be performed by the computing device 120 or the internet of things device 200, for example, by the system 500 implemented in the computing device 120 or the internet of things device 200. As shown in fig. 6, the flow 600 includes the following steps.
At step 610, an image of a target area is acquired.
An image of the target area is captured by the camera 110. The target area may refer to an area actually photographed by the camera 110, for example, a square, an intersection, a street, etc.
The acquisition module 510 may acquire one or more images of the target area as the weather identification basis for a certain time point (or period) and a certain area.
In some embodiments, a single image of the target area is acquired as the weather identification basis, which further reduces equipment deployment and maintenance costs (for example, only one camera 110 is needed).
In other embodiments, multiple images of the target area are acquired as the weather identification basis, for example, multiple images of the same area captured from different angles, or images of sub-areas of the target area (for example, the sky and the ground); this helps improve the accuracy of weather identification.
In some embodiments, a wide-angle camera may be used to capture both the sky and the ground. That is, the camera 110 is a wide-angle camera, and the image of the target area includes a first image portion corresponding to the sky of the target area and a second image portion corresponding to the ground of the target area. For example, at a certain viewing angle, the upper half of the image corresponds to the sky and the lower half to the ground. The richer field of view provides more valuable environmental information, which helps improve the accuracy of weather identification.
In some embodiments, the acquisition module 510 may acquire images of the target region in a timed acquisition manner (e.g., acquisition once an hour) or a triggered acquisition manner (e.g., triggered by a user instruction).
In some embodiments, the acquisition module 510 may capture image data from all supported data sources (video devices, IPC/network cameras, local and network videos, and pictures) through cv2.VideoCapture in a Camera class. When a Camera instance is created, the device is started according to parameters passed to the constructor, such as the data address, data size, and frame rate; an internal thread loop checks the startup state and cyclically grabs and caches image data at the configured frame rate. That is, the acquisition module 510 may cache images of the target area at the frame rate, e.g., one image per second. The acquisition module 510 preferentially takes a newly captured image of the target area from the cache as the weather identification basis. If no new image of the target area is captured within a set period (e.g., 1 hour), the acquisition module 510 takes the latest cached image of the target area instead. If no new image is captured beyond the set period, the acquisition module 510 may determine that the data source (camera 110) is faulty and shut it down, i.e., the system 500 may stop weather identification (e.g., shut down the Detector class and the Recognizer class, described below).
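By way of illustration only, the following is a minimal Python sketch of such a capture loop. Only the Camera class name and the cv2.VideoCapture entry point come from the description above; the constructor parameters, cache structure, and method names are assumptions.

```python
import threading
import time

import cv2  # OpenCV; the description names cv2.VideoCapture as the capture entry point


class Camera:
    """Hypothetical sketch: grab frames at a fixed frame rate from any
    cv2-supported source (device index, IPC/RTSP URL, video file, picture)
    and keep the newest frame in a one-slot cache."""

    def __init__(self, source, width=1920, height=1080, fps=1.0):
        self.cap = cv2.VideoCapture(source)
        self.cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
        self.cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
        self.fps = fps
        self.latest = None                      # (timestamp, frame) cache
        self.running = self.cap.isOpened()      # startup state checked by the loop
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def _loop(self):
        # Internal thread: cyclically grab and cache image data at the frame rate.
        while self.running:
            ok, frame = self.cap.read()
            if ok:
                self.latest = (time.time(), frame)
            time.sleep(1.0 / self.fps)

    def get_frame(self, max_age_s=3600):
        # Prefer a fresh frame; fall back to the newest cached frame within max_age_s.
        if self.latest is None:
            return None
        ts, frame = self.latest
        return frame if time.time() - ts <= max_age_s else None

    def close(self):
        self.running = False
        self.cap.release()
```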
Step 620, obtaining a first predicted weather type and a second predicted weather type of the target area by using the first prediction model and the second prediction model based on the image of the target area.
The inventors found that when cloudy, sunny, and other weather types are combined directly into the tag set of a single machine-learning model, the trained model does not distinguish well between the different weather types, because the concepts overlap (e.g., a day can be both sunny and snowy). In view of this, the embodiments of the present specification propose two sets of weather division criteria and accordingly adopt a dual-model inference approach, enabling scientific and accurate weather identification. Specifically, the first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag. Note that "cloudy" and "sunny" are two separate tags, whereas "cloudy/sunny" is a single tag that refers specifically to a cloudy or sunny day in which no weather indicated by the other tags is present, e.g., a cloudy or sunny day with no rain, snow, or fog.
For example only, the tag set of the first prediction model consists of the two tags cloudy and sunny, and the tag set of the second prediction model consists of the four tags cloudy/sunny, rainy, snowy, and foggy. It should be appreciated that the tag set of the second prediction model may be configured flexibly according to the weather identification requirements. For example, for areas where it never snows, or during seasons other than winter, the tag set of the second model may be configured to exclude snowy. As another example, tags representing extreme weather may be added for some areas.
In some embodiments, the prediction module 520 may preprocess the image of the target area. For example only, the preprocessing may include one or more of image noise reduction, image enhancement, image correction, and the like. The prediction module 520 may then derive the first predicted weather type and the second predicted weather type using the first prediction model and the second prediction model based on the preprocessed image of the target area.
In some embodiments, the first/second prediction model may include a feature extraction layer and a feature processing layer. The feature extraction layer is configured to determine feature values of one or more features (e.g., luminance, edge, and texture features) of the image of the target area, and the feature processing layer is configured to determine the weather identification result (i.e., the classification result) based on those feature values. In some embodiments, the feature extraction layer may be implemented with conventional feature extraction algorithms, such as edge detection or texture analysis. In some embodiments, both layers may employ neural networks, for example, convolutional neural networks.
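As a minimal sketch of this two-stage structure, the following assumes a convolutional backbone as the feature extraction layer and a linear classifier as the feature processing layer; the specific layer choices are illustrative, not mandated by the text.

```python
import torch.nn as nn
import torchvision


class WeatherClassifier(nn.Module):
    """Hypothetical two-stage model: feature extraction layer + feature
    processing layer, here built from a pre-trained ResNet50 backbone."""

    def __init__(self, num_labels):
        super().__init__()
        backbone = torchvision.models.resnet50(weights="IMAGENET1K_V1")
        # Feature extraction layer: everything up to the global pooled features.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        # Feature processing layer: maps the 2048-d feature vector to weather tags.
        self.head = nn.Linear(2048, num_labels)

    def forward(self, x):                 # x: (N, 3, 224, 224)
        f = self.features(x).flatten(1)   # (N, 2048)
        return self.head(f)               # (N, num_labels) logits
```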
In some embodiments, when the image of the target area includes a first image portion corresponding to the sky of the target area and a second image portion corresponding to the ground of the target area, the prediction module 520 may divide the image into the first image portion (the sky portion) and the second image portion (the ground portion). The prediction module 520 may perform the split according to a preset split ratio. For example, after the wide-angle camera is fixed, several test images of the target area can be acquired and the horizon in them manually calibrated, so that the split ratio is determined from the calibrated horizon lines. As another example, the viewing angle of the wide-angle camera may be adjusted so that the ratio of the sky portion to the ground portion conforms to a preset split ratio (e.g., 1:1). As yet another example, the horizon in the image may be identified via machine learning, and the image split along the identified horizon. Once the image is split, the prediction module 520 may obtain the first predicted weather type using the first prediction model based on the first image portion, and obtain the second predicted weather type using the second prediction model based on the second image portion. That is, the first image portion (sky portion) is the input of the first prediction model, and the second image portion (ground portion) is the input of the second prediction model.
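A minimal sketch of the split step, assuming a fixed split ratio obtained from the calibration described above (the horizon-detection variants are omitted):

```python
import numpy as np


def split_sky_ground(image: np.ndarray, sky_ratio: float = 0.5):
    """Split an H x W x 3 frame at a preset row into a sky part (top) and a
    ground part (bottom). sky_ratio is assumed to come from the manual
    horizon calibration described above."""
    h = image.shape[0]
    cut = int(h * sky_ratio)
    return image[:cut], image[cut:]   # (first image portion, second image portion)
```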
Illustratively, the inventors found that: for distinguishing cloudy from sunny days, the ground portion cannot provide stable, useful image features (such as luminance) due to adverse factors such as light occlusion, so the sky portion of the image is better for stably distinguishing cloudy from sunny; for weather other than cloudy and sunny (e.g., rain, snow, and fog), clearly capturing the many dynamic scenes in the sky requires expensive professional cameras, whereas the many static scenes on the ground (e.g., standing water, light reflections, water ripples, accumulated snow) can be captured by non-professional cameras, so the ground portion (e.g., able to clearly show road ponding at lower cost) is the better basis for identifying such weather.
The first prediction model and the second prediction model may employ any suitable machine learning model. For example only, both may be obtained by fine-tuning a pre-trained ResNet50 model.
In some embodiments, the prediction module 520 may manage the models via a Detector class. When a Detector instance is created, the model information is read from the model file path, and the model's tags are imported into the system 500 from the file name (for example, sunny_cloudy.pth); the system 500 splits the tags according to the file name. For performance reasons, the Detector class provides only one detection method, detector_once(frame), which returns the weather identification result (i.e., the predicted weather type) and the timestamp of the image when detection finishes. An exemplary detection procedure: the input picture is resized (for example, to a resolution of 224 x 224), the resized picture is converted into the tensor format required for model inference, and the tensor (for example, of size 224 x 224 x 3) is fed into the model for inference, which returns the single result with the highest probability.
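The following sketch illustrates such a single-shot detection path. The 224 x 224 input size and the top-1 return follow the description; the normalization constants, color conversion, and exact signature are assumptions.

```python
import time

import cv2
import torch
from torchvision import transforms

# Assumed preprocessing: ToTensor plus ImageNet normalization (the text only
# specifies resizing to 224 x 224 and conversion to tensor format).
_preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])


def detector_once(model, frame, labels):
    """Resize the frame, convert it to a tensor, run one inference pass,
    and return the top-1 label together with a timestamp."""
    img = cv2.resize(frame, (224, 224))           # size transformation
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)    # cv2 frames are BGR
    x = _preprocess(img).unsqueeze(0)             # (1, 3, 224, 224) tensor
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1)
    return labels[int(probs.argmax())], time.time()
```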
An exemplary model training strategy is as follows. Before training, 10% of the data is extracted from each dataset as validation data. Because the dataset is usually large and training may require roughly ten times as much memory as the dataset occupies, virtual memory of a suitable size should be configured in advance. For performance, the ResNet50 model takes 224 x 224 x 3 pictures as input, so pictures in the dataset can be compressed before training. The open-source library PyTorch can be used to train a pre-trained ResNet50 model for a number of rounds (e.g., 300), retaining the models whose validation accuracy exceeds 75% and recording how the accuracy changes; the 3 most accurate models from before overfitting are then selected as candidates for testing, and the best model is finally chosen according to the experimental results. Both the first prediction model and the second prediction model may be obtained with this training strategy.
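A condensed sketch of that strategy with PyTorch/torchvision is given below. Only the 10% validation split, the 224 x 224 input, the pre-trained ResNet50, the 300 rounds, and the 75% retention rule come from the text; the dataset path, optimizer, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
full = datasets.ImageFolder("dataset/", transform=tf)      # hypothetical layout
n_val = len(full) // 10                                    # 10% held out for validation
train_set, val_set = random_split(full, [len(full) - n_val, n_val])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

model = models.resnet50(weights="IMAGENET1K_V1")           # pre-trained ResNet50
model.fc = nn.Linear(model.fc.in_features, len(full.classes))
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(300):                                   # e.g., 300 rounds
    model.train()
    for x, y in train_loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            correct += (model(x).argmax(1) == y).sum().item()
            total += y.numel()
    acc = correct / total                                  # record accuracy per round
    if acc > 0.75:                                         # retain models above 75%
        torch.save(model.state_dict(), f"ckpt_{epoch}_{acc:.3f}.pth")
```

Selecting the 3 most accurate pre-overfitting checkpoints and the final model is left as a manual step over the saved files, as the text describes.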
Step 630, determining a predicted weather type for the target area based on the first predicted weather type and the second predicted weather type.
In some embodiments, when the second predicted weather type is cloudy/sunny, the determination module 530 determines the first predicted weather type (cloudy or sunny) as the predicted weather type. When the second predicted weather type is not cloudy/sunny, the determination module 530 determines the second predicted weather type (e.g., rain/snow/fog) as the predicted weather type. In an alternative embodiment, when the second predicted weather type is not cloudy/sunny, the determination module 530 may instead determine a combination of the first and second predicted weather types (e.g., sunny and rainy) as the predicted weather type.
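A minimal sketch of this decision rule (the tag strings are illustrative, not the actual tag names):

```python
def fuse_predictions(first_pred: str, second_pred: str) -> str:
    """Dual-model fusion: trust the second model for rain/snow/fog and fall
    back to the first model's cloudy-vs-sunny verdict otherwise."""
    if second_pred == "cloudy_or_sunny":   # hypothetical tag for cloudy/sunny
        return first_pred                  # "cloudy" or "sunny"
    return second_pred                     # e.g., "rainy", "snowy", "foggy"
```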
In some embodiments, flow 600 further includes step 640 and step 650.
Step 640, PM2.5 and/or air humidity of the target zone is obtained.
The PM2.5 of the target area is measured by a PM2.5 sensor, and the air humidity of the target area is measured by a humidity sensor (e.g., a temperature-and-humidity sensor). Notably, adding the PM2.5 sensor and the humidity sensor keeps costs controllable, and both sensors are small and easy to integrate.
Step 650, correcting the predicted weather type of the target area according to the PM2.5 and/or air humidity of the target area.
Haze consists of small solid particles suspended in the air, such as dust, sulfuric acid, nitric acid, and other compounds. Some weather types are visually difficult to distinguish from haze (e.g., fog vs. haze, or a cloudy day vs. light haze) but differ in PM2.5 and/or air humidity. Introducing a PM2.5 sensor and a humidity sensor therefore improves the device's ability to resolve haze.
In some embodiments, the correction module 540 may make corrections via a Recognizer class.
In some embodiments, the at least one other tag besides cloudy/sunny includes foggy, and the specific correction procedure is as follows: when the predicted weather type of the target area is cloudy or foggy, judge whether the PM2.5 of the target area exceeds a preset concentration threshold (e.g., 150 μg/m³) and whether the humidity of the target area exceeds a preset humidity threshold (e.g., 75%); if the PM2.5 exceeds the concentration threshold and the humidity does not exceed the humidity threshold, correct the predicted weather type of the target area to haze. Otherwise, no correction is made. In alternative embodiments, the humidity judgment may be omitted: when the predicted weather type of the target area is cloudy or foggy, the correction module 540 may judge whether the PM2.5 of the target area exceeds the preset concentration threshold (e.g., 150 μg/m³); if it does, the correction module 540 corrects the predicted weather type to haze, and otherwise no correction is made.
In some embodiments, when the predicted weather type of the target area is foggy, the correction module 540 may refine it according to the air humidity of the target area: if the air humidity exceeds a first humidity threshold (e.g., 75%) but not a second humidity threshold (e.g., 90%), the predicted weather type is corrected to light fog; if the air humidity exceeds the second humidity threshold, it is corrected to fog. The second humidity threshold is greater than the first.
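The following sketch combines the correction rules of these two paragraphs, using the thresholds stated in the text (150 μg/m³, 75%, 90%); the tag names and function signature are illustrative:

```python
PM25_THRESHOLD = 150.0   # preset concentration threshold, μg/m³
HUMIDITY_LOW = 75.0      # first humidity threshold, %
HUMIDITY_HIGH = 90.0     # second humidity threshold, %


def correct_weather(pred: str, pm25: float | None, humidity: float | None) -> str:
    """Revise the image-based prediction using PM2.5 and/or humidity readings."""
    # Haze check: cloudy/foggy look-alikes with high particulates but low humidity
    # (with no humidity reading, PM2.5 alone decides, per the alternative embodiment).
    if pred in ("cloudy", "foggy") and pm25 is not None:
        if pm25 > PM25_THRESHOLD and (humidity is None or humidity <= HUMIDITY_LOW):
            return "haze"
    # Fog refinement by humidity: fog above 90%, light fog above 75%.
    if pred == "foggy" and humidity is not None:
        if humidity > HUMIDITY_HIGH:
            return "fog"
        if humidity > HUMIDITY_LOW:
            return "light_fog"
    return pred   # otherwise, no correction is made
```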
It should be noted that in this document, relational terms such as "first" and "second" are used only to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely an example of an implementation of the present application that would enable one skilled in the art to understand and practice the present application. Various modifications to the described embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Claims (10)
1. A weather identification method based on image data, comprising:
acquiring an image of a target area photographed by a camera;
obtaining, based on the image of the target area, a first predicted weather type and a second predicted weather type of the target area by using a first prediction model and a second prediction model, wherein the first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag; and
determining a predicted weather type for the target area based on the first predicted weather type and the second predicted weather type.
2. The method of claim 1, wherein the camera is a wide-angle camera, and obtaining the first predicted weather type and the second predicted weather type of the target area using a first prediction model and a second prediction model based on the image of the target area comprises:
dividing an image of the target region into a first image portion and a second image portion; the first image part corresponds to the sky of the target area, and the second image part corresponds to the ground of the target area;
obtaining the first predicted weather type by using a first prediction model based on the first image portion;
and obtaining the second predicted weather type by using a second prediction model based on the second image part.
3. The method of claim 1, wherein the determining the predicted weather type for the target area based on the first predicted weather type and the second predicted weather type comprises:
when the second predicted weather type is cloudy/sunny, determining the first predicted weather type as the predicted weather type of the target area; and
when the second predicted weather type is not cloudy/sunny, determining the second predicted weather type as the predicted weather type of the target area.
4. The method of claim 1, wherein the at least one other tag besides cloudy/sunny includes rainy, snowy, and foggy.
5. The method of claim 1, wherein the method further comprises:
acquiring PM2.5 of the target area measured by a PM2.5 sensor and/or air humidity of the target area measured by a humidity sensor;
and correcting the predicted weather type of the target area according to PM2.5 and/or air humidity of the target area.
6. The method of claim 5, wherein the at least one other tag besides cloudy/sunny includes foggy, and the correcting comprises:
when the predicted weather type of the target area is cloudy or foggy, judging whether PM2.5 of the target area exceeds a preset concentration threshold value and whether humidity of the target area exceeds a preset humidity threshold value;
and if PM2.5 of the target area exceeds the concentration threshold and the humidity of the target area does not exceed the humidity threshold, correcting the predicted weather type of the target area to be haze.
7. A weather identification system based on image data, comprising an acquisition module, a prediction module, and a determination module;
the acquisition module is used to acquire an image of a target area captured by a camera;
the prediction module is used to: obtain a first predicted weather type and a second predicted weather type of the target area by using a first prediction model and a second prediction model based on the image of the target area, wherein the first predicted weather type is the output of the first prediction model, the second predicted weather type is the output of the second prediction model, the tag set of the first prediction model comprises cloudy and sunny, and the tag set of the second prediction model comprises cloudy/sunny and at least one other tag; and
the determination module is used to: determine a predicted weather type for the target area based on the first predicted weather type and the second predicted weather type.
8. A computing device comprising a processor and a memory for storing instructions that, when executed by the processor, implement the image data-based weather identification method of any one of claims 1-6.
9. An internet of things device, comprising a camera for capturing an image of a target area, a processor and a memory for storing instructions, which when executed by the processor, implement the weather identification method based on image data according to any one of claims 1 to 6.
10. The internet of things device of claim 9, further comprising a PM2.5 sensor and/or a humidity sensor; wherein the PM2.5 sensor is used for measuring PM2.5 of the target area, and the humidity sensor is used for measuring humidity of the target area;
the processor, when executing instructions, implements the weather identification method based on image data as claimed in claim 5 or 6.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410079143.XA | 2024-01-19 | 2024-01-19 | Weather identification method and system based on image data, computing equipment and Internet of things equipment |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN117893909A (en) | 2024-04-16 |
Family
ID=90647206

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410079143.XA | Weather identification method and system based on image data, computing equipment and Internet of things equipment | 2024-01-19 | 2024-01-19 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN117893909A (en) |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |