CN117253148A - Carbon emission monitoring method and device, electronic equipment and storage medium - Google Patents

Carbon emission monitoring method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN117253148A
CN117253148A (application CN202311239246.XA)
Authority
CN
China
Prior art keywords
carbon emission
image
emission source
target area
visible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311239246.XA
Other languages
Chinese (zh)
Inventor
廉旭刚
詹新彬
陈春阳
罗岗
王波
张羽
胡海峰
陈阳
蔡音飞
刘成
刘涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taiyuan University of Technology
Beijing Urban Construction Group Co Ltd
Original Assignee
Taiyuan University of Technology
Beijing Urban Construction Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taiyuan University of Technology, Beijing Urban Construction Group Co Ltd filed Critical Taiyuan University of Technology
Priority to CN202311239246.XA priority Critical patent/CN117253148A/en
Publication of CN117253148A publication Critical patent/CN117253148A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/176 Urban or other man-made structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/7715 Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a carbon emission monitoring method and device, an electronic device, and a storage medium. The carbon emission monitoring method comprises the following steps: collecting a first image of a target area; processing the first image to obtain a second image; enhancing the second image by gradient sharpening and identifying carbon emission sources with a target detection algorithm, where a carbon emission source represents an object that generates greenhouse gases; and monitoring the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source. The beneficial effects of the invention are as follows: visible light and infrared images of the target area are collected by an unmanned aerial vehicle, carbon emission sources of the target area are detected and identified with a YOLO-based target detection algorithm, and real-time monitoring of the target area is achieved from the emissions of those sources; with this method, high-precision carbon emission monitoring is realized according to the power, running time, and energy consumption of each carbon emission source.

Description

Carbon emission monitoring method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of atmospheric monitoring and computer technologies, and in particular, to a carbon emission monitoring method, a carbon emission monitoring device, an electronic apparatus, and a storage medium.
Background
The building industry is one of the main sources of carbon emissions: in 2020, building-related carbon emissions reached 5.08 billion tons, 50.9% of total carbon emissions, so the building industry urgently needs to reduce its emissions, and the primary prerequisite for reducing carbon emissions is monitoring them accurately. However, current monitoring methods still have problems and shortcomings.
The specific problems are:
(1) Real-time carbon emission monitoring systems are imperfect, and existing flue gas monitoring technology cannot meet the demand for large-scale real-time monitoring;
(2) Carbon emission monitoring data come from a single source and the monitoring means are incomplete; the carbon emissions currently monitored by carbon satellites are limited by spatial resolution too low for monitoring areas such as construction sites, so reliable means for monitoring carbon emissions over a large area are lacking at present.
Disclosure of Invention
The embodiment of the invention mainly aims to provide a carbon emission monitoring method, a device, electronic equipment and a storage medium, which realize real-time carbon emission monitoring and improve the monitoring precision of carbon emission.
One aspect of the present invention provides a carbon emission monitoring method, including:
collecting, according to a monitoring request, a first image of a target area, wherein the first image represents a visible light image and an infrared image;
processing the first image to obtain a second image, wherein the second image represents the visible light image and infrared image of the target area;
performing gradient-sharpening image enhancement on the second image and then identifying carbon emission sources with a target detection algorithm, wherein a carbon emission source represents an object that generates greenhouse gases;
and monitoring the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source.
In the above carbon emission monitoring method, acquiring the first image of the target area according to the monitoring request comprises:
cruising the target area along a preset route with an unmanned aerial vehicle carrying a visible light payload and an infrared payload, to obtain the first image.
In the above carbon emission monitoring method, processing the first image to obtain the second image comprises:
performing an aerial triangulation solution on the first image to obtain the visible light image and the infrared image, wherein both are orthographic images.
In the above carbon emission monitoring method, identifying the second image with the target detection algorithm after gradient-sharpening image enhancement to obtain the carbon emission sources comprises the following steps:
performing gradient-sharpening enhancement on the visible light image;
identifying targets with a YOLO-based method that adopts a dual-branch network structure: the visible light image and the infrared image are input simultaneously, interact with each other, and each generates a corresponding attention weight; fused features are obtained by controlling the fusion ratio, and the fused features are used for prediction to obtain carbon emission source features;
and evaluating the carbon emission source features with at least one of precision, recall, and intersection-over-union (IoU) to obtain the object type of each carbon emission source.
In the above carbon emission monitoring method, obtaining the carbon emission data comprises:
determining the energy consumption and the power of a carbon emission source according to its object type;
and determining the carbon emission of the single carbon emission source per unit time according to the energy consumption and the power.
In the above carbon emission monitoring method, monitoring the real-time carbon emission of the target area comprises:
combining the emission data of the carbon emission sources over a historical time period with the type, quantity, and running time of the carbon emission sources in the target area, to obtain the real-time carbon emission of the construction area as

C = Σ_a γ_a σ_a Y_a

where C is the instantaneous total carbon emission of the target area (summed over all carbon emission sources present), γ_a is the emission or operating power of a carbon emission source of type a, σ_a is the conversion coefficient between the energy consumed by the carbon emission source and standard carbon, and Y_a ∈ [0, 1] is the coefficient describing whether the carbon emission source runs at full power.
In the above carbon emission monitoring method, monitoring the carbon emission over a future time period comprises:
acquiring the operating time nodes of the carbon emission sources in the target area, and calculating the carbon emission C_total of the carbon emission sources over the future time period as

C_total = Σ_{b=1}^{l} Σ_a γ_a σ_a Y_a T_ab

where T_ab is the time for which a type-a carbon emission source operates within time node b, l is the number of operating nodes, and the remaining parameters have the same meanings as above.
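Assuming the symbols above, this accumulation over operating time nodes can be sketched in Python; the source types, parameter values, and node durations below are illustrative assumptions, not data from the patent:

```python
# gamma (kW), sigma (kg standard carbon per kWh), and Y in [0, 1]
# per carbon emission source type; T[a][b] is the running time (h)
# of a type-a source within time node b, over l = 2 nodes.
params = {
    "excavator":  {"gamma": 120.0, "sigma": 0.8, "Y": 1.0},
    "dump_truck": {"gamma": 250.0, "sigma": 0.8, "Y": 0.5},
}
T = {
    "excavator":  [8.0, 4.0],   # hours in node 1 and node 2
    "dump_truck": [6.0, 0.0],
}

def period_emission(params, T) -> float:
    """C_total = sum over nodes b and types a of gamma_a*sigma_a*Y_a*T_ab."""
    return sum(
        p["gamma"] * p["sigma"] * p["Y"] * t
        for a, p in params.items()
        for t in T[a]
    )

C_total = period_emission(params, T)  # 96*12 + 100*6 = 1752.0 kg
```

The double sum simply scales each type's instantaneous emission rate by the hours it runs in each time node and accumulates over all nodes.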
Another aspect of an embodiment of the present invention provides a carbon emission monitoring device, including:
the first module is used for collecting a first image of the target area according to the monitoring request, and the first image is used for representing a visible light image and an infrared image;
the second module is used for performing processing on the first image to obtain a second image, and the second image is used for representing the visible light image and the infrared image of the target area;
the third module is used for carrying out image gradient sharpening enhancement on the second image, identifying through a target detection algorithm, and obtaining a carbon emission source, wherein the carbon emission source is used for representing an object generating greenhouse gases;
and a fourth module for monitoring the real-time carbon emission amount and the carbon emission amount of the future time period of the target area according to the carbon emission data of each carbon emission source.
Another aspect of an embodiment of the present invention provides an electronic device, including a processor and a memory;
the memory is used for storing programs;
the processor executes the program to implement the method as described above.
Embodiments of the present invention also disclose a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device may read the computer instructions from the computer-readable storage medium and execute them, causing the computer device to perform the method described above.
The beneficial effects of the invention are as follows: visible light and infrared images of the target area are collected by an unmanned aerial vehicle, carbon emission sources of the target area are detected and identified with a YOLO-based target detection algorithm, and real-time monitoring of the target area is achieved from the emissions of those sources; with this method, high-precision carbon emission monitoring is realized according to the power, running time, and energy consumption of each carbon emission source.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and may be better understood from the following description of embodiments taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic diagram of a carbon emission monitoring system according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of a carbon emission monitoring method according to an embodiment of the invention.
Fig. 3 is an orthographic image of a terminal building in accordance with an embodiment of the present invention.
Fig. 4 is a target monitoring area of an orthographic image of a terminal building in accordance with an embodiment of the present invention.
Fig. 5 is a network structure diagram of a YOLO object recognition algorithm according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a dual-branch network detection flow of YOLO according to an embodiment of the present invention.
Fig. 7 is a schematic diagram of a carbon emission source identification result of a target monitoring area according to an embodiment of the present invention.
FIG. 8 is a loss-function graph and an intersection-over-union graph according to an embodiment of the present invention.
Fig. 9 is a schematic diagram of a carbon emission monitoring device according to an embodiment of the present invention.
Detailed Description
The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
Embodiments of the present invention are described in detail below, with examples illustrated in the accompanying drawings, where identical or similar reference numerals denote identical or similar elements, or elements with identical or similar functions, throughout. In the following description, suffixes such as "module", "part", or "unit" are used only to facilitate the description of the present invention and have no particular meaning by themselves; thus "module", "component", and "unit" may be used interchangeably. "First", "second", etc. serve only to distinguish technical features and are not to be construed as indicating or implying relative importance, the number of features indicated, or their precedence. The consecutive numbering of the method steps is used only for ease of review and understanding; adjusting the order of implementation among the steps, in line with the overall technical scheme of the invention and the logical relations among the steps, does not affect the technical effects achieved.
Referring to fig. 1, fig. 1 is a schematic diagram of a carbon emission monitoring system comprising an unmanned aerial vehicle 100, a camera 200, a server 300, a client 400, and a carbon emission source 500. The unmanned aerial vehicle 100 captures a visible light image of the target area through the camera 200 and sends the visible light image and an infrared image to the server 300 by wireless communication. The server 300 processes the first image to obtain a second image, where the second image represents the visible light image and infrared image of the target area; performs gradient-sharpening image enhancement on the second image and identifies carbon emission sources with a target detection algorithm, where a carbon emission source represents an object that generates greenhouse gases; and monitors the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source, displaying the monitoring results in real time on the interface of the client 400.
In some embodiments, the unmanned aerial vehicle 100 is a low-altitude unmanned aerial vehicle capable of carrying a high-precision visible light payload, such as the Feima D2000 ("flying horse D2000") or "arctic puck 3", and is equipped with a GNSS differential antenna so that track information can be acquired in real time, facilitating later image data processing.
In some embodiments, the carbon emission source 500 is a device that produces greenhouse gases, such as construction machinery or a vehicle.
In some embodiments, referring to the carbon emission monitoring flow schematic shown in FIG. 2, it includes, but is not limited to, steps S100-S400:
s100, acquiring a first image of a target area according to a monitoring request, wherein the first image is used for representing a visible light image and an infrared image.
In some embodiments, an unmanned aerial vehicle carrying a visible light payload and an infrared payload cruises the target area along a preset route to obtain the first image.
In some embodiments, the visible light image is acquired by the unmanned aerial vehicle. The unmanned aerial vehicle can acquire high-resolution images quickly; embodiments of the invention use the visible light module or payload and the infrared module or payload to continuously acquire images of the area to be monitored, which are then collected and analyzed, so that the carbon emission situation in the target area can be monitored at any time.
Exemplarily, the visible light module is a ZH20T with 20 effective megapixels and a 6.83-119.94 mm zoom lens (equivalent focal length 31.7-556.2 mm); the route spacing is 44 m, the speed 5 m/s, the flight height of the unmanned aerial vehicle 140 m, and the side overlap and course overlap are both 70%, giving a ground resolution of 1.96 cm/pixel. The infrared module is a ZXT-S with a 19 mm fixed-focus lens; the other flight parameters are the same as for visible light.
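Route planning is not specified in the patent beyond a preset route and the 44 m line spacing, but a simple serpentine survey route of the kind such cruise flights typically use can be sketched as follows; the area dimensions are illustrative assumptions:

```python
def serpentine_route(width_m: float, height_m: float, line_spacing_m: float):
    """Waypoints (x, y) in meters for a back-and-forth survey route
    over a rectangular area, with flight lines line_spacing_m apart.
    Alternating line direction minimizes turn-around distance."""
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        y0, y1 = (0.0, height_m) if direction == 1 else (height_m, 0.0)
        waypoints += [(x, y0), (x, y1)]
        x += line_spacing_m
        direction *= -1
    return waypoints

# A 100 m x 200 m site with the patent's 44 m route spacing
# yields flight lines at x = 0, 44, and 88 m.
route = serpentine_route(width_m=100.0, height_m=200.0, line_spacing_m=44.0)
```

Each pair of consecutive waypoints is one flight line; a ground station would upload these to the UAV along with the fixed flight height and speed.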
S200, processing the first image to obtain a second image, wherein the second image is used for representing the visible light image and the infrared image of the target area.
In some embodiments, an aerial triangulation solution is performed on the first image, resulting in a visible light image and an infrared image, both of which are orthographic images.
In some embodiments, the gradient sharpening process is as follows.
For an image f(x, y), the gradient at (x, y) is defined as

grad(x, y) = (f_x'^2 + f_y'^2)^(1/2)

To simplify the calculation, the gradient uses the following formula instead:

grad(x, y) = |f_x'| + |f_y'|

Meanwhile, the Roberts operator is adopted to calculate the corresponding image gradients, with the calculation formulas

f_x' = |f(x+1, y+1) - f(x, y)|
f_y' = |f(x+1, y) - f(x, y+1)|

Image sharpening is performed according to the following rule: given a non-negative threshold T, a pixel is replaced by grad(x, y) where grad(x, y) ≥ T, and otherwise keeps its original value f(x, y), so that edge contours stand out while the background stays smooth.
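A minimal NumPy sketch of this Roberts-operator gradient sharpening; the threshold value and test image are illustrative, not from the patent:

```python
import numpy as np

def roberts_sharpen(f: np.ndarray, T: float) -> np.ndarray:
    """Gradient sharpening with the Roberts cross operator.

    Pixels whose simplified gradient magnitude |f_x'| + |f_y'| reaches
    the non-negative threshold T are replaced by that magnitude, so
    edges stand out; all other pixels keep their original value, so
    the background stays smooth.
    """
    f = f.astype(np.float64)
    # Roberts cross differences (defined for all but the last row/column).
    fx = np.abs(f[1:, 1:] - f[:-1, :-1])   # |f(x+1, y+1) - f(x, y)|
    fy = np.abs(f[1:, :-1] - f[:-1, 1:])   # |f(x+1, y) - f(x, y+1)|
    grad = np.zeros_like(f)
    grad[:-1, :-1] = fx + fy               # simplified gradient magnitude
    return np.where(grad >= T, grad, f)

# Tiny example: a dark/bright vertical step edge, threshold T = 50.
img = np.array([[10, 10, 200, 200],
                [10, 10, 200, 200],
                [10, 10, 200, 200],
                [10, 10, 200, 200]], dtype=float)
out = roberts_sharpen(img, T=50.0)
```

On the step edge the gradient magnitude (190 + 190 = 380) exceeds the threshold and replaces the pixel value, while flat regions pass through unchanged.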
In some embodiments, reference is made to the terminal-building orthographic image shown in FIG. 3 and the target-area monitoring schematic of the terminal building shown in FIG. 4. Fig. 3 is an orthographic image obtained by cruise photography of the unmanned aerial vehicle; the boxed region in fig. 4 is the target monitoring area.
S300, identifying the second image with the target detection algorithm after gradient-sharpening enhancement to obtain carbon emission sources, where a carbon emission source represents an object that generates greenhouse gases.
In some embodiments, the second image is identified with a YOLO-based target recognition method to obtain carbon emission source features, and those features are evaluated with at least one of precision, recall, and intersection-over-union (IoU) to obtain the object type of each carbon emission source.
In some embodiments, reference is made to FIG. 5 for the network structure of the YOLO target recognition algorithm, which derives carbon emission source features by taking the orthographic image as input.
Illustratively, operating vehicles and engineering machinery are extracted using a lightweight, high-precision object detection algorithm based on the YOLO framework. The recognition effect is verified using at least one of precision, recall, and IOU (intersection over union), with the calculation formulas

Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
V_IOU(B_g, B_p) = |B_g ∩ B_p| / |B_g ∪ B_p|

where TP is the number of positive samples correctly classified by the model (YOLO framework); FP is the number of negative samples incorrectly classified as positive; FN is the number of positive samples incorrectly classified as negative; B_g is the ground-truth bounding box; B_p is the detected bounding box; and V_IOU(·) is the IOU value.
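These three metrics can be sketched directly in Python; the TP/FP/FN counts and box coordinates below are illustrative assumptions:

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of detections that are correct."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth objects that were detected."""
    return tp / (tp + fn)

def iou(bg, bp) -> float:
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(bg[0], bp[0]), max(bg[1], bp[1])
    ix2, iy2 = min(bg[2], bp[2]), min(bg[3], bp[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_g = (bg[2] - bg[0]) * (bg[3] - bg[1])
    area_p = (bp[2] - bp[0]) * (bp[3] - bp[1])
    return inter / (area_g + area_p - inter)

p = precision(tp=90, fp=10)              # 0.9
r = recall(tp=90, fn=30)                 # 0.75
v = iou((0, 0, 10, 10), (5, 0, 15, 10))  # overlap 50 / union 150
```

IoU is computed per detected box against its ground-truth box, while precision and recall aggregate counts over the whole evaluation set.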
In some embodiments, reference is made to the dual-branch network detection flow schematic of YOLO shown in fig. 6.
The YOLO algorithm adopted by the invention uses a dual-branch network structure: the visible light image and the infrared image are input simultaneously, interact through the interaction module, and each generates a corresponding attention weight, which controls the fusion ratio. A convolutional network extracts construction machinery features from the fused images to obtain predicted values, and the recognition effect is verified with precision, recall, and IoU (intersection over union).
Illustratively, the interaction module operates as follows: between the visible light branch and the infrared branch, one branch is virtually taken as a shared branch, as shown in fig. 5. Before entering the shared branch, the channel counts of the visible light and infrared feature maps are halved, and the halves are spliced together into a feature map that carries both visible light and infrared characteristics. After several residual blocks, the channels are separated again, and the separation results are fused with the outputs of the visible light and infrared branches, so that the result retains the original input information while also carrying the shared-branch information.
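A shape-level NumPy sketch of this channel interaction is below. Feature maps are (channels, H, W) arrays; the channel halving is modeled as simply taking the first half of the channels, the residual blocks are stood in for by an identity function, and element-wise addition plays the role of the fusion — all illustrative assumptions, not the patent's exact network:

```python
import numpy as np

def residual_block(x: np.ndarray) -> np.ndarray:
    # Stand-in for the shared branch's residual blocks
    # (identity here, purely for shape illustration).
    return x

def interact(vis: np.ndarray, ir: np.ndarray):
    """vis, ir: feature maps shaped (C, H, W) with the same C."""
    c = vis.shape[0] // 2
    # Halve the channels of each modality and splice the halves into
    # one map that carries both visible and infrared features.
    shared = np.concatenate([vis[:c], ir[:c]], axis=0)   # still (C, H, W)
    shared = residual_block(shared)
    # Separate the channels again and fuse each half back into the
    # output of its own branch, so each result keeps its original
    # input information plus the shared-branch information.
    vis_half, ir_half = shared[:c], shared[c:]
    vis_out = vis.copy()
    vis_out[:c] += vis_half
    ir_out = ir.copy()
    ir_out[:c] += ir_half
    return vis_out, ir_out

vis = np.ones((8, 4, 4))
ir = np.full((8, 4, 4), 2.0)
vis_out, ir_out = interact(vis, ir)
```

The key property the sketch preserves is that the shared branch sees a concatenation of both modalities at the original channel count, and each branch output is a mixture of its own features and the shared features.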
In some embodiments, the carbon emission sources are as in the boxed recognition results shown in fig. 7, which include different engineering vehicles and construction machines.
In some embodiments, referring to the loss-function graph and intersection-over-union graph shown in fig. 8, analysis shows that, apart from a very small number of targets, the recognition confidence in the target area is higher than 82%, which meets the precision requirement for calculating carbon emissions.
S400, monitoring the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source.
In some embodiments, the energy consumption and power of a carbon emission source are determined according to its object type, and the carbon emission of the single carbon emission source per unit time is determined from that energy consumption and power.
Illustratively, combining the on-site vehicle and machinery operating power data obtained by field investigation (Table 1 below), the instantaneous carbon emissions of the vehicles and machinery can be obtained with the carbon emission algorithm, as shown in Table 2 below; the carbon emission in the test area is 324.16 kg/h.
Table 1 Operating power of construction-site vehicles and machinery
Table 2 Instantaneous carbon emissions of construction-site vehicles and machinery
In some embodiments, after the running times of the construction vehicles and machinery are input, the real-time carbon emission of the construction area is obtained by combining them with a pre-established carbon emission data set for construction vehicles and machinery, where the carbon emission algorithm is

C = Σ_a γ_a σ_a Y_a

where C is the instantaneous total carbon emission of the construction vehicles and machinery; γ_a is the displacement or operating power of a class-a construction vehicle or machine; σ_a is the conversion coefficient between the energy consumed by the corresponding vehicle's or machine's emissions and standard carbon; and Y_a ∈ [0, 1] is the coefficient describing whether the vehicle or machine runs at full power.
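This instantaneous sum can be sketched in Python as follows; the source types, powers, and conversion coefficients are illustrative assumptions, not the patent's Table 1 data:

```python
# Each recognized carbon emission source type carries an operating power
# gamma (kW), an energy-to-standard-carbon conversion coefficient sigma
# (kg per kWh), and a full-power operation coefficient Y in [0, 1].
sources = [
    {"type": "excavator",  "gamma": 120.0, "sigma": 0.8, "Y": 1.0},
    {"type": "dump_truck", "gamma": 250.0, "sigma": 0.8, "Y": 0.5},
    {"type": "crane",      "gamma": 90.0,  "sigma": 0.8, "Y": 0.0},  # idle
]

def instantaneous_emission(sources) -> float:
    """C = sum_a gamma_a * sigma_a * Y_a, in kg per hour."""
    return sum(s["gamma"] * s["sigma"] * s["Y"] for s in sources)

C = instantaneous_emission(sources)  # 120*0.8*1.0 + 250*0.8*0.5 + 0 = 196.0
```

In the pipeline, the `sources` list would be populated from the object types recognized by the YOLO detector, with `gamma` and `sigma` looked up in the pre-established vehicle and machinery data set.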
In some embodiments, in combination with the construction time node table, the construction carbon emission over a period of time can be obtained with the following algorithm:

C_total = Σ_{b=1}^{l} Σ_a γ_a σ_a Y_a T_ab

where C_total is the total carbon emission of the construction vehicles and machinery over the period; T_ab is the time for which a class-a construction vehicle or machine runs within time node b; l is the number of construction nodes; and the remaining parameters have the same meanings as in the instantaneous carbon emission algorithm.
Fig. 9 is a diagram of a carbon emission monitoring device according to an embodiment of the present invention. The apparatus includes a first module 910, a second module 920, a third module 930, and a fourth module 940.
The first module collects, according to a monitoring request, a first image of the target area, the first image representing a visible light image and an infrared image; the second module processes the first image to obtain a second image, the second image representing the visible light image and infrared image of the target area; the third module identifies the second image with the target detection algorithm after gradient-sharpening enhancement to obtain carbon emission sources, a carbon emission source representing an object that generates greenhouse gases; and the fourth module monitors the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source.
With the cooperation of its first, second, third, and fourth modules, the device of this embodiment can implement any of the foregoing carbon emission monitoring methods: collecting, according to a monitoring request, a first image of a target area representing a visible light image and an infrared image; processing the first image to obtain a second image representing the visible light image and infrared image of the target area; performing gradient-sharpening image enhancement on the second image and then identifying carbon emission sources with a target detection algorithm, where a carbon emission source represents an object that generates greenhouse gases; and monitoring the real-time carbon emission of the target area and its carbon emission over a future time period according to the carbon emission data of each carbon emission source. The beneficial effects of the invention are as follows: visible light and infrared images of the target area are collected by an unmanned aerial vehicle, carbon emission sources of the target area are detected and identified with a YOLO-based target detection algorithm, and real-time monitoring of the target area is achieved from the emissions of those sources; with this method, high-precision carbon emission monitoring is realized according to the power, running time, and energy consumption of each carbon emission source.
The embodiment of the invention also provides an electronic device comprising a processor and a memory;
the memory stores a program;
the processor executes the program to perform the aforementioned carbon emission monitoring method; the electronic device has the function of carrying and running the carbon emission monitoring software system provided by the embodiment of the invention.
The embodiment of the present invention also provides a computer-readable storage medium storing a program that is executed by a processor to implement the carbon emission monitoring method described above.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present invention are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Embodiments of the present invention also disclose a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device may read the computer instructions from the computer-readable storage medium and execute them, causing the computer device to perform the aforementioned carbon emission monitoring method.
Furthermore, while the invention is described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the described functions and/or features may be integrated in a single physical device and/or software module or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present invention. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Accordingly, one of ordinary skill in the art can implement the invention as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the invention, which is to be defined in the appended claims and their full scope of equivalents.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the invention, the scope of which is defined by the claims and their equivalents.
While the preferred embodiment of the present invention has been described in detail, the present invention is not limited to the embodiments described above, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present invention, and these equivalent modifications or substitutions are included in the scope of the present invention as defined in the appended claims.
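The visible–infrared fusion underlying the YOLO-based detection described in this embodiment, in which each modality contributes an attention weight that controls the fusion ratio, can be sketched as follows. The pooling-plus-softmax gating here is an illustrative stand-in for the learned attention of the actual network; shapes, names, and the gating rule are assumptions.

```python
import numpy as np

def fuse_features(vis_feat: np.ndarray, ir_feat: np.ndarray) -> np.ndarray:
    """Fuse visible and infrared feature maps of shape (C, H, W)
    with per-channel softmax attention weights (illustrative only)."""
    # Global average pooling: one scalar descriptor per channel.
    vis_desc = vis_feat.mean(axis=(1, 2))        # (C,)
    ir_desc = ir_feat.mean(axis=(1, 2))          # (C,)
    # Softmax over the two modalities controls the fusion ratio.
    logits = np.stack([vis_desc, ir_desc])       # (2, C)
    exp = np.exp(logits - logits.max(axis=0))    # stabilised softmax
    w = exp / exp.sum(axis=0)                    # weights sum to 1
    # Convex per-channel combination of the two branches.
    return (w[0][:, None, None] * vis_feat
            + w[1][:, None, None] * ir_feat)
```

Because the weights for each channel sum to one, the fused map is a per-channel convex combination of the two branches; identical inputs pass through unchanged.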

Claims (10)

1. A carbon emission monitoring method, comprising:
according to the monitoring request, acquiring a first image of a target area, wherein the first image is used for representing a visible light image and an infrared image;
processing the first image to obtain a second image, wherein the second image is used for representing a visible light image and an infrared image of the target area;
performing gradient sharpening enhancement on the second image and then identifying it through a target detection algorithm to obtain a carbon emission source, wherein the carbon emission source is used for representing objects that generate greenhouse gases;
and monitoring the real-time carbon emission amount of the target area and the carbon emission amount of a future time period according to the carbon emission data of each carbon emission source.
2. The carbon emission monitoring method of claim 1, wherein the acquiring a first image of the target area according to the monitoring request comprises:
cruising the target area through the unmanned aerial vehicle carrying the visible light load and the infrared load according to a preset route to obtain the first image.
3. The carbon emission monitoring method of claim 1, wherein the performing processing on the first image to obtain a second image comprises:
and performing aerial triangulation on the first image to obtain the visible light and infrared remote sensing images, wherein the visible light and infrared images are orthographic images.
4. The carbon emission monitoring method according to claim 1, wherein the identifying the second image by the target detection algorithm after the image gradient sharpening enhancement to obtain the carbon emission source comprises:
carrying out gradient sharpening enhancement on the visible light image;
the method for identifying the target based on the YOLO comprises the steps of adopting a dual-branch network structure to input a visible light image and an infrared image simultaneously, interacting the visible light image and the infrared image, generating corresponding attention weights respectively simultaneously, obtaining fusion characteristics by controlling fusion proportion, and predicting the fusion characteristics to obtain carbon emission source characteristics;
and identifying the carbon emission source features using at least one of precision, recall, and intersection-over-union (IoU) to obtain the object type of the carbon emission source.
5. The carbon emission monitoring method of claim 4, wherein the carbon emission data comprises:
determining the energy consumption and the power of the carbon emission source according to the object type of the carbon emission source;
and determining the carbon emission amount of the single carbon emission source in unit time according to the energy consumption and the power.
6. The carbon emission monitoring method of claim 5, wherein monitoring the real-time carbon emission of the target area comprises:
obtaining the real-time carbon emission of the construction area, from emission data of the carbon emission sources over a historical time period and the type, quantity, and running time of the carbon emission sources in the target area, as follows
where C is the instantaneous total carbon emission of the target area, γ_a is the emission or operating power of a type-a carbon emission source, σ_a is the conversion coefficient between the energy consumption of a carbon emission source and standard carbon, and Y_a ∈ [0, 1] is the full-power operation coefficient of the carbon emission source.
7. The carbon emission monitoring method as recited in claim 6, wherein monitoring the carbon emission over a future time period comprises:
acquiring the operation time nodes of the carbon emission sources in the target area, and calculating the carbon emission C_total of the carbon emission sources over the future time period as
where T_ab is the time for which a type-a carbon emission source operates at time node b, and l is the number of operating nodes.
8. A carbon emission monitoring device, comprising:
the first module is used for collecting a first image of the target area according to the monitoring request, and the first image is used for representing visible light and infrared images;
the second module is used for processing the first image to obtain a second image, and the second image is used for representing visible light and infrared images of the target area;
the third module is used for identifying the second image through a target detection algorithm after the image gradient sharpening enhancement to obtain a carbon emission source, and the carbon emission source is used for representing objects generating greenhouse gases;
and a fourth module for monitoring the real-time carbon emission amount and the carbon emission amount of the future time period of the target area according to the carbon emission data of each carbon emission source.
9. An electronic device comprising a processor and a memory;
the memory is used for storing programs;
the processor executing the program implements the carbon emission monitoring method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that the storage medium stores a program that is executed by a processor to implement the carbon emission monitoring method according to any one of claims 1 to 7.
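The per-source accounting in claims 5–7 can be sketched in code. The claims' formula images are not reproduced in this publication text, so the linear product form below (emission = count × γ × σ × Y, summed over source types and, for the future period, over operating-node durations T_ab) is an assumed reconstruction from the variable descriptions in the claims, not the patent's verbatim formula.

```python
from dataclasses import dataclass, field

@dataclass
class EmissionSource:
    """One type of carbon emission source in the target area.

    gamma: emission or operating power; sigma: conversion factor from
    energy consumption to standard carbon; y: full-power operation
    coefficient in [0, 1]; count: number of units of this type;
    run_times: hours run at each future operating time node (T_ab).
    Names follow the claim text; the product form is an assumption.
    """
    gamma: float
    sigma: float
    y: float
    count: int = 1
    run_times: list = field(default_factory=list)

def instantaneous_emission(sources) -> float:
    """Assumed form of claim 6: C = sum_a count_a * gamma_a * sigma_a * Y_a."""
    return sum(s.count * s.gamma * s.sigma * s.y for s in sources)

def future_emission(sources) -> float:
    """Assumed form of claim 7:
    C_total = sum_a sum_b count_a * gamma_a * sigma_a * Y_a * T_ab."""
    return sum(s.count * s.gamma * s.sigma * s.y * t
               for s in sources for t in s.run_times)
```

For example, two excavators of one type at full power with γ = 10 and σ = 0.5 emit 10 units instantaneously, and 40 units over future nodes totalling 4 hours.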
CN202311239246.XA 2023-09-24 2023-09-24 Carbon emission monitoring method and device, electronic equipment and storage medium Pending CN117253148A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311239246.XA CN117253148A (en) 2023-09-24 2023-09-24 Carbon emission monitoring method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311239246.XA CN117253148A (en) 2023-09-24 2023-09-24 Carbon emission monitoring method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117253148A true CN117253148A (en) 2023-12-19

Family

ID=89134542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311239246.XA Pending CN117253148A (en) 2023-09-24 2023-09-24 Carbon emission monitoring method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117253148A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118504424A (en) * 2024-07-17 2024-08-16 北京睿碳科技有限公司 Carbon emission information processing method, device and equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114924045A (en) * 2022-06-16 2022-08-19 中国科学院空天信息创新研究院 Unmanned aerial vehicle CO2 detection system and device
CN115063758A (en) * 2022-07-21 2022-09-16 北京微芯区块链与边缘计算研究院 Traffic carbon emission calculation method and system based on video data
CN116400016A (en) * 2023-03-28 2023-07-07 桂林电子科技大学 Carbon emission monitoring method
CN116596329A (en) * 2023-04-20 2023-08-15 中国电力科学研究院有限公司 Multi-energy complementary enterprise electric power carbon emission factor conduction calculation method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114924045A (en) * 2022-06-16 2022-08-19 中国科学院空天信息创新研究院 Unmanned aerial vehicle CO2 detection system and device
CN115063758A (en) * 2022-07-21 2022-09-16 北京微芯区块链与边缘计算研究院 Traffic carbon emission calculation method and system based on video data
CN116400016A (en) * 2023-03-28 2023-07-07 桂林电子科技大学 Carbon emission monitoring method
CN116596329A (en) * 2023-04-20 2023-08-15 中国电力科学研究院有限公司 Multi-energy complementary enterprise electric power carbon emission factor conduction calculation method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIE YUMIN等: "YOLO-MS: Multispectral Object Detection via Feature Interaction and Self-Attention Guided Fusion", IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, vol. 15, no. 4, 19 January 2023 (2023-01-19), pages 2132 - 2143, XP011955960, DOI: 10.1109/TCDS.2023.3238181 *
WEI SONGZE et al.: "Application of UAV low-altitude photography technology in carbon emission monitoring" (无人机低空摄影技术在碳排放监测中的应用), Energy and Environmental Protection (能源与环保), vol. 44, no. 10, 31 October 2022 (2022-10-31), pages 189 - 194 *


Similar Documents

Publication Publication Date Title
Arya et al. Transfer learning-based road damage detection for multiple countries
KR102008973B1 (en) Apparatus and Method for Detection defect of sewer pipe based on Deep Learning
Khan et al. Unmanned aerial vehicle–based traffic analysis: Methodological framework for automated multivehicle trajectory extraction
CN106290388A (en) A kind of insulator breakdown automatic testing method
CN117253148A (en) Carbon emission monitoring method and device, electronic equipment and storage medium
CN111610193A (en) System and method for inspecting structural defects of subway tunnel segment by adopting multi-lens shooting
CN113284144B (en) Tunnel detection method and device based on unmanned aerial vehicle
Haurum et al. Sewer Defect Classification using Synthetic Point Clouds.
US20220044027A1 (en) Photography system
CN114943858A (en) Data center inspection system, battery abnormity identification method, equipment and storage medium
CN112455676A (en) Intelligent monitoring and analyzing system and method for health state of photovoltaic panel
CN104634592A (en) Train running gear fault diagnosis method and train running gear fault diagnosis device
CN110969610A (en) Power equipment infrared chart identification method and system based on deep learning
CN116231504A (en) Remote intelligent inspection method, device and system for booster station
CN112487894A (en) Automatic inspection method and device for rail transit protection area based on artificial intelligence
CN116164704A (en) Tunnel defect real-time detection early warning system
CN116630267A (en) Roadbed settlement monitoring method based on unmanned aerial vehicle and laser radar data fusion
CN117037088A (en) Positioning method and system for thermal power plant coal ash transport vehicle based on edge calculation
Rakshit et al. Railway Track Fault Detection using Deep Neural Networks
CN115146209B (en) Method and system for monitoring soil and water conservation condition, storage medium and electronic equipment
Kong et al. Toward the automatic detection of access holes in disaster rubble
Bosurgi et al. Automatic crack detection results using a novel device for survey and analysis of road pavement condition
WO2022079629A1 (en) Method and system for detecting defects in civil works
CN114169404A (en) Method for intelligently acquiring quantitative information of slope diseases based on images
Manjusha et al. A review of advanced pavement distress evaluation techniques using unmanned aerial vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination