CN117809252A - Flame identification method, device, equipment and storage medium - Google Patents

Flame identification method, device, equipment and storage medium

Info

Publication number
CN117809252A
Authority
CN
China
Prior art keywords
flame
generator
sample
feature
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410003170.9A
Other languages
Chinese (zh)
Inventor
顾一清
侯良生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Merchant Ship Design and Research Institute
Original Assignee
Shanghai Merchant Ship Design and Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Merchant Ship Design and Research Institute filed Critical Shanghai Merchant Ship Design and Research Institute
Priority to CN202410003170.9A priority Critical patent/CN117809252A/en
Publication of CN117809252A publication Critical patent/CN117809252A/en
Pending legal-status Critical Current

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a flame identification method, device, equipment and storage medium in the technical field of data processing. The method comprises: acquiring a generator image to be identified; and performing flame identification on the generator image to be identified with a flame identification model to obtain a target flame identification result. Because the flame identification model determines the ignition position directly from the generator image to be identified, no thermal imager needs to be installed, which reduces the cost of detecting generator fires; at the same time, the ignition position can be identified quickly in the early stage of a generator fire, so identification is efficient and losses are reduced.

Description

Flame identification method, device, equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a flame identification method, apparatus, device, and storage medium.
Background
A ship's generator supplies power and electricity to the ship and is therefore a critical piece of shipboard equipment. Once the generator catches fire, the ship can suffer significant damage, so it is important to detect the ignition state of the ship generator in real time.
However, existing ship fire detection methods have drawbacks. In sensor-based fire detection, the detector only responds once the radiated heat of the fire reaches a certain energy, so the fire is detected late and the sensitivity is low. Fire detection based on thermal imaging, in turn, suffers from thermal imagers being expensive and having weak penetration capability.
Disclosure of Invention
The invention provides a flame identification method, device, equipment and storage medium, which quickly identify the ignition position of a generator at an early stage while reducing the cost of fire detection.
According to an aspect of the present invention, there is provided a flame identification method, the method comprising:
acquiring an image of a generator to be identified;
and performing flame identification on the generator image to be identified by adopting a flame identification model to obtain a target flame identification result.
According to another aspect of the present invention, there is provided a flame identification device, the device comprising:
the generator image acquisition module is used for acquiring a generator image to be identified;
the flame identification result determining module is used for carrying out flame identification on the generator image to be identified by adopting the flame identification model to obtain a target flame identification result.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the flame identification method of any one of the embodiments of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to perform the flame identification method of any of the embodiments of the present invention.
According to the technical scheme, a generator image to be identified is acquired, and flame identification is performed on it with a flame identification model to obtain a target flame identification result. Because the flame identification model determines the ignition position directly from the generator image to be identified, no thermal imager needs to be installed, which reduces the cost of detecting generator fires; at the same time, the ignition position can be identified quickly in the early stage of a generator fire, so identification is efficient and losses are reduced.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described here cover only some embodiments of the present invention; a person skilled in the art could obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a flame identification method according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a flame identification method according to a second embodiment of the present invention;
FIG. 3 is a schematic view of a flame identification device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device implementing a flame identification method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "object," "sample," "first," "second," and the like in the description, claims, and drawings of the present invention are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. Data so termed may be interchanged where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
In addition, in the technical scheme of the invention, the collection, storage, use, processing, transmission, provision, and disclosure of the generator images to be identified, normal generator images, flame generator images, and the like all comply with the relevant laws and regulations and do not violate public order and good customs.
Example 1
Fig. 1 is a flowchart of a flame identification method according to a first embodiment of the present invention. The method is applicable to identifying the early ignition position of a generator, in particular of a ship generator. It may be performed by a flame identification device, which may be implemented in hardware and/or software and configured in an electronic device. As shown in Fig. 1, the method includes:
s101, acquiring an image of the generator to be identified.
The generator image to be identified is the image in which the ignition position of the generator is to be identified. The image depends on the detection object: if the detection object is a ship, the image is of a ship generator; if the detection object is an automobile, the image is of an automobile generator. The detection object is the object whose generator ignition position needs to be identified; optionally, the detection object may be a ship.
Specifically, the generator image to be identified can be obtained through a monitoring camera.
By way of example, the ship generator can be photographed by a monitoring camera to obtain the generator image to be identified.
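As an illustrative sketch only (the patent does not prescribe an acquisition implementation), a single frame could be grabbed from the monitoring camera as follows; the RTSP address and the 640x640 input size are assumptions:

```python
import cv2  # OpenCV for video capture and resizing

# Hypothetical RTSP address of the engine-room monitoring camera
# (an assumption, not specified in the patent).
CAMERA_URL = "rtsp://192.168.1.10:554/stream1"

def grab_generator_image(url: str = CAMERA_URL):
    """Grab one frame of the generator from the monitoring camera."""
    cap = cv2.VideoCapture(url)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("failed to read a frame from the monitoring camera")
    # Resize to the model input size; 640x640 is an assumed value.
    return cv2.resize(frame, (640, 640))
```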
S102, flame identification is carried out on the generator image to be identified by adopting a flame identification model, and a target flame identification result is obtained.
The target flame recognition result is either a flame image at the ignition position of the generator in the generator image to be recognized, or empty, which indicates that the generator in the image is normal.
Specifically, the generator image to be identified can be downsampled by the feature extraction network of the flame identification model to obtain at least one target downsampled feature; the at least one target downsampled feature is then fused by the feature fusion network of the flame identification model to obtain a target fusion feature; and finally, flame identification is performed on the target fusion feature by the feature recognition network of the flame identification model to obtain the target flame identification result.
The feature extraction network may be chosen according to actual service requirements; for example, it may be a backbone feature extraction network, which is not specifically limited in the embodiments of the present invention. A target downsampled feature is a feature obtained by downsampling the generator image to be identified. The feature fusion network may likewise be chosen according to actual service requirements; for example, it may be a neck network, which is not specifically limited in the embodiments of the present invention. The target fusion feature is the feature obtained by fusing the at least one target downsampled feature.
By way of example, the feature extraction network of the flame identification model can downsample the generator image to be identified by factors of 8, 16, and 32 to obtain 8x, 16x, and 32x target downsampled features; the feature fusion network then fuses the 8x, 16x, and 32x downsampled features to obtain the target fusion features; and finally the feature recognition network performs flame recognition on the target fusion features to obtain the target flame recognition result.
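The three-stage pipeline described above can be sketched as follows in PyTorch; the class and sub-module names are hypothetical, and the sketch only illustrates the data flow, not the claimed network itself:

```python
import torch
import torch.nn as nn

class FlameRecognitionModel(nn.Module):
    """Sketch of the pipeline: feature extraction (backbone) -> feature fusion
    (neck) -> feature recognition (head). The concrete sub-networks are left
    abstract and are assumptions, not the patented architecture."""

    def __init__(self, backbone: nn.Module, neck: nn.Module, head: nn.Module):
        super().__init__()
        self.backbone = backbone  # feature extraction network
        self.neck = neck          # feature fusion network
        self.head = head          # feature recognition network

    def forward(self, image: torch.Tensor):
        # Backbone yields features at 8x, 16x and 32x downsampling.
        p3, p4, p5 = self.backbone(image)
        # Neck fuses the multi-scale features into the target fusion features.
        fused = self.neck((p3, p4, p5))
        # Head performs flame recognition on the fused features.
        return self.head(fused)
```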
According to the technical scheme, a generator image to be identified is acquired, and flame identification is performed on it with a flame identification model to obtain a target flame identification result. Because the flame identification model determines the ignition position directly from the generator image to be identified, no thermal imager needs to be installed, which reduces the cost of detecting generator fires; at the same time, the ignition position can be identified quickly in the early stage of a generator fire, so identification is efficient and losses are reduced.
On the basis of the above embodiment, as an optional refinement, early warning information may also be generated according to the target flame recognition result.
Specifically, when the target flame identification result is a flame image at the ignition position of the generator in the generator image to be identified, early warning information is generated to prompt the relevant personnel to deal with the ignition position in time, avoiding greater loss.
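As a minimal sketch of this optional step (the notification channel and helper name are assumptions), early warning information could be generated as follows:

```python
def maybe_raise_alarm(recognition_result, notify) -> bool:
    """Generate early-warning information when the target flame recognition
    result contains flame regions; `notify` is a hypothetical callback
    (e.g. e-mail, bridge alarm panel, SMS)."""
    if recognition_result:  # a non-empty result means flames were detected
        notify(f"Generator fire suspected; flame regions: {recognition_result}")
        return True
    return False
```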
In addition, the flame image at the ignition position of the generator in the generator image to be identified can be stored in the sample flame image set of the flame identification model, so that the sample flame image set is updated and enriched. The sample flame image set is a set of at least one sample flame image; a sample flame image is an image of early generator ignition used to determine the flame identification model.
Example two
Fig. 2 is a flowchart of a flame identification method according to a second embodiment of the present invention. Building on the foregoing embodiment, this embodiment provides an optional way of determining the flame identification model. For parts not described in detail here, refer to the related descriptions in the other embodiments.
As shown in fig. 2, the method includes:
s201, determining a sample flame image set according to the normal generator image set and the flame generator image set.
The normal generator image set is a set of at least one normal generator image; a normal generator image is an image of the detection object's generator operating normally, captured at a historical moment. Optionally, the detection object may be a ship, in which case a normal generator image is an image of the ship generator operating normally. The flame generator image set is a set of at least one flame generator image; a flame generator image is a flame image of an early generator fire, and may optionally come from a generator other than the detection object. The sample flame image set is a set of at least one sample flame image; a sample flame image is an image of early generator ignition used to determine the flame identification model.
Specifically, at least one image of the detection object's generator operating normally is extracted from historically captured generator images and stored in a first data set, which serves as the normal generator image set. Meanwhile, flame images of early fires on other generators can be collected from the Internet with web-crawler technology and stored in a second data set, which serves as the flame generator image set. The normal generator image set and the flame generator image set are then fused to obtain the sample flame image set.
Optionally, fusing the normal generator image set and the flame generator image set to obtain the sample flame image set may proceed as follows: the flame generator image set is divided, randomly or according to a preset proportion, into a first image set and a second image set, where the preset proportion can be set according to actual service requirements (for example 2:1) and is not specifically limited in the embodiments of the present invention. The flame regions at the ignition positions in the first image set are fused onto the key positions of the generator in the normal generator images to obtain a first fused image set; the key positions of the generator include, but are not limited to, the cylinder, the supercharger, the exhaust pipe, and the oil pan. The flame regions at the ignition positions in the second image set are fused onto arbitrary positions of the normal generator images to obtain a second fused image set. The union of the first fused image set and the second fused image set is taken as the sample flame image set. A minimal sketch of this fusion step is given below.
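In this sketch, the key-position coordinates, the alpha-blending weights, and the helper name are assumptions made purely for illustration:

```python
import random
import cv2
import numpy as np

# Hypothetical pixel coordinates of the generator's key positions (cylinder,
# supercharger, exhaust pipe, oil pan); real values depend on the camera view.
KEY_POSITIONS = {"cylinder": (100, 80), "supercharger": (300, 120),
                 "exhaust_pipe": (420, 60), "oil_pan": (250, 380)}

def paste_flame(normal_img: np.ndarray, flame_crop: np.ndarray,
                key_position: bool = True) -> np.ndarray:
    """Fuse a flame crop onto a normal generator image, either at a key
    position (first image set) or at a random position (second image set)."""
    h, w = flame_crop.shape[:2]
    if key_position:
        x, y = random.choice(list(KEY_POSITIONS.values()))
        x = min(x, normal_img.shape[1] - w)  # keep the crop inside the image
        y = min(y, normal_img.shape[0] - h)
    else:
        x = random.randint(0, normal_img.shape[1] - w)
        y = random.randint(0, normal_img.shape[0] - h)
    fused = normal_img.copy()
    # Simple alpha blending stands in for whatever fusion method is actually used.
    fused[y:y + h, x:x + w] = cv2.addWeighted(fused[y:y + h, x:x + w], 0.4,
                                              flame_crop, 0.6, 0)
    return fused
```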
S202, downsampling a sample flame image set to obtain at least one sample downsampling characteristic.
The sample downsampling feature refers to a feature obtained by downsampling a sample flame image of a sample flame image set.
Specifically, the sample flame images in the sample flame image set can be downsampled by the feature extraction network in the neural network model to obtain at least one sample downsampled feature. The feature extraction network may be chosen according to actual service requirements; for example, it may be a backbone feature extraction network, which is not specifically limited in the embodiments of the present invention.
For example, the sample flame images may be downsampled by the backbone feature extraction network in the neural network model to obtain at least one sample downsampled feature. Optionally, the backbone feature extraction network may include ELAN, SPPCSPC, and MP layers.
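As a stand-in only (ELAN, SPPCSPC and MP are named above but not defined here), the following sketch uses plain strided convolutions to show how a backbone produces feature maps at 8x, 16x and 32x downsampling:

```python
import torch
import torch.nn as nn

class TinyBackbone(nn.Module):
    """Placeholder backbone: every stage halves the resolution, so the three
    returned maps sit at 8x, 16x and 32x downsampling of the input image."""

    def __init__(self, ch: int = 32):
        super().__init__()

        def stage(cin: int, cout: int) -> nn.Sequential:
            # A strided conv block standing in for the ELAN/MP/SPPCSPC stages.
            return nn.Sequential(nn.Conv2d(cin, cout, 3, stride=2, padding=1),
                                 nn.BatchNorm2d(cout), nn.SiLU())

        self.s1 = stage(3, ch)            # 2x
        self.s2 = stage(ch, ch * 2)       # 4x
        self.s3 = stage(ch * 2, ch * 4)   # 8x  -> p3
        self.s4 = stage(ch * 4, ch * 8)   # 16x -> p4
        self.s5 = stage(ch * 8, ch * 16)  # 32x -> p5

    def forward(self, x: torch.Tensor):
        x = self.s2(self.s1(x))
        p3 = self.s3(x)
        p4 = self.s4(p3)
        p5 = self.s5(p4)
        return p3, p4, p5
```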
S203, carrying out feature fusion on at least one sample downsampling feature to obtain a sample fusion feature.
The sample fusion feature refers to a feature obtained by carrying out feature fusion on at least one sample downsampling feature.
Specifically, the at least one sample downsampled feature can be fused by the feature fusion network in the neural network model to obtain the sample fusion feature. The feature fusion network may be chosen according to actual service requirements; for example, it may be a neck network, which is not specifically limited in the embodiments of the present invention.
By way of example, feature fusion may be performed on the at least one sample downsampled feature by a neck network in the neural network model to obtain the sample fusion feature. Optionally, the neck network may include a feature pyramid network (FPN) and a path aggregation network (PAN).
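A compressed sketch of the FPN top-down path followed by the PAN bottom-up path is shown below; the channel counts match the backbone sketch above and are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FpnPanNeck(nn.Module):
    """FPN propagates semantic information top-down by upsampling; PAN then
    propagates localisation cues bottom-up by downsampling."""

    def __init__(self, c3: int = 128, c4: int = 256, c5: int = 512, out: int = 128):
        super().__init__()
        self.l5 = nn.Conv2d(c5, out, 1)  # 1x1 convs align channel widths
        self.l4 = nn.Conv2d(c4, out, 1)
        self.l3 = nn.Conv2d(c3, out, 1)
        self.down4 = nn.Conv2d(out, out, 3, stride=2, padding=1)
        self.down5 = nn.Conv2d(out, out, 3, stride=2, padding=1)

    def forward(self, feats):
        p3, p4, p5 = feats
        # FPN: top-down fusion by nearest-neighbour upsampling and addition.
        t5 = self.l5(p5)
        t4 = self.l4(p4) + F.interpolate(t5, scale_factor=2, mode="nearest")
        t3 = self.l3(p3) + F.interpolate(t4, scale_factor=2, mode="nearest")
        # PAN: bottom-up fusion by strided convolution and addition.
        n3 = t3
        n4 = t4 + self.down4(n3)
        n5 = t5 + self.down5(n4)
        return n3, n4, n5
```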
S204, training the neural network model according to the sample fusion characteristics and the label data of the sample flame image set to obtain a flame identification model.
The label data can be obtained by annotating the flames in the sample flame images of the sample flame image set. Optionally, the neural network model may include a feature extraction network, a feature fusion network, and a feature recognition network.
Specifically, the neural network model can be used to perform flame identification on the sample fusion features to obtain a sample flame identification result; the training loss is determined according to the sample flame identification result and the label data of the sample flame image set; and the neural network model is trained using the training loss to obtain the flame identification model. The sample flame identification result is either a flame image at the ignition position of the generator in the sample flame image, or empty, which indicates that the generator in the sample flame image is normal.
More specifically, the feature recognition network in the neural network model can be used to perform flame recognition on the sample fusion features to obtain a sample flame recognition result; the training loss is determined from the sample flame recognition result and the label data of the sample flame image set based on a preset loss function; and the neural network model is trained using the training loss until the training loss falls within a preset range or the number of training iterations reaches a preset count, at which point training stops and the neural network model at that moment is taken as the flame recognition model. The preset loss function, preset range, and preset count are all set according to actual service requirements and are not specifically limited in the embodiments of the present invention.
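A minimal training-loop sketch under these conventions follows; the optimizer, learning rate, loss threshold, and epoch count are assumptions, and `loss_fn` stands in for whatever preset loss function is chosen:

```python
import torch

def train_flame_model(model, loader, loss_fn, epochs: int = 100,
                      loss_threshold: float = 0.05, lr: float = 1e-3):
    """Train until the training loss falls within the preset range or the
    preset number of iterations is reached; hyper-parameters are assumed."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for epoch in range(epochs):
        running_loss = 0.0
        for images, labels in loader:
            preds = model(images)          # sample flame recognition result
            loss = loss_fn(preds, labels)  # training loss from the label data
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        if running_loss / max(len(loader), 1) < loss_threshold:
            break  # training loss has reached the preset range
    return model  # the trained network is taken as the flame recognition model
```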
It can be understood that, because the training loss is determined from both the sample flame identification result and the label data of the sample flame image set, the training loss is more accurate and the resulting flame identification model achieves higher flame identification precision.
S205, acquiring an image of the generator to be identified.
S206, performing flame identification on the generator image to be identified by adopting the flame identification model to obtain a target flame identification result.
The technical scheme of this embodiment provides a way of determining the flame identification model: a sample flame image set is determined from the normal generator image set and the flame generator image set; the sample flame image set is downsampled to obtain at least one sample downsampled feature; feature fusion is performed on the at least one sample downsampled feature to obtain a sample fusion feature; and the neural network model is trained according to the sample fusion feature and the label data of the sample flame image set to obtain the flame identification model. Because the flame identification model is trained on the sample flame images and their label data, the resulting model identifies flames more efficiently and its identification results are more accurate.
Example III
Fig. 3 is a schematic structural diagram of a flame identification device according to a third embodiment of the present invention. This embodiment is applicable to identifying the early ignition position of a generator, in particular of a ship generator. The device may be implemented in hardware and/or software and configured in an electronic device. As shown in Fig. 3, the device includes:
a generator image acquisition module 301, configured to acquire a generator image to be identified;
the flame identification result determining module 302 is configured to perform flame identification on the generator image to be identified by using the flame identification model, so as to obtain a target flame identification result.
According to the technical scheme, a generator image to be identified is acquired, and flame identification is performed on it with a flame identification model to obtain a target flame identification result. Because the flame identification model determines the ignition position directly from the generator image to be identified, no thermal imager needs to be installed, which reduces the cost of detecting generator fires; at the same time, the ignition position can be identified quickly in the early stage of a generator fire, so identification is efficient and losses are reduced.
Optionally, the flame identification result determining module 302 is specifically configured to:
downsample the generator image to be identified based on the feature extraction network of the flame identification model to obtain at least one target downsampled feature;
perform feature fusion on the at least one target downsampled feature based on the feature fusion network of the flame identification model to obtain a target fusion feature;
and perform flame identification on the target fusion feature based on the feature recognition network of the flame identification model to obtain the target flame identification result.
Optionally, the apparatus further comprises:
and the early warning module is used for generating early warning information according to the target flame identification result.
Optionally, the apparatus further includes a flame identification model determination module, the flame identification model determination module including:
the sample flame image set determining unit is used for determining a sample flame image set according to the normal generator image set and the flame generator image set;
the sample downsampling characteristic determining unit is used for downsampling the sample flame image set to obtain at least one sample downsampling characteristic;
the sample fusion feature determining unit is used for carrying out feature fusion on at least one sample downsampling feature to obtain a sample fusion feature;
the flame identification model determining unit is used for training the neural network model according to the sample fusion characteristics and the label data of the sample flame image set to obtain a flame identification model.
Optionally, the sample flame image set determining unit is specifically configured to:
and fusing the normal generator image set and the flame generator image set to obtain a sample flame image set.
Optionally, the flame identification model determining unit is specifically configured to:
performing flame identification on the sample fusion features using the neural network model to obtain a sample flame identification result;
determining the training loss according to the sample flame identification result and the label data of the sample flame image set;
and training the neural network model using the training loss to obtain the flame identification model.
The flame identification device provided by the embodiment of the invention can execute the flame identification method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the flame identification methods.
Example IV
Fig. 4 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in Fig. 4, the electronic device 10 includes at least one processor 11 and memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random-access memory (RAM) 13. The memory stores a computer program executable by the at least one processor; the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as flame identification methods.
In some embodiments, the flame identification method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM12 and/or the communication unit 19. When the computer program is loaded into RAM13 and executed by processor 11, one or more steps of the flame identification method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the flame identification method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs executable and/or interpretable on a programmable system including at least one programmable processor, which may be special- or general-purpose and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network; their relationship arises from computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system that overcomes the drawbacks of high management difficulty and weak service scalability found in traditional physical hosts and VPS services.
It should be appreciated that the flows shown above may be reordered, and steps may be added or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; this is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A flame identification method, comprising:
acquiring an image of a generator to be identified;
and performing flame identification on the generator image to be identified by adopting a flame identification model to obtain a target flame identification result.
2. The method of claim 1, wherein the performing flame recognition on the generator image to be recognized using a flame recognition model to obtain a target flame recognition result comprises:
based on a feature extraction network of the flame identification model, downsampling the generator image to be identified to obtain at least one target downsampling feature;
based on a feature fusion network of the flame identification model, carrying out feature fusion on the at least one target downsampling feature to obtain a target fusion feature;
and based on a characteristic recognition network of the flame recognition model, performing flame recognition on the target fusion characteristics to obtain a target flame recognition result.
3. The method according to claim 1, wherein the method further comprises:
and generating early warning information according to the target flame identification result.
4. The method of claim 1, wherein the flame identification model is determined as follows:
determining a sample flame image set according to the normal generator image set and the flame generator image set;
downsampling the sample flame image set to obtain at least one sample downsampling feature;
performing feature fusion on the at least one sample downsampling feature to obtain a sample fusion feature;
training the neural network model according to the sample fusion characteristics and the label data of the sample flame image set to obtain a flame identification model.
5. The method of claim 4, wherein determining a sample flame image set from the normal generator image set and the flame generator image set comprises:
and fusing the normal generator image set and the flame generator image set to obtain a sample flame image set.
6. The method of claim 4, wherein training the neural network model based on the sample fusion features and the tag data of the sample flame image set to obtain a flame identification model comprises:
the sampling neural network model carries out flame identification on the sample fusion characteristics to obtain a sample flame identification result;
determining training loss according to the sample flame identification result and the label data of the sample flame image set;
and training the neural network model by using the training loss to obtain a flame identification model.
7. A flame identification device, comprising:
the generator image acquisition module is used for acquiring a generator image to be identified;
and the flame identification result determining module is used for carrying out flame identification on the generator image to be identified by adopting a flame identification model to obtain a target flame identification result.
8. The apparatus of claim 7, wherein the flame identification result determination module is specifically configured to:
based on a feature extraction network of the flame identification model, downsampling the generator image to be identified to obtain at least one target downsampling feature;
based on a feature fusion network of the flame identification model, carrying out feature fusion on the at least one target downsampling feature to obtain a target fusion feature;
and based on the recognition network of the flame recognition model, performing flame recognition on the target fusion characteristics to obtain a target flame recognition result.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the flame identification method of any of claims 1-6.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the flame identification method of any of claims 1-6.
CN202410003170.9A 2024-01-02 2024-01-02 Flame identification method, device, equipment and storage medium Pending CN117809252A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410003170.9A CN117809252A (en) 2024-01-02 2024-01-02 Flame identification method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410003170.9A CN117809252A (en) 2024-01-02 2024-01-02 Flame identification method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117809252A true CN117809252A (en) 2024-04-02

Family

ID=90419742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410003170.9A Pending CN117809252A (en) 2024-01-02 2024-01-02 Flame identification method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117809252A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination