CN110428579B - Indoor monitoring system, method and device based on image recognition - Google Patents

Indoor monitoring system, method and device based on image recognition

Info

Publication number
CN110428579B
Authority
CN
China
Prior art keywords
image
unit
flame
edge detection
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910731493.9A
Other languages
Chinese (zh)
Other versions
CN110428579A
Inventor
Liu Baoxin (刘宝鑫)
Song Xinfang (宋馨芳)
Feng Xianwu (冯仙武)
Current Assignee
Liu Baoxin
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201910731493.9A
Publication of CN110428579A
Application granted
Publication of CN110428579B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 21/00 Measuring or testing not otherwise provided for
    • G01D 21/02 Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

The invention belongs to the technical field of image monitoring and relates in particular to an indoor monitoring system, method and device based on image recognition. The system comprises a field monitoring subsystem, a wireless communication network and a user side subsystem. The field monitoring subsystem comprises at least a smoke sensor, a temperature sensor, a humidity sensor, an image sensor, an infrared sensor and a microcontroller. The microcontroller comprises a data conversion unit, an image processing unit and a judging unit. The judging unit is in signal connection with the data conversion unit; the data conversion unit is in signal connection with the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit respectively; and the image processing unit is in signal connection with the image sensor. The judging unit evaluates the received data against preset conditions and sends a signal to the image processing unit when an abnormality is judged. The method is efficient and accurate, with a low false-alarm rate and a high degree of intelligence.

Description

Indoor monitoring system, method and device based on image recognition
Technical Field
The invention belongs to the technical field of image monitoring, and particularly relates to an indoor monitoring system, method and device based on image recognition.
Background
Fires cause great economic loss and casualties to human society, and fire alarm devices, as fire pre-warning and alarm equipment, play an indispensable role in reducing the losses caused by fire. Existing fire alarms can detect a disaster and report it to a monitoring center over a wired connection. However, an alarm that transmits disaster signals over wires is vulnerable to line faults when a fire occurs, so normal signal transmission may be disrupted and such alarms perform poorly in use.
An edge can be defined as the boundary of a region in which the gray level of the image changes sharply. Edges are the most basic feature of an image, an essential precursor to image analysis and recognition, and an important image preprocessing technique. Edge detection is chiefly the measurement, detection and localization of gray-level changes in an image; it is a principal feature extraction method for image analysis and pattern recognition, plays an important role in computer vision, image analysis and other applications, and is an active research topic in image analysis and processing.
Disclosure of Invention
In view of the above, the main object of the present invention is to provide an indoor monitoring system, method and device based on image recognition that are efficient and accurate, with a low false-alarm rate and a high degree of intelligence.
To achieve this object, the technical solution of the invention is realized as follows:
an indoor monitoring system based on image recognition, the system comprising: a field monitoring subsystem, a wireless communication network and a user side subsystem. The field monitoring subsystem comprises at least: a smoke sensor, a temperature sensor, a humidity sensor, an image sensor, an infrared sensor and a microcontroller. The microcontroller comprises: a data conversion unit, an image processing unit and a judging unit. The judging unit is in signal connection with the data conversion unit; the data conversion unit is in signal connection with the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit respectively; the image processing unit is in signal connection with the image sensor. The judging unit evaluates the received data against preset conditions and, when an abnormality is judged, sends a signal to the image processing unit; on receiving the signal, the image processing unit starts the image sensor to acquire image information. The image processing unit is in signal connection with the wireless communication network, and the wireless communication network is in signal connection with the user side subsystem.
The data conversion unit converts the analog data received from the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit into digital data and sends the digital data to the judging unit. The judging unit compares the received digital data from each of the smoke, temperature, humidity and infrared sensors with a preset threshold to decide whether each value is abnormal, and sends a signal to the image processing unit when the number of abnormal values is greater than or equal to 3.
The image sensor is used for acquiring original image information. The image processing unit comprises: an edge detection unit for receiving the original image information sent by the data transmission device, performing edge detection on the original image and generating an intermediate image after edge detection; a flame region identification unit for judging and identifying a flame region in the intermediate image after edge detection, intercepting the flame region in the intermediate image and generating a flame region image; and a fire early warning unit for judging the severity of the fire according to the generated flame region image and the flame image samples input into it, and issuing an early warning.
The edge detection unit comprises: an edge detection template generating unit for generating a convolution template for edge detection; a template moving unit for moving the convolution template over the image in raster-scan order; a convolution unit for performing a convolution operation between the coefficients of the convolution template and the corresponding pixels under the template; and an intermediate image generating unit for generating the intermediate image after edge detection according to the result of the convolution unit.
The flame region identification unit comprises: an area change rate calculation unit for calculating the area change of each region in the intermediate image after edge detection; a circularity calculation unit for calculating the circularity of each region in the intermediate image after edge detection; and a flame region identification unit for identifying the flame region in the intermediate image based on the results of the circularity calculation unit and the area change rate calculation unit;
in the intermediate image after edge detection, the flame region is judged and identified as follows: define the area change rate as AR = |A(n+1) − A(n)| / (A(n) + eps), where AR is the area change rate of a highlight region between adjacent intermediate images, A(n) is the area of the suspicious region in the current intermediate image, A(n+1) is the area of the suspicious region in the next intermediate image, and eps is a small preset constant that prevents division by zero;
the circularity is defined as Ck = 4πAk / Pk², where Ck is the circularity of the primitive numbered k, Pk is the perimeter of the k-th graphic primitive, and Ak is the area of the k-th primitive;
the fire early warning unit comprises: a training set of recorded flame image samples; and a deep learning neural network for training on the training set, establishing a deep learning model, and judging the fire severity of the flame region image according to the model.
An indoor monitoring method based on image recognition, the method executes the following steps:
step S1: judging whether an abnormality occurs according to the smoke data, the temperature data, the humidity data and the infrared data monitored by each sensor, and starting an image monitoring program if an abnormality occurs;
Step S2: after an image monitoring program is started, acquiring original image information;
step S3: carrying out edge detection on the original image to generate an intermediate image after edge detection;
step S4: judging and identifying a flame region in the intermediate image after edge detection, intercepting the flame region and generating a flame region image; the flame region is identified by the area change rate, defined as AR = |A(n+1) − A(n)| / (A(n) + eps), where AR is the area change rate of a highlight region between adjacent intermediate images, A(n) is the area of the suspicious region in the current intermediate image, A(n+1) is the area of the suspicious region in the next intermediate image, and eps is a small preset constant that prevents division by zero;
and by the circularity, defined as Ck = 4πAk / Pk², where Ck is the circularity of the primitive numbered k, Pk is the perimeter of the k-th graphic primitive, and Ak is the area of the k-th primitive;
step S5: and judging the severity of the fire according to the generated flame area image and the flame image sample recorded into the fire early warning unit, and early warning.
Further, in step S3, receiving the original image information, performing edge detection on the original image and generating the intermediate image after edge detection comprises the following steps:
step S3.1: generating a convolution template for edge detection, the convolution template being a 3 × 3 template;
step S3.2: performing the convolution operation g(x, y) = f(x, y) * h(x, y) between the convolution template of step S3.1 and the original image, where g(x, y) is the generated intermediate image, f(x, y) is the original image, h(x, y) is the convolution template, and * denotes two-dimensional convolution.
Further, in step S5, judging the severity of the fire according to the generated flame region image and the flame image samples recorded into the fire early warning unit, and issuing an early warning, comprises the following steps:
step S5.1: inputting existing flame region images;
step S5.2: processing the existing flame region images to generate a plurality of training samples;
step S5.3: performing image recognition training based on deep learning with the training samples to generate a training model;
step S5.4: detecting, by using the training model, whether the flame region image to be detected contains the trained flame pattern.
An indoor monitoring device based on image recognition, the device being a non-transitory computer-readable storage medium storing computing instructions that comprise: a code segment for judging whether an abnormality occurs according to the smoke data, temperature data, humidity data and infrared data monitored by each sensor, and starting an image monitoring program if an abnormality occurs; a code segment for acquiring original image information after the image monitoring program is started; a code segment for performing edge detection on the original image to generate an intermediate image after edge detection; a code segment for judging and identifying the flame region in the intermediate image after edge detection, intercepting the flame region in the intermediate image and generating a flame region image; and a code segment for judging the severity of the fire according to the generated flame region image and the flame image samples recorded into the fire early warning unit, and issuing an early warning.
Drawings
Fig. 1 is a schematic system structure diagram of an indoor monitoring system based on image recognition according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a method of an indoor monitoring method based on image recognition according to an embodiment of the present invention.
Detailed Description
The method of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
Example 1
An indoor monitoring system based on image recognition, the system comprising: a field monitoring subsystem, a wireless communication network and a user side subsystem. The field monitoring subsystem comprises at least: a smoke sensor, a temperature sensor, a humidity sensor, an image sensor, an infrared sensor and a microcontroller. The microcontroller comprises: a data conversion unit, an image processing unit and a judging unit. The judging unit is in signal connection with the data conversion unit; the data conversion unit is in signal connection with the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit respectively; the image processing unit is in signal connection with the image sensor. The judging unit evaluates the received data against preset conditions and, when an abnormality is judged, sends a signal to the image processing unit; on receiving the signal, the image processing unit starts the image sensor to acquire image information. The image processing unit is in signal connection with the wireless communication network, and the wireless communication network is in signal connection with the user side subsystem.
Example 2
On the basis of the previous embodiment, the data conversion unit converts the analog data received from the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit into digital data and sends the digital data to the judging unit; the judging unit compares the received digital data from each of the smoke, temperature, humidity and infrared sensors with a preset threshold to decide whether each value is abnormal, and sends a signal to the image processing unit when the number of abnormal values is greater than or equal to 3.
Specifically, by checking the data from the smoke, temperature, humidity and infrared sensors for abnormality before starting image monitoring, the invention ensures that image monitoring does not run when it is not needed, which further improves the efficiency of the system.
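The voting rule in this example (start image monitoring only when at least three of the four sensor readings are abnormal) can be sketched as follows. The threshold values and the low-humidity convention are illustrative assumptions, not figures from the patent.

```python
# Sketch of the judging unit's voting rule: image monitoring starts only
# when >= 3 of the 4 digitized sensor readings exceed their preset thresholds.
# All threshold values below are illustrative assumptions.
THRESHOLDS = {
    "smoke": 0.10,        # obscuration fraction
    "temperature": 55.0,  # degrees Celsius
    "humidity": 20.0,     # percent RH; LOW humidity treated as abnormal
    "infrared": 0.60,     # normalized IR intensity
}

def is_abnormal(sensor: str, value: float) -> bool:
    """Return True if a single digitized reading violates its threshold."""
    if sensor == "humidity":
        return value < THRESHOLDS["humidity"]  # dry air is the fire indicator
    return value > THRESHOLDS[sensor]

def should_start_image_monitoring(readings: dict) -> bool:
    """Judging unit: count abnormal values and signal the image processing
    unit when the number of abnormal values is >= 3."""
    abnormal = sum(is_abnormal(s, v) for s, v in readings.items())
    return abnormal >= 3
```

For example, readings of heavy smoke, 70 °C, 10 % RH and strong IR trip three or more checks and start the image pipeline, while a single out-of-range value does not.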
Example 3
On the basis of the previous embodiment, the image sensor is used for acquiring original image information; the image processing unit includes: the edge detection unit is used for receiving the original image information sent by the data transmission device, carrying out edge detection on the original image and generating an intermediate image after the edge detection; the flame area identification unit judges and identifies a flame area in the intermediate image after the edge detection, intercepts the flame area in the intermediate image and generates a flame area image; and the fire early warning unit judges the severity of the fire according to the generated flame area image and the flame image sample recorded into the fire early warning unit, and performs early warning.
Example 4
On the basis of the above embodiment, the edge detection unit includes: an edge detection template generating unit for generating a convolution template for edge detection; a template moving unit for moving the convolution template in the image in a raster scanning manner; the convolution unit is used for performing convolution operation on the coefficient of the convolution template and the corresponding pixel under the template; and the intermediate image generating unit is used for generating an intermediate image after edge detection according to the operation result of the convolution unit.
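A minimal sketch of the edge detection unit: a 3 × 3 convolution template is moved over the image in raster order and its coefficients are multiplied with the pixels under it. The patent does not disclose the template coefficients, so a Laplacian kernel is used here purely as an example.

```python
# Raster-scan 3x3 convolution for edge detection.
# The Laplacian coefficients are an illustrative choice; the patent does
# not specify the actual template.
LAPLACIAN = [[0, 1, 0],
             [1, -4, 1],
             [0, 1, 0]]

def convolve3x3(image, template=LAPLACIAN):
    """g(x, y) = f(x, y) * h(x, y): slide the template over the interior
    pixels in raster order, summing coefficient * pixel products."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):          # raster scan, skipping the border
        for x in range(1, w - 1):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    acc += template[dy + 1][dx + 1] * image[y + dy][x + dx]
            out[y][x] = abs(acc)       # edge magnitude for the intermediate image
    return out
```

On a uniform image the response is zero everywhere; a vertical step between gray levels produces nonzero responses along the boundary, which is exactly what the intermediate image records.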
Example 5
On the basis of the above embodiment, the flame region identification unit includes: an area change rate calculation unit for calculating an area change of each region in the intermediate image after the edge detection; a circularity calculation unit configured to calculate a circularity of each region in the intermediate image after the edge detection; and the flame region identification unit is used for identifying the flame region in the intermediate image according to the results calculated by the circularity calculation unit and the area change rate calculation unit.
Specifically, in the intermediate image after edge detection, the flame region is judged and identified as follows. Define the area change rate as

AR = |A(n+1) − A(n)| / (A(n) + eps)

where AR is the area change rate of a highlight region between adjacent intermediate images, A(n) is the area of the suspicious region in the current intermediate image, A(n+1) is the area of the suspicious region in the next intermediate image, and eps is a small preset constant that prevents division by zero. The area change rate of flame lies in the range 0.1–0.4, the area change rate of a fixed light source is close to 0, and the area change rate of a rapidly flashing object is close to 1. The circularity is defined as

Ck = 4πAk / Pk²

where Ck is the circularity of the primitive numbered k; Pk is the perimeter of the k-th primitive, i.e. the boundary length of the suspicious primitive, obtained from its boundary chain code, in which a horizontal or vertical step has unit length 1 and a diagonal step has length √2; Ak is the area of the k-th primitive, obtained by counting the bright points in the suspicious primitive for a grayscale image, or the pixels with value 1 for a binary image; n is the number of suspicious flame primitives in the image. The circularity of a flame region is less than 0.5 (a perfect circle has circularity 1). The flame region in the intermediate image is then judged and identified from the computed area change rate and circularity.
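The two discriminant features above can be computed directly. The sketch below is a minimal illustration, assuming each suspicious region is given by its pixel area and an 8-direction Freeman chain code of its boundary (even codes = horizontal/vertical steps of length 1, odd codes = diagonal steps of length √2); the absolute value in the area change rate is an assumption, since the patent's formula image is not reproduced in the text.

```python
import math

def chain_code_perimeter(codes):
    """Perimeter Pk from an 8-direction Freeman chain code: horizontal and
    vertical steps (even codes) count 1, diagonal steps (odd codes) sqrt(2)."""
    return sum(1.0 if c % 2 == 0 else math.sqrt(2) for c in codes)

def circularity(area, perimeter):
    """Ck = 4*pi*Ak / Pk^2: equals 1 for a perfect circle and is below 0.5
    for the ragged boundary of a flame region."""
    return 4 * math.pi * area / (perimeter ** 2)

def area_change_rate(a_now, a_next, eps=1e-6):
    """AR = |A(n+1) - A(n)| / (A(n) + eps); the absolute value is an
    assumption about the lost formula image."""
    return abs(a_next - a_now) / (a_now + eps)

def is_flame_region(a_now, a_next, area, chain_code):
    """Patent criteria: 0.1 <= AR <= 0.4 and circularity < 0.5."""
    ar = area_change_rate(a_now, a_next)
    ck = circularity(area, chain_code_perimeter(chain_code))
    return 0.1 <= ar <= 0.4 and ck < 0.5
```

A compact 10 × 10 square (area 100, perimeter 40) has circularity π/4 ≈ 0.79 and is rejected, while a region of the same area with a ragged perimeter of 60 falls below 0.5 and, if its area also flickers by 10–40 % between frames, is flagged as flame.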
Example 6
On the basis of the above embodiment, the fire early warning unit comprises: a training set of recorded flame image samples; and a deep learning neural network for training on the training set, establishing a deep learning model, and judging the fire severity of the flame region image according to the model.
Example 7
An indoor monitoring method based on image recognition, the method executes the following steps:
step S1: judging whether an abnormality occurs according to the smoke data, the temperature data, the humidity data and the infrared data monitored by each sensor, and starting an image monitoring program if the abnormality occurs;
step S2: after an image monitoring program is started, acquiring original image information;
step S3: carrying out edge detection on the original image to generate an intermediate image after edge detection;
step S4: judging and identifying a flame region in the intermediate image after the edge detection, intercepting the flame region in the intermediate image, and generating a flame region image;
step S5: and judging the severity of the fire according to the generated flame area image and the flame image sample recorded into the fire early warning unit, and early warning.
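Steps S1–S5 above compose into a single monitoring cycle. The sketch below shows only the wiring; every stage function is a stand-in for the corresponding unit and would be replaced by the real sensor sampling, edge detection, flame identification and deep learning components.

```python
# End-to-end sketch of steps S1-S5. Each stage is a stub standing in for
# the corresponding unit; the wiring, not the algorithms, is the point.
def monitor_once(sensor_readings, capture_image, detect_edges,
                 find_flame_regions, assess_severity, abnormal_count):
    """Run one monitoring cycle; return a severity label or None."""
    # S1: start image monitoring only if >= 3 sensor values are abnormal
    if abnormal_count(sensor_readings) < 3:
        return None
    raw = capture_image()                # S2: acquire original image
    edges = detect_edges(raw)            # S3: edge-detected intermediate image
    flames = find_flame_regions(edges)   # S4: flame region images
    if not flames:
        return None
    return assess_severity(flames)       # S5: fire severity + early warning
```

Passing in trivial lambdas demonstrates the control flow: fewer than three abnormal readings short-circuits the cycle without touching the camera, matching the efficiency argument in Example 2.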
Example 8
On the basis of the above embodiment, in step S3, receiving the original image information, performing edge detection on the original image and generating the intermediate image after edge detection comprises the following steps:
step S3.1: generating a convolution template for edge detection, the convolution template being a 3 × 3 template;
step S3.2: performing the convolution operation g(x, y) = f(x, y) * h(x, y) between the convolution template of step S3.1 and the original image, where g(x, y) is the generated intermediate image, f(x, y) is the original image, h(x, y) is the convolution template, and * denotes two-dimensional convolution.
Example 9
On the basis of the above embodiment, in step S5, judging the severity of the fire according to the generated flame region image and the flame image samples recorded into the fire early warning unit, and issuing an early warning, comprises the following steps:
step S5.1: inputting existing flame region images;
step S5.2: processing the existing flame region images to generate a plurality of training samples;
step S5.3: performing image recognition training based on deep learning with the training samples to generate a training model;
step S5.4: detecting, by using the training model, whether the flame region image to be detected contains the trained flame pattern.
Example 10
An indoor monitoring device based on image recognition, the device being a non-transitory computer-readable storage medium storing computing instructions that comprise: a code segment for judging whether an abnormality occurs according to the smoke data, temperature data, humidity data and infrared data monitored by each sensor, and starting an image monitoring program if an abnormality occurs; a code segment for acquiring original image information after the image monitoring program is started; a code segment for performing edge detection on the original image to generate an intermediate image after edge detection; a code segment for judging and identifying the flame region in the intermediate image after edge detection, intercepting the flame region in the intermediate image and generating a flame region image; and a code segment for judging the severity of the fire according to the generated flame region image and the flame image samples recorded into the fire early warning unit, and issuing an early warning.
The above description covers embodiments of the present invention and is not intended to limit its scope; any structural change made in accordance with the present invention, without departing from its spirit, shall fall within the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process and related description of the system described above may refer to the corresponding process in the foregoing method embodiments, and will not be described herein again.
It should be noted that, the system provided in the foregoing embodiment is only illustrated by dividing the functional modules, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the modules or steps in the embodiment of the present invention are further decomposed or combined, for example, the modules in the foregoing embodiment may be combined into one module, or may be further split into multiple sub-modules, so as to complete all or part of the functions described above. The names of the modules and steps involved in the embodiments of the present invention are only for distinguishing the modules or steps, and are not to be construed as unduly limiting the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and related descriptions of the storage device and the processing device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those skilled in the art will appreciate that the illustrative modules and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both, and that programs corresponding to the software modules or method steps may reside in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of electronic hardware and software, the illustrative components and steps above have been described in terms of their functionality. Whether the functionality is implemented as electronic hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as departing from the scope of the present invention.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing or implying a particular order or sequence.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (5)

1. An indoor monitoring system based on image recognition, characterized in that the system comprises: a field monitoring subsystem, a wireless communication network and a user side subsystem; the field monitoring subsystem comprises at least: a smoke sensor, a temperature sensor, a humidity sensor, an image sensor, an infrared sensor and a microcontroller; the microcontroller comprises: a data conversion unit, an image processing unit and a judging unit; the judging unit is in signal connection with the data conversion unit; the data conversion unit is in signal connection with the smoke sensor, the temperature sensor, the humidity sensor, the infrared sensor and the image processing unit respectively; the image processing unit is in signal connection with the image sensor; the judging unit evaluates the received data according to preset conditions and, when an abnormality is judged, sends a signal to the image processing unit, whereupon the image processing unit starts the image sensor to acquire image information; the image processing unit is in signal connection with the wireless communication network; the wireless communication network is in signal connection with the user side subsystem;
the data conversion unit converts the analog data received from the smoke sensor, the temperature sensor, the humidity sensor, and the infrared sensor into digital data and sends the digital data to the judging unit; according to the received digital data and preset thresholds, the judging unit judges whether the data acquired by each of the smoke sensor, the temperature sensor, the humidity sensor, and the infrared sensor is abnormal, and sends a signal to the image processing unit when the number of abnormal values is greater than or equal to 3;
the image sensor is used for acquiring original image information; the image processing unit comprises: an edge detection unit for receiving the original image information sent by the data transmission device, performing edge detection on the original image, and generating an edge-detected intermediate image; a flame region identification unit for identifying the flame region in the edge-detected intermediate image, cropping the flame region from the intermediate image, and generating a flame region image; and a fire early-warning unit for judging the severity of the fire according to the generated flame region image and the flame image samples input into the fire early-warning unit, and issuing an early warning;
the edge detection unit comprises: an edge detection template generating unit for generating a convolution template for edge detection; a template moving unit for moving the convolution template across the image in raster-scan order; a convolution unit for performing a convolution operation between the coefficients of the convolution template and the corresponding pixels under the template; and an intermediate image generating unit for generating the edge-detected intermediate image from the results of the convolution unit;
the flame region identification unit comprises: an area change rate calculation unit for calculating the area change of each region in the edge-detected intermediate image; a circularity calculation unit for calculating the circularity of each region in the edge-detected intermediate image; and a flame region identification unit for identifying the flame region in the intermediate image based on the results calculated by the circularity calculation unit and the area change rate calculation unit;
in the edge-detected intermediate image, the method for identifying the flame region is as follows: define the area change rate as

AR = |A(n+1) - A(n)| / (A(n) + eps)

where AR represents the area change rate of a highlighted region between adjacent intermediate images; A(n) represents the area of the suspicious region in the current intermediate image; A(n+1) represents the area of the suspicious region in the next intermediate image; and eps is a preset minimum value;
the circularity is defined as

Ck = 4πAk / Pk²

where Ck represents the circularity of the primitive numbered k, Ak is the area of the k-th graphic primitive, and Pk is the perimeter of the k-th graphic primitive;
the fire early-warning unit comprises: a training set recording the flame image samples; and a deep learning neural network for training on the training set, establishing a deep learning model, and judging the fire severity of the flame region image according to the model.
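The two region features used by the flame region identification unit can be sketched as follows. Because the claim's formula images are not reproduced in this text, the sketch assumes the standard definitions consistent with the variables listed: AR computed from A(n), A(n+1), and eps, and circularity as 4πA/P². The decision thresholds `ar_min` and `c_max` are illustrative assumptions, not values from the patent.

```python
import math

def area_change_rate(area_n, area_n1, eps=1e-6):
    # AR = |A(n+1) - A(n)| / (A(n) + eps); eps guards against division by zero
    return abs(area_n1 - area_n) / (area_n + eps)

def circularity(area_k, perimeter_k):
    # Ck = 4*pi*Ak / Pk^2; equals 1 for a perfect circle and falls toward 0
    # for jagged outlines such as flame contours
    return 4.0 * math.pi * area_k / (perimeter_k ** 2)

def looks_like_flame(areas, perimeter, ar_min=0.05, c_max=0.6):
    # Flag a region as flame when its area flickers between adjacent frames
    # (high AR) and its outline is irregular (low circularity)
    ar = area_change_rate(areas[0], areas[1])
    c = circularity(areas[1], perimeter)
    return ar > ar_min and c < c_max
```

A flickering, ragged region passes both tests, while a static near-circular highlight (e.g. a lamp) is rejected by either check.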
2. An indoor monitoring method based on image recognition, using the system of claim 1, characterized in that the method performs the following steps:
step S1: judging whether an abnormality occurs according to the smoke data, the temperature data, the humidity data and the infrared data monitored by each sensor, and starting an image monitoring program if the abnormality occurs;
step S2: after the image monitoring program is started, acquiring original image information;
step S3: performing edge detection on the original image to generate an edge-detected intermediate image;
step S4: identifying the flame region in the edge-detected intermediate image, cropping the flame region from the intermediate image, and generating a flame region image;
in the edge-detected intermediate image, the method for identifying the flame region is as follows: define the area change rate as

AR = |A(n+1) - A(n)| / (A(n) + eps)

where AR represents the area change rate of a highlighted region between adjacent intermediate images; A(n) represents the area of the suspicious region in the current intermediate image; A(n+1) represents the area of the suspicious region in the next intermediate image; and eps is a preset minimum value;
the circularity is defined as

Ck = 4πAk / Pk²

where Ck represents the circularity of the primitive numbered k, Ak is the area of the k-th graphic primitive, and Pk is the perimeter of the k-th graphic primitive;
step S5: judging the severity of the fire according to the generated flame region image and the flame image samples recorded in the fire early-warning unit, and issuing an early warning.
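Steps S1 to S5 can be sketched as a single monitoring pass. The "at least 3 abnormal values" trigger and the sensor set follow claim 1; the numeric thresholds are illustrative assumptions, and the image-processing stages are injected as callables so the sketch stays self-contained.

```python
# Illustrative thresholds only; the patent leaves the preset values unspecified.
THRESHOLDS = {"smoke": 0.14, "temperature": 57.0, "humidity": 20.0, "infrared": 0.8}

def count_abnormal(readings, thresholds=THRESHOLDS):
    # Humidity is treated as abnormal when it drops below its threshold
    # (dry air); the other readings when they rise above theirs.
    abnormal = 0
    for name, value in readings.items():
        if name == "humidity":
            abnormal += value < thresholds[name]
        else:
            abnormal += value > thresholds[name]
    return abnormal

def monitor_step(readings, grab_frame, edge_detect, find_flame, assess_fire):
    """One pass of steps S1-S5; returns None when no abnormality is found."""
    if count_abnormal(readings) < 3:     # S1: trigger only on >= 3 abnormal values
        return None
    frame = grab_frame()                 # S2: acquire the original image
    edges = edge_detect(frame)           # S3: edge-detected intermediate image
    flame = find_flame(edges)            # S4: crop the flame region
    return assess_fire(flame)            # S5: severity level for early warning
```

Gating the camera behind the cheap sensor vote keeps the image pipeline idle in the common no-fire case.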
3. The method according to claim 2, wherein in step S2, the method of receiving the original image information, performing edge detection on the original image, and generating the edge-detected intermediate image performs the following steps:
step S2.1: generating a convolution template for edge detection, the convolution template being a 3 × 3 template;
step S2.2: the following formula is used:
Figure DEST_PATH_IMAGE006
(ii) a Performing convolution operation on the convolution template and the original image in the step S2.1; wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE008
in order to generate the intermediate image(s),
Figure DEST_PATH_IMAGE010
in order to be the original image, the image is processed,
Figure DEST_PATH_IMAGE012
is a convolution template.
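The 3 × 3 convolution of step S2.2 can be sketched directly; this is a generic valid-region convolution in raster-scan order, with no border handling, and the Laplacian template used in the usage note is a common edge-detection choice, not one specified by the patent.

```python
def convolve3x3(image, template):
    """g(x, y): slide a 3x3 template over the image in raster-scan order,
    multiplying template coefficients with the pixels under the template
    and summing (valid region only, so output shrinks by 2 in each axis)."""
    h, w = len(image), len(image[0])
    out = [[0] * (w - 2) for _ in range(h - 2)]
    for y in range(h - 2):
        for x in range(w - 2):
            acc = 0
            for j in range(3):          # template rows
                for i in range(3):      # template columns
                    acc += image[y + j][x + i] * template[j][i]
            out[y][x] = acc
    return out
```

With a Laplacian template such as `[[0, 1, 0], [1, -4, 1], [0, 1, 0]]`, uniform regions map to zero and intensity steps produce nonzero responses, which is the edge map the intermediate image generating unit consumes.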
4. The method according to claim 3, wherein in step S5, the method of judging the severity of the fire according to the generated flame region image and the flame image samples recorded in the fire early-warning unit, and issuing an early warning, performs the following steps:
step S4.1: inputting existing flame region images;
step S4.2: processing the existing flame region images to generate a plurality of training samples;
step S4.3: performing, by the training unit, deep-learning-based image recognition training using the plurality of training samples to generate a training model;
step S4.4: using the training model to detect whether the flame region image to be detected matches the training samples.
5. An indoor monitoring device based on image recognition, implementing the method of any one of claims 2 to 4, wherein the device is a non-transitory computer-readable storage medium storing computing instructions, the computing instructions comprising: a code segment for judging whether an abnormality occurs according to the smoke data, temperature data, humidity data, and infrared data monitored by the sensors, and starting an image monitoring program if an abnormality occurs; a code segment for acquiring original image information after the image monitoring program is started; a code segment for performing edge detection on the original image to generate an edge-detected intermediate image; a code segment for identifying the flame region in the edge-detected intermediate image, cropping the flame region, and generating a flame region image; and a code segment for judging the severity of the fire according to the generated flame region image and the flame image samples recorded in the fire early-warning unit, and issuing an early warning.
CN201910731493.9A 2019-08-08 2019-08-08 Indoor monitoring system, method and device based on image recognition Active CN110428579B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910731493.9A CN110428579B (en) 2019-08-08 2019-08-08 Indoor monitoring system, method and device based on image recognition


Publications (2)

Publication Number Publication Date
CN110428579A CN110428579A (en) 2019-11-08
CN110428579B true CN110428579B (en) 2022-01-18

Family

ID=68413382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910731493.9A Active CN110428579B (en) 2019-08-08 2019-08-08 Indoor monitoring system, method and device based on image recognition

Country Status (1)

Country Link
CN (1) CN110428579B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111739248B (en) * 2020-06-11 2022-04-01 湖北美和易思教育科技有限公司 Artificial intelligent Internet of things security system and control method
CN111899458A (en) * 2020-07-27 2020-11-06 山东工商学院 Artificial intelligence-based fire smoke image identification method
CN112185051A (en) * 2020-09-27 2021-01-05 广州华安消防有限公司 Intelligent building and installation method of fire-fighting power machine thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063704A (en) * 2010-11-19 2011-05-18 中国航空无线电电子研究所 Airborne vision enhancement method and device
CN103778418A (en) * 2014-01-28 2014-05-07 华南理工大学 Mountain fire image identification method of image monitoring system of electric transmission line tower
CN203931066U (en) * 2014-07-09 2014-11-05 成都远航科技有限公司 A kind of warehouse abnormal conditions initiative alarming device
CN105931409A (en) * 2016-05-30 2016-09-07 重庆大学 Infrared and visible light camera linkage-based forest fire monitoring method
WO2019026518A1 (en) * 2017-08-04 2019-02-07 モリタ宮田工業株式会社 Fire identification device
CN109522819A (en) * 2018-10-29 2019-03-26 西安交通大学 A kind of fire image recognition methods based on deep learning

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101888709B (en) * 2009-05-11 2013-01-23 北京融商惠通投资有限公司 Wireless sensor system and wireless sensor device
US9064394B1 (en) * 2011-06-22 2015-06-23 Alarm.Com Incorporated Virtual sensors
CN109829894B (en) * 2019-01-09 2022-04-26 平安科技(深圳)有限公司 Segmentation model training method, OCT image segmentation method, device, equipment and medium
CN109884905A (en) * 2019-02-14 2019-06-14 太仓怡泰霖智能科技有限公司 A kind of smart home management platform and its smart home monitoring control system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Liu Baoxin

Inventor after: Song Xinfang

Inventor after: Feng Xianwu

Inventor before: Feng Xianwu

TA01 Transfer of patent application right

Effective date of registration: 20211230

Address after: No. 265, Cao Guan Zhuang village, Lingang sub district office, Licheng District, Jinan City, Shandong Province

Applicant after: Liu Baoxin

Address before: 510000 Xiangjiang, feicui oasis, Zengcheng District, Guangzhou City, Guangdong Province

Applicant before: Feng Xianwu

GR01 Patent grant