CN109377711B - Fire hazard classification method and device - Google Patents

Fire hazard classification method and device

Info

Publication number
CN109377711B
CN109377711B
Authority
CN
China
Prior art keywords
fire
fire occurrence
occurrence position
pixel
points
Prior art date
Legal status
Active
Application number
CN201811095932.3A
Other languages
Chinese (zh)
Other versions
CN109377711A (en)
Inventor
Li Lin (李琳)
Current Assignee
SHANGHAI HUAGONG SECURITY TECHNOLOGY SERVICE Co.,Ltd.
Original Assignee
Shanghai Huagong Security Technology Service Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Huagong Security Technology Service Co ltd filed Critical Shanghai Huagong Security Technology Service Co ltd
Priority to CN201811095932.3A priority Critical patent/CN109377711B/en
Publication of CN109377711A publication Critical patent/CN109377711A/en
Application granted granted Critical
Publication of CN109377711B publication Critical patent/CN109377711B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00Fire alarms; Alarms responsive to explosion
    • G08B17/12Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions

Abstract

The invention provides a fire hazard classification method and device. The method comprises: acquiring a spectral image set of a target area, wherein the number of spectral images in the set is not less than 100; performing fire occurrence location point analysis on each spectral image and extracting the fire occurrence location points in it; taking each pixel point as the research unit, counting the number of times the pixel point is identified as a fire occurrence location point across the analysis results of the spectral image set; and performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point. The invention originally proposes a fire occurrence location point extraction algorithm, a statistical method for counting fire occurrence location points, and a fire hazard assessment algorithm that takes the pattern spot as its unit, thereby obtaining a more scientific evaluation of the fire hazard risk of each pattern spot.

Description

Fire hazard classification method and device
Technical Field
The invention relates to the field of fire prevention and control, in particular to a fire hazard classification method and a fire hazard classification device.
Background
A fire is a catastrophic combustion event that is out of control in time or space. Among the various disasters, fire is one of the main disasters that most frequently and most generally threaten public safety and social development. The ability to use and control fire is an important mark of the progress of civilization; the history of humanity's use of fire is therefore also a history of its struggle against fire. While using fire, people have continuously summarized the laws governing fire occurrence so as to reduce fires, and the harm they cause, as much as possible. People also need to escape safely and quickly when a fire does occur.
For better fire prevention and control, it is necessary to perform an evaluation of the degree of risk of fire and the probability of occurrence of the risk.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a fire hazard classification method and apparatus.
The invention is realized by the following technical scheme:
a method of fire hazard classification, the method comprising:
acquiring a spectral image set of a target area, wherein the number of spectral images in the spectral image set is not less than 100;
analyzing the fire occurrence position points of each spectral image, and extracting the fire occurrence position points in the spectral images;
counting the number of times each pixel point is a fire occurrence location point: taking each pixel point as the research unit, counting the number of times the pixel point is identified as a fire occurrence location point across the analysis results of the spectral image set;
and performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point.
Further, the plurality of spectral images have the same specification, that is, the same length, width and resolution, and all correspond to the same target region.
Further, spectral images were obtained using MODIS.
Further, a statistical result of the spectral image and a preset judgment condition set are obtained, and fire occurrence location points are detected according to the relationship between the statistical result and the judgment condition set.
A fire hazard classification apparatus comprising:
the spectral image set acquisition module is used for acquiring a spectral image set of a target area;
the fire occurrence position extraction module is used for analyzing the fire occurrence position of each spectral image and extracting the fire occurrence position in the spectral images;
the counting module is used for counting the number of times each pixel is a fire occurrence location point;
and the grading module is used for performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on those shown in the drawings; they are used merely for convenience and simplicity of description, do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and are therefore not to be construed as limiting the invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the invention, "a plurality" means two or more unless otherwise specified.
In the description of the invention, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intervening medium, or internal between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
The invention has the beneficial effects that:
the invention provides a fire hazard classification method and device which, by originally proposing a fire occurrence location point extraction algorithm, a statistical method for counting fire occurrence location points, and a fire hazard assessment algorithm taking the pattern spot as its unit, obtain a scientific evaluation of the fire hazard of each pattern spot.
Drawings
FIG. 1 is a flow chart of a fire risk classification method according to the present embodiment;
FIG. 2 is a flowchart of a method for extracting a fire occurrence location point in a spectral image according to the present embodiment;
fig. 3 is a flowchart of a method for extracting a fire occurrence location point according to the present embodiment;
FIG. 4 is a flowchart of a fire risk classification method for a target area according to the number of times each pixel coincides with a fire occurrence location according to the present embodiment;
FIG. 5 is a block diagram of a fire hazard classification apparatus according to the present embodiment;
fig. 6 is a block diagram of a fire occurrence location point extraction module provided in the present embodiment;
fig. 7 is a block diagram of a fire occurrence location point extraction unit provided in the present embodiment;
fig. 8 is a block diagram of a grading module provided in the present embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below.
An embodiment of the present invention provides a fire risk classification method, as shown in fig. 1, including:
s101, obtaining a spectral image set of a target area, wherein the spectral image set comprises a plurality of spectral images.
Specifically, the plurality of spectral images have the same specification, i.e., the same length, width and resolution, and all correspond to the same target region. The greater the number of spectral images, the more accurate the fire hazard classification; in the embodiment of the invention, the number of spectral images is not less than 100.
And S102, carrying out fire occurrence position point analysis on each spectral image, and extracting fire occurrence position points in the spectral images.
S103, counting the number of times each pixel is a fire occurrence location point.
Taking each pixel point as the unit, the number of times the pixel point coincides with a fire occurrence location point, i.e., the number of fire occurrences at that pixel, is counted.
Specifically, each pixel point is taken as the research unit, and the number of times the pixel point is identified as a fire occurrence location point is counted across the analysis results of the spectral image set.
S104, performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point.
Obviously, the more times, the greater the probability of fire at the position of the pixel point.
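The counting in steps S103 and S104 can be sketched as follows (an illustrative Python/NumPy sketch, not part of the patent; the function name and the array layout are assumptions):

```python
import numpy as np

def fire_count_per_pixel(fire_masks):
    """Count, per pixel, how many spectral images flagged that pixel as a
    fire occurrence location point.

    fire_masks: iterable of equal-shape boolean arrays, one per spectral
    image (True where the pixel was extracted as a fire point).
    """
    masks = np.stack(list(fire_masks))  # shape: (n_images, H, W)
    return masks.sum(axis=0)            # per-pixel occurrence count

# toy example: three 2x2 "images" of the same target area
m1 = np.array([[True, False], [False, False]])
m2 = np.array([[True, True],  [False, False]])
m3 = np.array([[True, False], [False, True]])
counts = fire_count_per_pixel([m1, m2, m3])
# counts[0, 0] == 3: that pixel was a fire point in all three images
```

The resulting count map is exactly the per-pixel statistic that the later grading step aggregates over pattern spots.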
Specifically, in the embodiment of the present invention, the spectral images are obtained using MODIS. MODIS stands for Moderate Resolution Imaging Spectroradiometer. It is an important sensor carried on the Terra and Aqua satellites: a satellite-borne instrument that directly broadcasts its real-time observation data worldwide over the X band, so that the data can be received and used free of charge; MODIS data are received and used in many countries and regions around the world.
Specifically, the performing of the fire occurrence location point analysis on each spectral image and extracting the fire occurrence location point in the spectral image as shown in fig. 2 includes:
and S1021, extracting suspected fire occurrence position points.
Let T4 and T11 be the brightness temperatures of the pixel at 4 μm and 11 μm, respectively. If the pixel satisfies the following condition:
in the daytime: T4 > 315K AND T4 - T11 > 15K
or,
at night: T4 > 305K AND T4 - T11 > 15K
it is determined to be a suspected fire occurrence location point.
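The suspected-point test above can be sketched as follows (illustrative Python, not part of the patent; the function name and the vectorized NumPy formulation are assumptions):

```python
import numpy as np

# thresholds from the description, in kelvin
DAY_T4, NIGHT_T4, DELTA = 315.0, 305.0, 15.0

def suspected_fire_mask(t4, t11, daytime):
    """Boolean mask of suspected fire occurrence location points.

    t4, t11: brightness-temperature arrays (K) at 4 um and 11 um.
    daytime: True for daytime scenes, False for nighttime scenes.
    """
    abs_thresh = DAY_T4 if daytime else NIGHT_T4
    return (t4 > abs_thresh) & (t4 - t11 > DELTA)

t4 = np.array([320.0, 310.0])
t11 = np.array([300.0, 300.0])
mask = suspected_fire_mask(t4, t11, daytime=True)
# pixel 0: 320 > 315 and 20 > 15 -> suspected; pixel 1: 310 <= 315 -> not
```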
And S1022, extracting fire occurrence position points.
Specifically, the extracting of the fire occurrence location point, as shown in fig. 3, includes the steps of:
s10221, taking a suspected fire occurrence position as a center, establishing a square observation window, and performing statistical analysis based on temperature characteristics on pixels in the observation window.
Pixels in the observation window that simultaneously satisfy the following conditions:
in the daytime: T4 > 330K AND T4 - T11 > 15K
or,
at night: T4 > 315K AND T4 - T11 > 15K
are marked as pre-judged fire occurrence location points, and the other pixels in the observation window are marked as pixels to be counted. Taking the 4 μm brightness temperature as the first brightness temperature and the 11 μm brightness temperature as the second brightness temperature, denote T̄4, δ4, T̄11, δ11, ΔT̄ and δΔT respectively as:
the mean of the first brightness temperature of the pixels to be counted;
the mean absolute deviation of the first brightness temperature of the pixels to be counted;
the mean of the second brightness temperature of the pixels to be counted;
the mean absolute deviation of the second brightness temperature of the pixels to be counted;
the mean of the difference between the first and the second brightness temperature of the pixels to be counted;
the mean absolute deviation of the difference between the first and the second brightness temperature of the pixels to be counted.
Also denote T̄4′ and δ4′ respectively as the mean and the mean absolute deviation of the first brightness temperature of the pre-judged fire occurrence location pixels in the observation window.
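The background statistics above (means and mean absolute deviations over the pixels to be counted) can be sketched as follows (illustrative Python; the function name and the flattened 1-D window layout are assumptions):

```python
import numpy as np

def window_stats(values, prejudged):
    """Mean and mean absolute deviation over the 'pixels to be counted'.

    values: 1-D array of brightness temperatures inside the observation window.
    prejudged: boolean mask marking pre-judged fire occurrence pixels, which
    are excluded from the background statistics.
    """
    background = values[~prejudged]
    mean = background.mean()
    mad = np.abs(background - mean).mean()  # mean absolute deviation
    return mean, mad

vals = np.array([300.0, 302.0, 304.0, 350.0])
pre = np.array([False, False, False, True])  # the 350 K pixel is pre-judged fire
mean, mad = window_stats(vals, pre)
# mean of the three background pixels is 302.0; mad is (2 + 0 + 2) / 3
```

Applying the same function to the 4 μm values, the 11 μm values, and their difference yields T̄4/δ4, T̄11/δ11 and ΔT̄/δΔT; applying it to the pre-judged pixels' 4 μm values yields T̄4′/δ4′.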
S10222, acquiring a judgment condition set.
Specifically, the set of judgment conditions includes the following judgment conditions, where ΔT = T4 - T11 for the suspected pixel:
Jug1: in the daytime: T4 > 365K, or at night: T4 > 330K
Jug2: ΔT > ΔT̄ + 3.5δΔT
Jug3: ΔT > ΔT̄ + 6K
Jug4: T4 > T̄4 + 3δ4
Jug5: T11 > T̄11 + δ11 - 4K
Jug6: δ4′ > 5.5K
S10223, acquiring the degree to which the results of the statistical analysis satisfy the conditions in the judgment condition set, and identifying the fire occurrence location points.
Specifically, the fire occurrence location point identification algorithm is as follows:
if Jug1 OR ((Jug2 AND Jug3 AND Jug4) AND (Jug5 OR Jug6)) OR (Jug2 AND Jug3 AND Jug4) is true, the suspected fire occurrence location point is identified as a fire occurrence location point.
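The identification rule can be expressed directly as boolean logic (illustrative Python, not part of the patent; the dict-based interface is an assumption):

```python
def is_fire_point(jug):
    """Apply the rule
    Jug1 OR ((Jug2 AND Jug3 AND Jug4) AND (Jug5 OR Jug6)) OR (Jug2 AND Jug3 AND Jug4)
    to a dict of the six boolean test results."""
    contextual = jug["Jug2"] and jug["Jug3"] and jug["Jug4"]
    return jug["Jug1"] or (contextual and (jug["Jug5"] or jug["Jug6"])) or contextual

# a candidate failing the absolute test Jug1 but passing the contextual tests
result = is_fire_point({"Jug1": False, "Jug2": True, "Jug3": True,
                        "Jug4": True, "Jug5": False, "Jug6": False})
```

Note that, taken verbatim, the trailing OR (Jug2 AND Jug3 AND Jug4) term subsumes the (Jug5 OR Jug6) clause; the sketch follows the rule exactly as written.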
Further, the fire hazard classification of the target area according to the number of times each pixel point coincides with a fire occurrence location point, as shown in fig. 4, includes:
s1041, acquiring an image map of the target area.
S1042, calculating the sum of the times of fire occurrence position points corresponding to the pixels included in each pattern spot in the image map.
S1043, calculating fire danger values of all the pattern spots in the image map.
Pixels in the same pattern spot belong to the same ground object, and a fire within the ground object can spread to the whole of it; therefore, the larger the pattern spot, the greater the fire risk. Specifically, in the embodiment of the invention, the fire risk value of a pattern spot equals the sum of its fire occurrence location point counts multiplied by the area of the pattern spot's minimum circumscribed rectangle.
S1044, obtaining the fire hazard grade of the pattern spots according to the fire hazard values of the pattern spots.
Specifically, a boundary threshold value of each fire risk level may be set, and the fire risk level of the pattern spot may be obtained according to a corresponding relationship between the boundary threshold value and the fire risk value. The boundary threshold may be set manually, and is not particularly limited in the embodiment of the present invention.
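The patch-level danger value and grading described in S1042–S1044 can be sketched as follows (illustrative Python; the boundary thresholds below are placeholders, since the patent leaves them to be set manually):

```python
import bisect

def patch_danger_value(fire_counts, bbox_area):
    """Fire danger value of a pattern spot: the sum of the per-pixel
    fire-occurrence counts inside the spot, multiplied by the area of its
    minimum circumscribed rectangle (as defined in the description)."""
    return sum(fire_counts) * bbox_area

def danger_grade(value, boundaries=(10, 100, 1000)):
    """Map a danger value to a grade 0..len(boundaries) using manually set
    boundary thresholds (these example boundaries are illustrative only)."""
    return bisect.bisect_right(boundaries, value)

v = patch_danger_value(fire_counts=[3, 1, 0, 1], bbox_area=25)  # 5 * 25 = 125
g = danger_grade(v)  # 125 falls between the 100 and 1000 boundaries
```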
A fire hazard classification apparatus according to an embodiment of the present invention, as shown in fig. 5, includes:
the spectral image set acquisition module 201 is configured to acquire a spectral image set of a target area, where the spectral image set includes a plurality of spectral images;
a fire occurrence location point extraction module 202, configured to perform fire occurrence location point analysis on each spectral image, and extract a fire occurrence location point in the spectral image;
the counting module 203 is used for counting the number of times each pixel is a fire occurrence location point;
and the grading module 204 is used for performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point.
Wherein the statistical module 203 and the classification module 204 constitute a fire risk classification module.
The fire occurrence location point extraction module 202, as shown in fig. 6, includes:
a suspected fire occurrence location point extracting unit 2021 for extracting a suspected fire occurrence location point.
A fire occurrence location point extracting unit 2022 for extracting a fire occurrence location point.
As shown in fig. 7, the fire occurrence location point extraction unit 2022 includes:
and the statistical subunit is used for establishing a square observation window by taking the suspected fire occurrence position point as the center, and performing statistical analysis based on temperature characteristics on the pixels in the observation window.
A condition acquisition unit for acquiring the set of judgment conditions.
And the identification unit is used for acquiring the coincidence condition of the result of the statistical analysis and the centralized judgment condition of the judgment condition so as to identify the fire occurrence position point.
The ranking module 204, as shown in FIG. 8, includes:
the influence map acquiring unit 2041 is configured to acquire an image map of the target area.
And a sum value obtaining unit 2042, configured to calculate a sum value of the number of times of fire occurrence location points corresponding to the pixels included in each of the patches in the image map.
And a risk value calculation unit 2043, configured to calculate fire risk values of each pattern spot in the image map.
And the grade judging unit 2044 is used for obtaining the fire danger grade of the pattern spots according to the fire danger values of the pattern spots.
The inventive device embodiment and the inventive method embodiment are based on the same inventive concept.
Embodiments of the present invention also provide a storage medium, which can be used to store program codes used in implementing the embodiments. Optionally, in this embodiment, the storage medium may be located in at least one network device of a plurality of network devices of a computer network. Optionally, in this embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal can be implemented in other manners. The above-described system embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that: the sequence of the above embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (1)

1. A method of fire hazard classification, the method comprising:
acquiring a spectral image set of a target area; the spectral image set comprises a plurality of spectral images;
analyzing the fire occurrence position points of each spectral image, and extracting the fire occurrence position points in the spectral images;
counting the number of times each pixel point is a fire occurrence location point: taking each pixel point as the unit, counting the number of times the pixel point coincides with a fire occurrence location point, namely the number of fire occurrences at that pixel; taking each pixel point as the research unit, counting the number of times the pixel point is identified as a fire occurrence location point in the analysis results of the spectral image set; performing fire hazard classification on the target area according to the number of times each pixel point coincides with a fire occurrence location point; the greater the number of times, the greater the probability of fire at the position of that pixel point;
acquiring an image map of a target area;
calculating the fire hazard value of each pattern spot in the image map;
obtaining the fire hazard grade of the pattern spots according to the fire hazard values of the pattern spots;
the calculating of the fire risk value of each pattern spot in the image map comprises:
calculating the sum value of the times of fire occurrence position points corresponding to pixels included in each pattern spot in the image map;
the fire risk value of the pattern spot is equal to the sum of the times of fire occurrence position points multiplied by the area of the minimum circumscribed rectangle of the pattern spot;
the analyzing of the fire occurrence location point for each spectral image, and the extracting of the fire occurrence location point in the spectral image includes:
extracting suspected fire occurrence position points;
extracting fire occurrence position points;
the method for extracting the fire occurrence location point comprises the following steps:
establishing a square observation window centered on the suspected fire occurrence location point, and performing statistical analysis based on temperature characteristics on the pixels in the observation window; pixels in the observation window that simultaneously satisfy the following conditions:
in the daytime: T4 > 330K AND T4 - T11 > 15K
or,
at night: T4 > 315K AND T4 - T11 > 15K
are marked as pre-judged fire occurrence location points, and the other pixels in the observation window are marked as pixels to be counted; taking the 4 μm brightness temperature as the first brightness temperature and the 11 μm brightness temperature as the second brightness temperature, denote T̄4, δ4, T̄11, δ11, ΔT̄ and δΔT respectively as:
the mean of the first brightness temperature of the pixels to be counted;
the mean absolute deviation of the first brightness temperature of the pixels to be counted;
the mean of the second brightness temperature of the pixels to be counted;
the mean absolute deviation of the second brightness temperature of the pixels to be counted;
the mean of the difference between the first and the second brightness temperature of the pixels to be counted;
the mean absolute deviation of the difference between the first and the second brightness temperature of the pixels to be counted;
and denote T̄4′ and δ4′ respectively as the mean and the mean absolute deviation of the first brightness temperature of the pre-judged fire occurrence location pixels in the observation window;
acquiring a judgment condition set; the judgment condition set comprises the following judgment conditions, where ΔT = T4 - T11 for the suspected pixel:
Jug1: in the daytime: T4 > 365K, or at night: T4 > 330K
Jug2: ΔT > ΔT̄ + 3.5δΔT
Jug3: ΔT > ΔT̄ + 6K
Jug4: T4 > T̄4 + 3δ4
Jug5: T11 > T̄11 + δ11 - 4K
Jug6: δ4′ > 5.5K
acquiring the degree to which the results of the statistical analysis satisfy the conditions in the judgment condition set, thereby identifying fire occurrence location points; the fire occurrence location point identification algorithm is:
if Jug1 OR ((Jug2 AND Jug3 AND Jug4) AND (Jug5 OR Jug6)) OR (Jug2 AND Jug3 AND Jug4) is true, the suspected fire occurrence location point is identified as a fire occurrence location point;
the multiple spectral images have the same specification, namely the length, the width and the resolution are the same and correspond to the same target area; spectral images were obtained using MODIS;
and acquiring a spectral image statistical result and a preset judgment condition set, and detecting a fire occurrence position according to the relation between the statistical result and the judgment condition set.
CN201811095932.3A 2018-09-19 2018-09-19 Fire hazard classification method and device Active CN109377711B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811095932.3A CN109377711B (en) 2018-09-19 2018-09-19 Fire hazard classification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811095932.3A CN109377711B (en) 2018-09-19 2018-09-19 Fire hazard classification method and device

Publications (2)

Publication Number Publication Date
CN109377711A CN109377711A (en) 2019-02-22
CN109377711B true CN109377711B (en) 2021-07-09

Family

ID=65405635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811095932.3A Active CN109377711B (en) 2018-09-19 2018-09-19 Fire hazard classification method and device

Country Status (1)

Country Link
CN (1) CN109377711B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110459030A (en) * 2019-09-06 2019-11-15 深圳市瑞讯云技术有限公司 The detection method and device of fire
CN112991670A (en) * 2021-02-04 2021-06-18 西安美格智联软件科技有限公司 Fire-fighting dangerous area classification management and control method and system, storage medium and processing terminal
CN113033391B (en) * 2021-03-24 2022-03-08 浙江中辰城市应急服务管理有限公司 Fire risk early warning research and judgment method and system

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101989373A (en) * 2009-08-04 2011-03-23 中国科学院地理科学与资源研究所 Visible light-thermal infrared based multispectral multi-scale forest fire monitoring method
CN104700095A (en) * 2015-03-30 2015-06-10 北京市环境保护监测中心 Satellite remote sensing monitoring method and processing device for straw burning fire points
CN105488220A (en) * 2015-12-23 2016-04-13 天维尔信息科技股份有限公司 Fire-fighting early-warning method and device

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP3067285B2 (en) * 1991-07-12 2000-07-17 ホーチキ株式会社 Fire detection device using image processing
CN101592524B (en) * 2009-07-07 2011-02-02 中国科学技术大学 Inter-class variance based MODIS forest fire point detection method
CN106646651A (en) * 2016-12-14 2017-05-10 中国科学院遥感与数字地球研究所 Fire point detection method

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN101989373A (en) * 2009-08-04 2011-03-23 中国科学院地理科学与资源研究所 Visible light-thermal infrared based multispectral multi-scale forest fire monitoring method
CN104700095A (en) * 2015-03-30 2015-06-10 北京市环境保护监测中心 Satellite remote sensing monitoring method and processing device for straw burning fire points
CN105488220A (en) * 2015-12-23 2016-04-13 天维尔信息科技股份有限公司 Fire-fighting early-warning method and device

Non-Patent Citations (1)

Title
Zhang Weibing (张为兵), "Research on the extraction method of wheat straw burning points based on HJ-1 satellite data"; China Master's Theses Full-text Database, Agricultural Science and Technology; No. 2, 2014-02-15; Section 3.3 *

Also Published As

Publication number Publication date
CN109377711A (en) 2019-02-22

Similar Documents

Publication Publication Date Title
CN109360369B (en) Method and device for analyzing fire hazard based on clustering result
CN109377711B (en) Fire hazard classification method and device
Berra et al. Assessing spring phenology of a temperate woodland: A multiscale comparison of ground, unmanned aerial vehicle and Landsat satellite observations
US20190050625A1 (en) Systems, methods and computer program products for multi-resolution multi-spectral deep learning based change detection for satellite images
US20190279019A1 (en) Method and apparatus for performing privacy masking by reflecting characteristic information of objects
CN111127508B (en) Target tracking method and device based on video
Wang et al. Detecting tents to estimate the displaced populations for post-disaster relief using high resolution satellite imagery
US11157735B2 (en) Cloud detection in aerial imagery
CN108847031B (en) Traffic behavior monitoring method and device, computer equipment and storage medium
CN109547748B (en) Object foot point determining method and device and storage medium
CN109993020A (en) Face is deployed to ensure effective monitoring and control of illegal activities alarm method and device
CN106846304B (en) Electrical equipment detection method and device based on infrared detection
JP2018066943A (en) Land category change interpretation support device, land category change interpretation support method, and program
CN111612104A (en) Vehicle loss assessment image acquisition method, device, medium and electronic equipment
US8996577B2 (en) Object information provision device, object information provision system, terminal, and object information provision method
CN109377712B (en) Classification method based on pattern spot fire risk value and storage medium
CN108230288B (en) Method and device for determining fog state
CN110440764B (en) Meter detection cradle head secondary alignment method, device and equipment
CN109238977B (en) Fire hazard classification method and device based on spectral analysis and storage medium
CN111931744B (en) Method and device for detecting change of remote sensing image
CN116797993B (en) Monitoring method, system, medium and equipment based on intelligent community scene
CN110896475B (en) Display terminal channel switching detection method and device
JP6434834B2 (en) Inspection object extraction device and inspection object extraction method
US20190289357A1 (en) Camera system
CN109242195B (en) Method for predicting fire occurrence condition of specified area

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210622

Address after: 201599 room 547, 2nd floor, building 5, No. 3688, Tingwei Road, Caojing Town, Jinshan District, Shanghai

Applicant after: SHANGHAI HUAGONG SECURITY TECHNOLOGY SERVICE Co.,Ltd.

Address before: 11 fangzhuanchang Hutong, Dongcheng District, Beijing 102200

Applicant before: Li Lin

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant