CN113946641A - Forest fire area judgment method - Google Patents


Info

Publication number: CN113946641A (application CN202111101112.2A; granted as CN113946641B)
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active (granted)
Prior art keywords: fire, fire point, latitude, forest, sensor
Inventors: 张涛, 陈超, 周剑敏, 刘洁, 乌日娜, 邓楼楼, 程莉, 刘彤, 王子寒, 杨晓龙, 涂智军, 郭瑞科
Original and current assignee: Beijing Institute of Control Engineering

Classifications

    • G06F16/29 — Information retrieval; database structures therefor; geographical information databases
    • G01J5/0018 — Radiation pyrometry, e.g. infrared or optical thermometry, for sensing the radiation from flames, plasma or welding
    • G06F13/4282 — Bus transfer protocol, e.g. handshake, synchronisation, on a serial bus, e.g. I2C bus, SPI bus
    • Y02A40/28 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production, specially adapted for farming


Abstract

The invention discloses a forest fire area judgment method, belonging to the technical field of satellite attitude control. The method serves a novel satellite-borne fire point detection sensor system: while fire detection is performed, the longitude and latitude of each fire point are computed and it is determined whether the fire point is a forest fire, providing the most direct fire information to national forest fire prevention units. First, the data interaction between the novel fire point sensor and the upper computer is specified. Second, the fire point longitude and latitude computed by the upper computer are matched against the global forest regions pre-installed in the system to judge whether the fire point lies in a forest region, and whether that region is domestic or overseas.

Description

Forest fire area judgment method
Technical Field
The invention relates to a forest fire area judgment method, and belongs to the technical field of satellite attitude control.
Background
The fire point detection sensor is a novel satellite-borne sensor. The system computes the longitude and latitude of each fire point from the sensor's output information, but what a sensor user cares about most is whether the fire point is a forest fire point; the system therefore needs to perform forest fire area judgment.
Disclosure of Invention
The invention aims to overcome the above defects and provides a forest fire area judgment method: according to the interaction between the upper computer system and the fire point sensor, the fire point azimuth information input by the fire point sensor is received and converted into the geographic longitude and latitude of the fire point; the fire point is then matched against the several forest regions pre-installed in the system to decide whether it is a forest fire point, and the user is informed whether it is a domestic or an overseas forest fire point.
In order to achieve the above purpose, the invention provides the following technical scheme:
a forest fire area judgment method comprises the following steps:
(1) The upper computer system acquires, from the map information of the N forest regions, the upper and lower geographic latitude limits of each region; the upper geographic latitude limit of the i-th forest region is recorded as λ_max,i and the lower latitude limit as λ_min,i, 1 ≤ i ≤ N;
(2) The upper computer system divides the i-th forest region into j_max latitude strips at equal latitude intervals Δλ_i, and acquires the upper geographic longitude limit L_max,ij and the lower geographic longitude limit L_min,ij of the forest region on each latitude strip;
(3) The upper computer system interacts with the fire point sensor, receives the fire point azimuth information input by the sensor, and converts it into the fire point's geographic longitude and latitude (L_m, λ_m);
(4) The upper computer system determines the fire point occurrence area from the fire point's geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij.
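As a non-authoritative illustration, steps (1) and (2) can be sketched in Python. The boundary-point representation of a forest region, the function name, and the rule for handling empty strips are assumptions for the sketch, not part of the patent:

```python
# Hypothetical sketch: divide a forest region into equal-latitude strips and
# record the longitude bounds of the region's boundary points in each strip.
# Assumes the region is described by a list of (longitude, latitude) boundary
# points dense enough that each strip contains at least one point.

def build_latitude_strips(boundary_points, delta_lat):
    lats = [lat for _, lat in boundary_points]
    lat_min, lat_max = min(lats), max(lats)                 # λ_min,i and λ_max,i
    n_strips = max(1, int((lat_max - lat_min) / delta_lat) + 1)  # j_max
    strips = []
    for j in range(n_strips):
        lo = lat_min + j * delta_lat
        hi = min(lo + delta_lat, lat_max)
        lons = [lon for lon, lat in boundary_points if lo <= lat <= hi]
        if lons:
            strips.append((min(lons), max(lons)))           # (L_min,ij, L_max,ij)
        else:
            strips.append(None)                             # no boundary point here
    return lat_min, lat_max, strips
```

The per-strip bounds turn an irregular region outline into a compact table that step (4) can search with simple comparisons.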
Further, in step (4), the method of judging the fire point occurrence area from the fire point's geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij is as follows:
(411) i = 1;
(412) judge whether the fire point latitude λ_m is in the latitude range (λ_min,i, λ_max,i) of the i-th forest region;
if not, go to step (417);
if it is, go to step (413);
(413) j = 1;
(414) judge whether the fire point longitude L_m is in the longitude range (L_min,ij, L_max,ij) of the j-th latitude strip of the i-th forest region;
if it is, go to step (415);
if not, go to step (416);
(415) if i ∈ [1, N], output a prompt that the fire point is in a forest region together with the longitude and latitude information of the j-th latitude strip of the i-th forest region, and end the loop;
(416) if j > j_max, go to step (417); otherwise j = j + 1 and return to step (414);
(417) if i ≤ N, i = i + 1 and return to step (412); if i > N, end the judgment process, output that the fire point is not in a forest region, and end the loop.
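Steps (411) through (417) amount to a two-level linear search. A minimal sketch, assuming each pre-installed region is a dict with the latitude limits and the per-strip longitude bounds (the data layout and function name are illustrative, not from the patent):

```python
# Hypothetical sketch of the two-level search: for each pre-installed forest
# region i, check the fire point latitude against the region's latitude limits,
# then scan that region's latitude strips for a longitude match.
# `regions` is a list of dicts:
#   {"lat_min": ..., "lat_max": ..., "strips": [(L_min, L_max), ...]}

def locate_fire(lon_m, lat_m, regions):
    for i, region in enumerate(regions, start=1):
        if not (region["lat_min"] < lat_m < region["lat_max"]):
            continue                          # step (417): try the next region
        for j, (lon_lo, lon_hi) in enumerate(region["strips"], start=1):
            if lon_lo < lon_m < lon_hi:
                return i, j                   # step (415): fire in region i, strip j
    return None                               # fire point not in any forest region
```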
Further, N = N1 + N2, where N1 is the number of domestic forest fire regions and N2 is the number of overseas forest fire regions.
The method of judging the fire point occurrence area from the fire point's geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij is then as follows:
(421) i = 1;
(422) judge whether the fire point latitude λ_m is in the latitude range (λ_min,i, λ_max,i) of the i-th forest region;
if not, go to step (427);
if it is, go to step (423);
(423) j = 1;
(424) judge whether the fire point longitude L_m is in the longitude range (L_min,ij, L_max,ij) of the j-th latitude strip of the i-th forest region;
if it is, go to step (425);
if not, go to step (426);
(425) if i ∈ [1, N1], output "the fire point is a domestic forest fire point" and end the loop; otherwise output "the fire point is an overseas forest fire point" and end the loop;
(426) if j > j_max, go to step (427); otherwise j = j + 1 and return to step (424);
(427) if i ≤ N, i = i + 1 and return to step (422); if i > N, end the judgment process, output that the fire point is not in a forest region, and end the loop.
Further, in step (3), the fire point geographic longitude and latitude information is that of the confirmed fire point; the interaction method between the upper computer system and the fire point sensor is as follows:
(31) the fire point sensor outputs the acquired fire point information to the upper computer as a TM_FIRE data packet; the fire point information comprises the fire point extraction number, the real fire point mark and the fire point azimuth information;
(32) the upper computer receives the TM_FIRE data packet input by the fire point sensor, judges according to the fire point extraction number and the real fire point mark, and according to the result either sends no instruction to the fire point sensor, or sends a TC_FIREDATA instruction, or sends a TC_PRE_FIREDATA instruction; the TC_FIREDATA and TC_PRE_FIREDATA instructions comprise the fire point extraction number, the real fire point mark and the fire point longitude and latitude information; TC_FIREDATA corresponds to a confirmed fire point and TC_PRE_FIREDATA to a pre-confirmation fire point;
(33) the fire point sensor receives the TC_FIREDATA instruction input by the upper computer and stores the fire point information it contains into an independent area partitioned inside the sensor;
the upper computer then extracts the fire point information stored by the sensor into its own memory area through the 1553B bus, obtaining the geographic longitude and latitude information of the confirmed fire point.
Further, in step (31), the fire point sensor outputs a first TM_FIRE and a second TM_FIRE in sequence every period, corresponding respectively to the pre-confirmation fire point information from the sensor's first identification and the confirmed fire point information from its second identification;
when the fire point sensor identifies a fire point at the first identification, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point mark is 0; if the second identification confirms a real fire point, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point mark is 1; if the second identification finds it is not a real fire point, the fire point extraction number in the second TM_FIRE is 0 and the real fire point mark is 0;
when the fire point sensor identifies no fire point at the first identification, the fire point extraction number in both the first and second TM_FIRE outputs is 0 and the real fire point mark is 0.
Further, in step (32), the upper computer receives the fire point information input by the fire point sensor and judges according to the fire point extraction number and the real fire point mark as follows:
if the fire point extraction number is 0, the fire point longitude and latitude are not computed and the upper computer replies neither a TC_FIREDATA nor a TC_PRE_FIREDATA instruction to the sensor;
if the fire point extraction number is greater than 0 and the real fire point mark is 0, the upper computer computes the longitude and latitude of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the sensor;
if the fire point extraction number is greater than 0 and the real fire point mark is 1, the upper computer computes the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the sensor.
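The three-way decision in step (32) depends only on two packet fields; a minimal sketch (the function name and return values are illustrative, only the command names come from the patent):

```python
# Hypothetical sketch of the host-side decision: choose the reply to a TM_FIRE
# packet from the fire point extraction number and the real fire point mark.

def choose_reply(num_extracted, real_fire_mark):
    if num_extracted == 0:
        return None                 # no fire points: no reply, no lat/lon computed
    if real_fire_mark == 0:
        return "TC_PRE_FIREDATA"    # first identification, pre-confirmation
    return "TC_FIREDATA"            # second identification, confirmed fire point
```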
Further, the fire point azimuth information is the unit-vector representation (x, y, z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying x² + y² + z² = 1. The sensor's reference mirror coordinate system is defined as follows: the centre of the reference mirror is the coordinate origin, the sensor optical axis direction is the +Z axis, the direction from the reference mirror centre towards the connector fixing the sensor to the satellite body is the +X axis, and the +Y axis completes a right-handed rectangular coordinate system with the +X and +Z axes.
Further, in the interaction method between the upper computer system and the fire point sensor, the sensor time is kept synchronous with the upper computer time:
the upper computer periodically sends a time calibration instruction to the fire point sensor; on receiving the instruction the sensor immediately latches its own time and compensates it by the difference between the upper computer time and the sensor time, making the two consistent;
when the fire point sensor receives no time calibration instruction from the upper computer for a certain period, it corrects its own time from its internal clock.
Further, the TM_FIRE data packet and the TC_FIREDATA and TC_PRE_FIREDATA instructions also include fire point area and characteristic information, and when the upper computer system determines the fire point occurrence area from the fire point's geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij, it also outputs the fire point area and characteristic information.
Compared with the prior art, the invention has the following beneficial effects:
(1) the invention defines the data interaction between the upper computer and the fire point sensor: the fire point azimuth information input by the fire point sensor is received and converted into the geographic longitude and latitude of the fire point; the specific content of the sensor's fire point information output is specified, and the information interaction is designed separately for the cases where the sensor does and does not detect a fire point, improving the practicability and intelligence of the sensor;
(2) the method delimits each forest map area precisely by its upper and lower geographic latitude limits and the upper and lower geographic longitude limits of each latitude strip; by judging whether the fire point's longitude and latitude fall inside each forest region, the fire point's occurrence area can be determined accurately;
(3) the invention can inform the user in real time whether a fire point is a domestic or an overseas forest fire point, providing the most direct fire information for national forest fire prevention departments.
Drawings
FIG. 1 is a schematic diagram of a prior art fire sensor configuration;
FIG. 2 is a schematic structural diagram of a rotary encoder of a fire point sensor in the prior art;
FIG. 3 is a schematic diagram of data interaction between the fire point sensor and an upper computer according to the present invention;
FIG. 4 is a schematic diagram of cloud judgment data output according to the present invention;
FIG. 5 is a schematic diagram of the division of a forest area into latitude and longitude areas according to the present invention;
FIG. 6 is a general flowchart of a forest fire area determination method according to the present invention;
fig. 7 is a flowchart of determining an occurrence area of a fire in the forest fire area determining method of the present invention.
Detailed Description
The features and advantages of the present invention will become more apparent and appreciated from the following detailed description of the invention.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
As shown in fig. 1 and fig. 2, the fire point detection sensor comprises an optical lens, a motor, a code wheel carrying several optical filters, an opaque correction baffle and an all-pass position. The motor is connected to a reducer and the code wheel is fixed on the reducer's output shaft; around its rotation axis the wheel carries the optical filters (for example different long-wave and medium-wave infrared filters), the correction baffle and the all-pass position. Driven by the reducer, the code wheel rotates so that the required filter sits in the optical path, fire point detection is performed, and the fire point's azimuth information in the sensor reference mirror coordinate system is sent to the upper computer system.
As shown in FIG. 3 and FIG. 4, the data interaction method between the fire point sensor and the upper computer of the invention comprises the following steps:
First, fire point information output: as shown in fig. 2, the filter code wheel inside the fire point sensor takes about 1.5-2 s per revolution, and two valid TM_FIRE data packets (with changing frame count) are output per revolution:
the fire point sensor outputs a first TM_FIRE and a second TM_FIRE in sequence every period, corresponding respectively to the pre-confirmation fire point information from the sensor's first identification and the confirmed fire point information from its second identification;
when the fire point sensor identifies a fire point at the first identification, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point mark is 0; if the second identification confirms a real fire point, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point mark is 1; if the second identification finds it is not a real fire point, the fire point extraction number in the second TM_FIRE is 0 and the real fire point mark is 0;
if no fire point is identified, the fire point extraction number in both the first and second TM_FIRE outputs is 0.
Next, fire point information input: after receiving the TM_FIRE data packet, the upper computer judges the fire point extraction number n (numFireExtracted) and the real fire point mark in the packet;
if n = 0, no longitude and latitude need be computed and no TC_FIREDATA or TC_PRE_FIREDATA instruction is replied to the sensor;
if n > 0 and the real fire point mark is 0, the TM_FIRE data packet corresponds to pre-confirmation fire points; the upper computer computes the longitude and latitude of all extracted fire points and notifies the sensor through a TC_PRE_FIREDATA instruction, and the sensor performs windowing using the data in the instruction;
if n > 0 and the real fire point mark is 1, the TM_FIRE data packet corresponds to confirmed fire points; the upper computer computes the longitude and latitude of all extracted fire points, eliminates false fire points by comparing these with the longitude and latitude of pre-installed false fire points, rearranges the remaining fire points and returns them to the sensor through a TC_FIREDATA instruction.
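The false-fire-point elimination described above can be sketched as a simple position screen. Everything here beyond "compare against pre-installed false fire point positions" is an assumption: the function name, the tolerance parameter, and the use of plain lat/lon differences rather than great-circle distances:

```python
# Hypothetical sketch: keep only the computed fire points farther than a
# tolerance (in degrees) from every pre-installed false fire point, e.g. a
# known permanent heat source.  Plain coordinate differences are used purely
# for illustration.

def reject_false_fires(fires, false_points, tol_deg=0.05):
    kept = []
    for lon, lat in fires:
        if all(abs(lon - flon) > tol_deg or abs(lat - flat) > tol_deg
               for flon, flat in false_points):
            kept.append((lon, lat))
    return kept
```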
During the interaction, time consistency between the upper computer and the sensor must be ensured; a preferred method is for the upper computer to send time calibration instructions to the sensor, specifically:
the upper computer sends a time calibration instruction containing the upper computer time to the sensor every 1 s; on receiving the instruction the sensor immediately latches its own time and compensates it by the difference between the received upper computer time and its own time, ensuring consistency between sensor time and upper computer time.
Further, in order to improve the reliability of the sensor and the accuracy of fire point prediction, the upper computer sends required auxiliary data information (the longitude and latitude of the center of the field of view of the sensor, the attribute of the earth surface projection point at the center of the field of view, the speed of the earth surface projection point at the center of the field of view, the distance between the earth surface projection point of the sight line at the center of the field of view and the satellite, the angular speed of the sensor and the like) to the sensor.
Further, the sensor outputs cloud judgment data: it detects cloud layer targets in the field of view in real time and can output cloud judgment results for 32 × 8 areas;
further, the sensor autonomously stores fire point information and outputs the stored fire point information according to requirements: the sensors are divided into independent areas for storing fire point information, and the upper computer moves the fire point information stored by the sensors to the internal memory area of the upper computer software through a 1553B bus.
Further, the method also comprises on-orbit maintenance of the sensor: the upper computer maintains the sensor parameters and software on orbit through on-orbit maintenance instructions.
Example 1
As shown in fig. 4, in this embodiment, the content of the data interaction method mainly includes:
(1) Time calibration instruction: the upper computer sends a time calibration instruction TC_SYNC_TIMEABS to the sensor every 1 s; the instruction contents are shown in Table 1:
Table 1: time calibration instruction TC_SYNC_TIMEABS contents

    Content    Bytes    Format              Resolution
    Time_s     4        Unsigned integer    1 s
    Time_us    4        Unsigned integer    1 µs
On receiving the instruction the sensor immediately latches its own time and compensates it by the difference between the received upper computer time and its own time, ensuring consistency between sensor time and upper computer time. If the upper computer fails to send the instruction periodically, i.e. the sensor receives no upper computer time for a certain period, the sensor must still correct its own time from its internal clock. In this way time unification between the sensor and the upper computer is achieved.
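The latch-and-compensate step can be sketched in a few lines. The class name and the use of integer microseconds are assumptions for the sketch, not from the patent:

```python
# Hypothetical sketch of the time calibration step: on receiving
# TC_SYNC_TIMEABS the sensor latches its own clock and applies the offset
# between the host time carried in the instruction and the latched value.

class SensorClock:
    def __init__(self, now_us):
        self.now_us = now_us                       # sensor local time, microseconds

    def on_sync_command(self, host_time_us):
        latched = self.now_us                      # latch local time on receipt
        self.now_us += host_time_us - latched      # compensate by the difference
```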
(2) Sensor auxiliary data: to improve the reliability of the sensor and the accuracy of fire point prediction, the upper computer sends the required auxiliary data (longitude and latitude of the sensor field-of-view centre, attributes of the ground projection point at the field-of-view centre, speed of that projection point, distance between the line-of-sight ground projection point at the field-of-view centre and the satellite, etc.) to the sensor. The field-of-view centre longitude and latitude give the position of the projection of the sensor boresight on the earth's surface; the ground projection point attribute describes that point, e.g. land or sea, on the earth or off the earth. The sensor uses these auxiliary data to assist fire point judgment: for example, a fire point identified in an ocean area may be an offshore drilling platform rather than a forest fire point, and a "fire point" identified off the earth may be sunlight interference.
(3) Fire point information output: the code wheel in the fire point sensor takes about 1.5-2 s per revolution and outputs two valid TM_FIRE data packets per revolution; the packet contents are shown in Table 2. The real fire point mark is 1 or 0: 1 denotes a confirmed real fire point, 0 denotes a pre-confirmation fire point or no fire point. The fire point azimuth is the unit-vector representation (X, Y, Z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying X² + Y² + Z² = 1:
Table 2: TM_FIRE packet contents
If the fire point sensor identifies a fire point, the first TM_FIRE output in the period carries the pre-confirmation fire point data and the second carries the confirmed fire point data (if the fire point is real); if no real fire point is confirmed, the fire point extraction number in the second TM_FIRE output is 0;
if the fire point sensor identifies no fire point, the fire point extraction number in both the first and second TM_FIRE outputs is 0.
(4) Fire point information input: after receiving the TM_FIRE data packet, the upper computer judges the "fire point extraction number" and the "real fire point mark" in the packet:
if the "fire point extraction number" is 0, the fire point longitude and latitude need not be computed and no TC_FIREDATA or TC_PRE_FIREDATA instruction is replied to the sensor;
if the "fire point extraction number" is greater than 0 and the "real fire point mark" is 0, the TM_FIRE data packet corresponds to pre-confirmation fire points; the upper computer computes the longitude and latitude of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the sensor. The geographic longitude and latitude of a fire point are computed from the azimuth information output by the fire point sensor, the sensor's installation information on the satellite, and the satellite orbit and attitude information at the moment of the fire point; for the specific computation, remote sensing image geometric positioning methods can be consulted;
if the "fire point extraction number" is greater than 0 and the "real fire point mark" is 1, the TM_FIRE data packet corresponds to confirmed fire points; the upper computer computes the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the sensor;
the contents of the TC_FIREDATA and TC_PRE_FIREDATA instructions are shown in Table 3; the "real fire point mark" is 1 in TC_FIREDATA and 0 in TC_PRE_FIREDATA:
Table 3: TC_FIREDATA / TC_PRE_FIREDATA instruction contents

    No.   Instruction content
    1     Sensor time_s
    2     Sensor time_us
    3     Number of fire points
    4     Real fire point mark
    5     Fire point 1 longitude
    6     Fire point 1 latitude
    7     Fire point 1 area and characteristic information
    ...
          Fire point n longitude
          Fire point n latitude
          Fire point n area and characteristic information
(5) Cloud judgment data output: as shown in fig. 4, the sensor detects cloud layer targets in the field of view in real time and can output cloud judgment results for 32 × 8 areas. A symmetric total cloud judgment area is cut out around the centre of the sensor field of view and divided evenly into 32 × 8 sub-areas; a cloud judgment result is output for each sub-area, together with the specific requirements and content of the in-field cloud information, to the upper computer, which selects data or performs other operations according to each sub-area's result. Each sub-area is judged to be one of the results in Table 4:
Table 4: possible cloud judgment results for each sub-area

    No.   Cloud judgment result
    1     Definitely clear sky
    2     Suspected clear sky
    3     Suspected clouds
    4     Definitely clouds
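The 32 × 8 output grid can be sketched as follows; the per-sub-area classifier, the function name and the result strings are assumptions for the sketch, with only the grid shape and the four-way result set coming from the text above:

```python
# Hypothetical sketch of the 32x8 cloud judgment output: the symmetric cloud
# judgment area around the field-of-view centre is split into 32x8 sub-areas
# and each is labelled with one of the four results of Table 4.
# `classify` is an assumed per-sub-area classifier supplied by the caller.

CLOUD_RESULTS = ("definitely clear", "suspected clear",
                 "suspected cloud", "definitely cloud")

def cloud_judgment_map(classify, cols=32, rows=8):
    grid = [[classify(c, r) for c in range(cols)] for r in range(rows)]
    assert all(v in CLOUD_RESULTS for row in grid for v in row)
    return grid
```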
(6) Output of fire point information stored by the sensor: the sensor partitions an independent area to store fire point information and autonomously stores fire point data, which the ground can retrieve at any time when needed; specifically, the upper computer moves the fire point information stored by the sensor into the memory area of the upper computer software through the 1553B bus. To improve data validity, the sensor is required to keep only the fire point information from TC_FIREDATA instruction packets.
(7) On-orbit maintenance of the sensor: the upper computer maintains the sensor's parameters and software on orbit through on-orbit maintenance instructions.
As shown in figs. 5, 6 and 7, the forest fire area judgment method of the present invention comprises:
S1. The system pre-loads map information for N1 forest regions of China and N2 forest regions elsewhere in the world:
(1) For each forest region i, give the region's upper latitude limit λ_maxi and lower latitude limit λ_mini;
(2) At equal latitude intervals Δλ_i, divide the region into j_max latitude strips, and mark on each strip the upper geographic longitude limit L_maxij and lower geographic longitude limit L_minij of the forest region, as shown in fig. 3.
S2. Judging the forest fire area
The system matches the fire point's geographic latitude and longitude against the global forest regions pre-loaded in the system, and judges whether the fire point lies in a forest region and whether that region is domestic or foreign, specifically:
(1) i = 1;
(2) From the input fire point latitude λ_m, judge whether it lies in the latitude range (λ_mini, λ_maxi) of the ith forest region:
if not, the fire point is not in the ith forest region; go to step (7);
if so, proceed to the next step;
(3) j = 1;
(4) From the input fire point longitude L_m, judge whether it lies in the longitude range (L_minij, L_maxij) of the jth latitude strip of the ith forest region:
if it is in the range (L_minij, L_maxij), go to step (5);
if not, go to step (6);
(5) If i ∈ [1, N1], output "the fire point is a domestic forest fire" and end the loop; otherwise, output "the fire point is a foreign forest fire" and end the loop;
(6) If j > j_max, go to step (7); otherwise j = j + 1 and return to step (4);
(7) If i ≤ N1 + N2, set i = i + 1 and return to step (2); if i > N1 + N2, end the judgment process, output "the fire point is not in a forest region", and end the loop.
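Steps (1)-(7) of S2 can be sketched as follows. The region data layout (a list of latitude ranges plus per-strip longitude ranges) is an illustrative assumption, not the patent's storage format:

```python
# Sketch of S2: match a fire point's (longitude, latitude) against
# pre-loaded forest regions, each described by a latitude range and
# per-latitude-strip longitude ranges.
def judge_fire_area(lon_m, lat_m, regions, n1):
    """regions[:n1] are domestic forest regions, the rest foreign.
    Each region: {'lat': (lat_min, lat_max),
                  'strips': [(lon_min, lon_max), ...]}  # equal-interval strips
    Returns 'domestic forest fire', 'foreign forest fire',
    or 'not in a forest region'."""
    for i, region in enumerate(regions):          # steps (2) and (7)
        lat_min, lat_max = region['lat']
        if not (lat_min < lat_m < lat_max):
            continue                              # latitude misses region i
        for lon_min, lon_max in region['strips']:  # steps (4) and (6)
            if lon_min < lon_m < lon_max:          # step (5): strip hit
                return ('domestic forest fire' if i < n1
                        else 'foreign forest fire')
    return 'not in a forest region'               # step (7) exit
```

Like the patent's loop, this scans the strips of a region sequentially; since the strips have equal latitude intervals, one could instead index the strip directly as floor((λ_m − λ_mini) / Δλ_i), trading the inner loop for one division.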
The invention has been described in detail with reference to specific embodiments and illustrative examples, but the description is not intended to be construed in a limiting sense. Those skilled in the art will appreciate that various equivalent substitutions, modifications or improvements may be made to the technical solution of the invention and its embodiments without departing from the spirit and scope of the invention, all of which fall within the scope of the invention. The scope of the invention is defined by the appended claims.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (10)

1. A forest fire area judgment method is characterized by comprising the following steps:
(1) the upper computer system obtains, from the geographic map information of N forest regions, the upper and lower geographic latitude limits of each of the N forest regions, the upper latitude limit of the ith forest region being denoted λ_maxi and the lower latitude limit λ_mini, 1 ≤ i ≤ N;
(2) the upper computer system divides the ith forest region into j_max latitude strips at equal latitude intervals Δλ_i, and obtains the upper geographic longitude limit L_maxij and lower geographic longitude limit L_minij of the forest region on each latitude strip;
(3) the upper computer system interacts with the fire point sensor, receives fire point azimuth information input by the fire point sensor, and converts it into fire point geographic longitude and latitude information (L_m, λ_m);
(4) the upper computer system determines the fire point occurrence area according to the fire point geographic longitude and latitude information (L_m, λ_m), the latitude ranges determined by λ_maxi and λ_mini, and the longitude ranges determined by L_maxij and L_minij.
2. The forest fire area judgment method according to claim 1, wherein N = N1 + N2, where N1 is the number of domestic forest fire regions and N2 is the number of foreign forest fire regions.
3. The forest fire area judgment method according to claim 1, wherein in step (4), the method for determining the fire point occurrence area according to the fire point geographic longitude and latitude information (L_m, λ_m), the latitude ranges determined by λ_maxi and λ_mini, and the longitude ranges determined by L_maxij and L_minij comprises:
(411) i = 1;
(412) judge whether the fire point latitude λ_m is in the latitude range (λ_mini, λ_maxi) of the ith forest region:
if not, go to step (417);
if so, go to step (413);
(413) j = 1;
(414) judge whether the fire point longitude L_m is in the longitude range (L_minij, L_maxij) of the jth latitude strip of the ith forest region:
if so, go to step (415);
if not, go to step (416);
(415) if i ∈ [1, N], output the prompt "the fire point is in a forest region", output the longitude and latitude information of the jth latitude strip of the ith forest region, and end the loop;
(416) if j > j_max, go to step (417); otherwise j = j + 1 and return to step (414);
(417) if i ≤ N, set i = i + 1 and return to step (412); if i > N, end the judgment process, output "the fire point is not in a forest region", and end the loop.
4. The forest fire area judgment method according to claim 2, wherein the method for determining the fire point occurrence area according to the fire point geographic longitude and latitude information (L_m, λ_m), the latitude ranges determined by λ_maxi and λ_mini, and the longitude ranges determined by L_maxij and L_minij comprises:
(421) i = 1;
(422) judge whether the fire point latitude λ_m is in the latitude range (λ_mini, λ_maxi) of the ith forest region:
if not, go to step (427);
if so, go to step (423);
(423) j = 1;
(424) judge whether the fire point longitude L_m is in the longitude range (L_minij, L_maxij) of the jth latitude strip of the ith forest region:
if so, go to step (425);
if not, go to step (426);
(425) if i ∈ [1, N1], output "the fire point is a domestic forest fire" and end the loop; otherwise, output "the fire point is a foreign forest fire" and end the loop;
(426) if j > j_max, go to step (427); otherwise j = j + 1 and return to step (424);
(427) if i ≤ N, set i = i + 1 and return to step (422); if i > N, end the judgment process, output "the fire point is not in a forest region", and end the loop.
5. The forest fire area judgment method according to claim 1, wherein in step (3), the fire point geographic longitude and latitude information is the geographic longitude and latitude information of confirmed fire points; the interaction method between the upper computer system and the fire point sensor comprises:
(31) the fire point sensor outputs the acquired fire point information to the upper computer in the form of TM_FIRE data packets; the fire point information comprises the fire point extraction number, the real fire point flag, and the fire point azimuth information;
(32) the upper computer receives the TM_FIRE data packet input by the fire point sensor, makes a judgment according to the fire point extraction number and the real fire point flag, and, according to the judgment result, either sends no instruction to the fire point sensor, sends a TC_FIREDATA instruction to the fire point sensor, or sends a TC_PRE_FIREDATA instruction to the fire point sensor; the TC_FIREDATA and TC_PRE_FIREDATA instructions comprise the fire point extraction number, the real fire point flag, and the fire point longitude and latitude information; TC_FIREDATA corresponds to a confirmed fire point, and TC_PRE_FIREDATA corresponds to a pre-confirmed fire point;
(33) the fire point sensor receives the TC_FIREDATA instruction input by the upper computer and stores the fire point information it contains into an independent area set aside by the fire point sensor;
the upper computer extracts the fire point information stored by the fire point sensor into its own memory area over the 1553B bus, obtaining the geographic longitude and latitude information of the confirmed fire points.
6. The forest fire area judgment method according to claim 5, wherein in step (31), every period the fire point sensor sequentially outputs TM_FIRE a first time and a second time, corresponding respectively to the pre-confirmed fire point information from the sensor's first recognition and the confirmed fire point information from its second recognition;
when the fire point sensor recognizes a fire point the first time, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point flag is 0; if the second recognition confirms a real fire point, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point flag is 1; if the second recognition finds no real fire point, the fire point extraction number in the second TM_FIRE is 0 and the real fire point flag is 0;
when the fire point sensor recognizes no fire point the first time, the fire point extraction numbers in both the first and second TM_FIRE outputs are 0 and the real fire point flag is 0.
7. The forest fire area judgment method according to claim 5, wherein in step (32), the upper computer receives the fire point information input by the fire point sensor, and the judgment method according to the fire point extraction number and the real fire point flag is:
if the fire point extraction number is 0, the upper computer does not calculate fire point longitude and latitude and does not reply a TC_FIREDATA or TC_PRE_FIREDATA instruction to the fire point sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 0, the upper computer calculates the longitude and latitude information of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the fire point sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 1, the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the fire point sensor.
8. The forest fire area judgment method according to claim 1 or 5, wherein the fire point azimuth information is the unit vector representation (x, y, z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying x² + y² + z² = 1; the sensor's own reference mirror coordinate system is defined as follows: the center of the reference mirror is the coordinate origin, the direction of the sensor's optical axis is the +Z axis, the direction from the center of the reference mirror toward the connector fixing the sensor to the satellite body is the +X axis, and the +Y axis completes a right-handed rectangular coordinate system with the +X and +Z axes.
9. The forest fire area judgment method according to claim 5, wherein in the interaction method between the upper computer system and the fire point sensor, the time of the fire point sensor is synchronized with that of the upper computer;
the upper computer periodically sends a time calibration instruction to the fire point sensor; upon receiving the instruction, the fire point sensor immediately latches its own time and compensates according to the difference between the upper computer time and its own time, keeping the fire point sensor time consistent with the upper computer time;
when the fire point sensor does not receive a time calibration instruction from the upper computer within a certain period, it corrects its own time according to its internal clock.
10. The forest fire area judgment method according to claim 5, wherein the TM_FIRE data packet, TC_FIREDATA instruction and TC_PRE_FIREDATA instruction further comprise fire point area and characteristic information; while determining the fire point occurrence area according to the fire point geographic longitude and latitude information (L_m, λ_m), the latitude ranges determined by λ_maxi and λ_mini, and the longitude ranges determined by L_maxij and L_minij, the upper computer system also outputs the fire point area and characteristic information.
CN202111101112.2A 2021-09-18 2021-09-18 Forest fire area judging method Active CN113946641B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111101112.2A CN113946641B (en) 2021-09-18 2021-09-18 Forest fire area judging method

Publications (2)

Publication Number Publication Date
CN113946641A true CN113946641A (en) 2022-01-18
CN113946641B CN113946641B (en) 2024-07-09


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991681A * 2017-04-11 2017-07-28 Fuzhou University Real-time fire boundary vector information extraction and visualization method and system
CN107633637A * 2017-10-25 2018-01-26 State Grid Hunan Electric Power Co. Power grid mountain fire satellite monitoring alarm positioning method based on two-dimensional table interpolation
WO2020130485A1 * 2018-12-19 2020-06-25 Samsung Electronics Co., Ltd. Electronic device and method for providing V2X service using same


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
NIZAR AL BASSAM et al., "IoT based Autonomous Fire Suppression System For Vehicles", 2020 International Conference on Emerging Trends in Information Technology and Engineering (ic-ETITE), 27 April 2020, pages 1-5 *
Xu Weigang et al., "All-weather space-air-ground forest fire prevention monitoring platform", Southern Forestry Science, vol. 49, no. 4, 31 March 2018, pages 61-63 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant