CN117289724A - Intelligent visual internal attack search and rescue control method and system - Google Patents

Intelligent visual internal attack search and rescue control method and system

Info

Publication number
CN117289724A
Authority
CN
China
Prior art keywords
value
information
distance
entrance
influence
Prior art date
Legal status
Granted
Application number
CN202311587697.2A
Other languages
Chinese (zh)
Other versions
CN117289724B (en)
Inventor
刘灿华
林渠
徐斌
袁亮
王沼阳
余汉培
叶志诚
郑清图
吕志龙
张帅
Current Assignee
Taining County Fire Rescue Brigade
Original Assignee
Taining County Fire Rescue Brigade
Priority date
Filing date
Publication date
Application filed by Taining County Fire Rescue Brigade
Priority to CN202311587697.2A
Publication of CN117289724A
Application granted
Publication of CN117289724B
Legal status: Active
Anticipated expiration


Classifications

    • A: HUMAN NECESSITIES
    • A62: LIFE-SAVING; FIRE-FIGHTING
    • A62B: DEVICES, APPARATUS OR METHODS FOR LIFE-SAVING
    • A62B33/00: Devices for allowing seemingly-dead persons to escape or draw attention; Breathing apparatus for accidentally buried persons

Abstract

The application relates to an intelligent visual internal attack search and rescue control method and system in the technical field of fire rescue. The control method comprises the following steps: acquiring the current position point of the rescue personnel; retrieving house structure information according to the rescue personnel's current position point; analyzing and processing the current position point and the house structure information to form travel initial route information; controlling a plurality of unmanned aerial vehicle groups to detect and acquire fire scene environment detection information along the travel initial route; analyzing and processing the travel initial route information and the fire scene environment detection information to form a travel route difficulty level value; analyzing and processing the fire scene environment detection information to form a fire scene environment influence value; and analyzing and processing the travel initial route information, the travel route difficulty level value and the fire scene environment influence value to form travel route final information, then outputting the travel route final information together with the fire scene environment detection information. The application has the effect of improving the efficiency with which rescue personnel entering an indoor fire scene rescue trapped persons.

Description

Intelligent visual internal attack search and rescue control method and system
Technical Field
The application relates to the technical field of fire rescue, in particular to an intelligent visual internal attack search and rescue control method and system.
Background
Fire rescue refers to actions taken during sudden events such as fires, gas leaks, earthquakes and debris flows, in which professional fire rescue teams and equipment handle the incident promptly in order to protect public and property safety and reduce casualties and property loss.
In the related art, when carrying out fire rescue, firefighters need to enter the fire scene to search for and rescue trapped persons. Thick smoke, flames, low oxygen content, collapsed structures and similar factors make the indoor environment of the fire scene complex, so a firefighter entering the scene must wear protective equipment such as a fire helmet, fire suit, fire boots, fire gloves and an air respirator, and must also carry auxiliary rescue equipment such as a flashlight and a dedicated intercom, using the flashlight to illuminate the scene so that trapped persons can be found and rescued quickly.
With respect to the related art described above, the applicant found the following drawback: during rescue, firefighters entering an indoor fire scene must illuminate it with a handheld flashlight, which makes it difficult to learn the environment and the traveling route within the scene and thus reduces rescue efficiency.
Disclosure of Invention
In order to improve the efficiency with which rescue personnel entering an indoor fire scene rescue trapped persons, the application provides an intelligent visual internal attack search and rescue control method and system.
In a first aspect, the present application provides an intelligent visual internal attack search and rescue control method, which adopts the following technical scheme:
an intelligent visual internal attack search and rescue control method comprises the following steps:
acquiring the current position point of a rescue worker;
according to the current position point of the rescue personnel, house structure information corresponding to the current position point of the rescue personnel is called;
analyzing and processing the current position point of the rescue personnel and house structure information according to a preset house travel route analysis method to form travel initial route information;
controlling a plurality of unmanned aerial vehicle groups to detect and acquire fire scene environment detection information along the route based on the traveling initial route information;
analyzing and processing the traveling initial route information and the fire scene environment detection information according to a preset route difficulty level analysis method to form a traveling route difficulty level value;
analyzing and processing the fire scene environment detection information according to a preset fire scene environment influence analysis method to form a fire scene environment influence value;
and analyzing and processing the travel initial route information, the travel route difficulty level value and the fire scene environment influence value according to a preset travel route selection analysis method to form travel route final information, and outputting the travel route final information and the fire scene environment detection information.
According to the technical scheme, the current position point of the rescue personnel is acquired and the house structure information is retrieved; the current position point and the house structure information are analyzed and processed by the house travel route analysis method to form travel initial route information; a plurality of unmanned aerial vehicle groups detect and acquire fire scene environment detection information along the travel initial route; the travel initial route information and the fire scene environment detection information are analyzed and processed by the route difficulty level analysis method to form a travel route difficulty level value; the fire scene environment detection information is analyzed and processed by the fire scene environment influence analysis method to form a fire scene environment influence value; and the travel route selection analysis method forms the travel route final information, which is output together with the fire scene environment detection information. Rescue personnel entering an indoor fire scene can therefore conveniently learn the environment and the traveling route within the scene, which improves their rescue efficiency.
Optionally, the analyzing the travel initial route information and the fire scene environment detection information according to the preset route difficulty level analysis method to form the travel route difficulty level value includes:
according to the traveling initial route information, retrieving the special road section distance value and the room entrance position points corresponding to it;
based on the fire scene environment detection information, acquiring the portion corresponding to the room entrance position points and taking it as the entrance environment detection information;
analyzing and processing the entrance environment detection information according to a preset entrance influence analysis method to form an entrance influence value;
analyzing and processing the fire scene environment detection information according to a preset light position extraction method to form special light position points;
according to the special light position points and the traveling initial route information, analyzing and calculating the distance value between the special light position points and the traveling initial route and taking it as the special light distance value;
according to a preset travel route difficulty level value calculation formula, analyzing and calculating the special road section distance value, the entrance influence value and the special light distance value to form the travel route difficulty level value, wherein the preset travel route difficulty level value calculation formula is: A = a·[(A1 − A0)/L] + b·A2 + c·(u·A3);
wherein A is the travel route difficulty level value;
a is the weight of the special road section distance value;
A1 is the special road section distance value;
A0 is a preset special road section reference distance value;
L is the preset total length of the initial travel route;
b is the weight of the entrance influence value;
A2 is the entrance influence value;
c is the weight of the special light distance value;
u is a scaling factor of the special light distance value;
A3 is the special light distance value;
a + b + c = 1.
According to the technical scheme, the special road section distance value and the room entrance position points are retrieved from the travel initial route information; the fire scene environment detection information corresponding to the room entrance position points is retrieved and taken as the entrance environment detection information; the entrance environment detection information is analyzed and processed by the entrance influence analysis method to form an entrance influence value; the fire scene environment detection information is analyzed and processed by the light position extraction method to form special light position points; the distance value between the special light position points and the travel initial route is analyzed and calculated and taken as the special light distance value; and the special road section distance value, the entrance influence value and the special light distance value are analyzed and calculated by the travel route difficulty level value calculation formula to form the travel route difficulty level value. The travel route difficulty level value is thus influenced by the special road section distance value, the entrance influence value and the special light distance value, which improves the accuracy of the acquired value.
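The difficulty formula above can be evaluated directly. The following is an illustrative Python sketch, not code from the application: the weight values a, b, c and the scale factor u are placeholder assumptions (the application only requires a + b + c = 1).

```python
def route_difficulty(A1, A0, L, A2, A3, a=0.4, b=0.3, c=0.3, u=1.0):
    """Travel route difficulty level value: A = a*[(A1 - A0)/L] + b*A2 + c*(u*A3).

    A1: special road section distance value
    A0: preset special road section reference distance value
    L:  preset total length of the initial travel route
    A2: entrance influence value
    A3: special light distance value
    """
    assert abs(a + b + c - 1.0) < 1e-9, "weights must sum to 1"
    return a * ((A1 - A0) / L) + b * A2 + c * (u * A3)
```

With the placeholder weights, a route whose special road sections exceed the reference by 20 m over a 100 m route, with entrance influence 0.5 and special light distance 2.0, scores 0.4·0.2 + 0.3·0.5 + 0.3·2.0 = 0.83.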
Optionally, the analyzing the entrance environment detection information according to the preset entrance influence analysis method to form the entrance influence value includes:
according to the entrance environment detection information, acquiring the unmanned aerial vehicle group flight strategy information and the unmanned aerial vehicle infrared distance detection values corresponding to it, an unmanned aerial vehicle group being defined as a single mother drone and a plurality of child drones;
analyzing and acquiring entrance initial detection information according to the matching result of the unmanned aerial vehicle group flight strategy information and preset flight reference strategy information;
according to the corresponding relation between the infrared distance detection value of the unmanned aerial vehicle and the preset entrance detection adjustment information, the entrance detection adjustment information corresponding to the infrared distance detection value of the unmanned aerial vehicle is obtained through analysis;
according to the corresponding relation between the entrance initial detection information, the entrance detection adjustment information and the preset entrance detection correction information, analyzing and obtaining entrance detection correction information corresponding to the entrance initial detection information and the entrance detection adjustment information;
according to a preset entrance deviation influence analysis method, the entrance detection correction information is analyzed to form an entrance deviation influence value, and the entrance deviation influence value is used as an entrance influence value.
According to the technical scheme, the unmanned aerial vehicle group flight strategy information and the unmanned aerial vehicle infrared distance detection values are acquired from the entrance environment detection information; the entrance initial detection information is acquired by matching the flight strategy information against the flight reference strategy information; the entrance detection adjustment information is acquired by analyzing the infrared distance detection values; the entrance detection correction information is acquired by analyzing the entrance initial detection information and the entrance detection adjustment information; and the entrance deviation influence analysis method processes the entrance detection correction information to form an entrance deviation influence value, which is taken as the entrance influence value. The accuracy of the acquired entrance influence value is thereby improved.
Optionally, the method executed when the unmanned aerial vehicle group flight strategy information is acquired comprises the following steps:
acquiring the unmanned aerial vehicle infrared distance detection value;
according to the unmanned aerial vehicle infrared distance detection value and a preset unmanned aerial vehicle infrared distance reference value, analyzing and calculating the difference between the two and taking it as the unmanned aerial vehicle infrared distance deviation value;
according to the corresponding relation between the unmanned aerial vehicle infrared distance deviation value and preset unmanned aerial vehicle position adjustment information, analyzing and acquiring the position adjustment information corresponding to the deviation value, and outputting the unmanned aerial vehicle position adjustment information;
acquiring a plurality of mother-child distance detection values between the single mother drone and the plurality of child drones;
according to the plurality of mother-child distance detection values, analyzing and calculating their average and taking it as the mother-child average distance value;
according to the mother-child distance detection values and the mother-child average distance value, analyzing and calculating the difference between each detection value and the average and taking it as a mother-child distance adjustment value;
according to the corresponding relation between the mother-child distance adjustment value and preset mother-child distance adjustment information, analyzing and acquiring the distance adjustment information corresponding to the adjustment value, and outputting the mother-child distance adjustment information.
According to the technical scheme, the unmanned aerial vehicle infrared distance detection value is acquired; its difference from the infrared distance reference value is calculated as the infrared distance deviation value, from which the unmanned aerial vehicle position adjustment information is obtained; the mother-child distance detection values are acquired and averaged to form the mother-child average distance value; the difference between each detection value and the average is calculated as a mother-child distance adjustment value, from which the corresponding mother-child distance adjustment information is obtained and output. This facilitates controlling the flight positions of the mother drone and the child drones.
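The mother-child distance averaging step can be sketched as follows; the plain-list representation and the raw deviation output are assumptions, since the application maps each adjustment value onto preset adjustment information rather than using it directly.

```python
def mother_child_adjustments(distances):
    """Given the detected distances between the mother drone and each child
    drone, return the fleet mean (mother-child average distance value) and
    each child's deviation from it (mother-child distance adjustment value)."""
    mean = sum(distances) / len(distances)
    return mean, [d - mean for d in distances]
```

For example, detected distances of 8 m, 10 m and 12 m give an average of 10 m and adjustment values of −2 m, 0 m and +2 m, so the first and third child drones would be steered toward the mean spacing.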
Optionally, the analyzing the entrance detection correction information according to the preset entrance deviation influence analysis method to form the entrance deviation influence value includes:
according to the entrance detection correction information, an entrance area detection value corresponding to the entrance detection correction information and entrance shape detection information are called;
according to the entrance area detection value and a preset entrance area reference value, analyzing and calculating a difference value between the entrance area detection value and the entrance area reference value and taking the difference value as an entrance area deviation value;
according to the corresponding relation between the entrance area deviation value and the preset entrance area influence value, analyzing and obtaining an entrance area influence value corresponding to the entrance area deviation value;
according to the entrance shape detection information and preset entrance shape reference information, analyzing and obtaining deviation information between the entrance shape detection information and the entrance shape reference information and taking the deviation information as entrance shape deviation information;
according to the corresponding relation between the entrance shape deviation information and the preset entrance shape influence value, analyzing and obtaining an entrance shape influence value corresponding to the entrance shape deviation information;
and analyzing and acquiring the entrance comprehensive influence value corresponding to the entrance area influence value and the entrance shape influence value according to the corresponding relation between the entrance area influence value, the entrance shape influence value and the preset entrance comprehensive influence value, and taking the entrance comprehensive influence value as an entrance deviation influence value.
By adopting the technical scheme, the entrance area detection value and the entrance shape detection information are retrieved from the entrance detection correction information; the difference between the entrance area detection value and the entrance area reference value is calculated as the entrance area deviation value, from which the entrance area influence value is obtained; the entrance shape influence value is obtained by analyzing the entrance shape deviation information; the entrance comprehensive influence value is obtained from the entrance area influence value and the entrance shape influence value and is taken as the entrance deviation influence value. The entrance deviation influence value is thus influenced by both the entrance area and the entrance shape, which improves its accuracy.
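The correspondence-table lookups described above can be sketched with threshold buckets. The bucket structure, the numeric shape-deviation input and the equal-weight combination are all assumptions, since the application leaves these as preset correspondences.

```python
def bucket_lookup(value, buckets):
    """Map a deviation value to a preset influence value via threshold buckets.
    `buckets` is a list of (upper_bound, influence) pairs sorted by bound."""
    for upper, influence in buckets:
        if value <= upper:
            return influence
    return buckets[-1][1]

def entrance_deviation_influence(area_dev, shape_dev, area_buckets, shape_buckets):
    # The application combines the two influence values via a preset
    # correspondence; an equal-weight sum is one plausible stand-in (assumption).
    return 0.5 * bucket_lookup(area_dev, area_buckets) + \
           0.5 * bucket_lookup(shape_dev, shape_buckets)
```

With buckets [(1.0, 0.1), (2.0, 0.5), (inf, 1.0)] for both tables, an area deviation of 1.5 and a shape deviation of 0.5 yield 0.5·0.5 + 0.5·0.1 = 0.3.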
Optionally, the analyzing the fire scene environment detection information according to the preset fire scene environment influence analysis method to form the fire scene environment influence value includes:
according to the fire scene environment detection information, retrieving the ambient temperature detection information and the trapped person detection information corresponding to it;
according to a preset ambient temperature influence analysis method, analyzing and processing the ambient temperature detection information to form an ambient temperature influence value;
analyzing the trapped person detection information and the traveling initial route information according to a preset trapped person influence analysis method to form a trapped person influence value;
according to the corresponding relation between the ambient temperature influence value, the trapped person influence value and the preset environment comprehensive influence value, analyzing and acquiring the environment comprehensive influence value corresponding to the ambient temperature influence value and the trapped person influence value, and taking the environment comprehensive influence value as the fire scene environment influence value.
By adopting the technical scheme, the ambient temperature detection information and the trapped person detection information are retrieved from the fire scene environment detection information; the ambient temperature detection information is analyzed and processed by the ambient temperature influence analysis method to form an ambient temperature influence value; the trapped person detection information and the travel initial route information are analyzed and processed by the trapped person influence analysis method to form a trapped person influence value; the environment comprehensive influence value is obtained from these two values and taken as the fire scene environment influence value. The fire scene environment influence value is thus influenced by both the ambient temperature influence value and the trapped person influence value, which improves its accuracy.
Optionally, the analyzing the ambient temperature detection information according to the preset ambient temperature influence analysis method to form the ambient temperature influence value includes:
according to the ambient temperature detection information, retrieving the high-temperature center position point, the high-temperature center temperature value and the low-temperature coverage area corresponding to it;
according to the high-temperature center position point and the low-temperature coverage area, analyzing and calculating the furthest distance value between them and taking it as the high-temperature coverage distance actual value;
according to the high-temperature center temperature value, retrieving the high-temperature coverage distance reference value corresponding to it;
according to the high-temperature coverage distance reference value and the high-temperature coverage distance actual value, analyzing and calculating the difference between them and taking it as the temperature change moving distance value;
analyzing and processing the low-temperature coverage area according to a preset object matching method to form low-temperature area object type information and low-temperature area object position points;
according to the corresponding relation between the low-temperature area object type information and preset object type influence values, analyzing and acquiring the object type influence value corresponding to the low-temperature area object type information;
according to the low-temperature area object position points and the high-temperature center position point, analyzing and calculating the distance values between them and taking each as an object high-temperature center distance value;
according to the object high-temperature center distance value and the high-temperature coverage distance reference value, analyzing and calculating the difference between them and taking it as the object position deviation value;
according to a preset ambient temperature influence value calculation formula, analyzing and processing the temperature change moving distance value, the object type influence value and the object position deviation value to form the ambient temperature influence value, wherein the ambient temperature influence value calculation formula is: B = d·(v·B1) + e·B2 + f·(w·B3);
wherein B is the ambient temperature influence value;
d is the weight of the temperature change moving distance value;
v is a scaling factor of the temperature change moving distance value;
B1 is the temperature change moving distance value;
e is the weight of the object type influence value;
B2 is the object type influence value;
f is the weight of the object position deviation value;
w is a scaling factor of the object position deviation value;
B3 is the object position deviation value;
d + e + f = 1.
According to the technical scheme, the high-temperature center position point, the high-temperature center temperature value and the low-temperature coverage area are retrieved from the ambient temperature detection information; the furthest distance between the high-temperature center position point and the low-temperature coverage area is calculated as the high-temperature coverage distance actual value; the high-temperature coverage distance reference value is retrieved from the high-temperature center temperature value, and its difference from the actual value is calculated as the temperature change moving distance value; the low-temperature coverage area is processed by the object matching method to form the low-temperature area object type information and object position points; the object type influence value is obtained from the object type information; the distance between each object position point and the high-temperature center position point is calculated as the object high-temperature center distance value, and its difference from the high-temperature coverage distance reference value is taken as the object position deviation value; finally, the ambient temperature influence value calculation formula processes the temperature change moving distance value, the object type influence value and the object position deviation value to form the ambient temperature influence value, which improves the accuracy of the acquired value.
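The ambient temperature influence formula can be evaluated directly; the weights d, e, f and scale factors v, w below are illustrative placeholders (the application only requires d + e + f = 1).

```python
def ambient_temperature_influence(B1, B2, B3, d=0.5, e=0.25, f=0.25, v=1.0, w=1.0):
    """Ambient temperature influence value: B = d*(v*B1) + e*B2 + f*(w*B3).

    B1: temperature change moving distance value
    B2: object type influence value
    B3: object position deviation value
    """
    assert abs(d + e + f - 1.0) < 1e-9, "weights must sum to 1"
    return d * (v * B1) + e * B2 + f * (w * B3)
```

For instance, a temperature change moving distance of 2.0, object type influence of 4.0 and object position deviation of 8.0 give 0.5·2 + 0.25·4 + 0.25·8 = 4.0 under these placeholder weights.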
Optionally, the analyzing the trapped person detection information and the traveling initial route information according to the preset trapped person influence analysis method to form the trapped person influence value includes:
according to the trapped person detection information, retrieving the trapped person number value and the trapped person position points corresponding to it;
according to the position points of each trapped person, analyzing and calculating the distance value between the position points of each trapped person and taking the distance value as the distance value between the trapped persons;
according to the trapped person position points and the traveling initial route information, analyzing and calculating distance values between the trapped person position points and the traveling initial route information and taking the distance values as trapped person route distance values;
according to the trapped person position points and the low-temperature area object position points, analyzing and calculating distance values between the trapped person position points and the low-temperature area object position points and taking the distance values as trapped person object distance values;
analyzing and calculating the trapped person number value, the trapped person distance value, the trapped person route distance value and the trapped person object distance value according to a preset trapped person influence value calculation formula to form the trapped person influence value, wherein the trapped person influence value calculation formula is: C = g·(r·C2/C1) + h·(s·C3) + i·(t·C4);
wherein C is the trapped person influence value;
g is the weight of the trapped person density;
r is a scaling factor of the trapped person density;
C1 is the trapped person number value;
C2 is the trapped person distance value;
h is the weight of the trapped person route distance value;
s is a scaling factor of the trapped person route distance value;
C3 is the trapped person route distance value;
i is the weight of the trapped person object distance value;
t is a scaling factor of the trapped person object distance value;
C4 is the trapped person object distance value;
g + h + i = 1.
By adopting the technical scheme, the trapped person number value and the trapped person position points are retrieved from the trapped person detection information; the distance values between the trapped person position points are calculated as the trapped person distance value; the distance values between the trapped person position points and the travel initial route are calculated as the trapped person route distance value; the distance values between the trapped person position points and the low-temperature area object position points are calculated as the trapped person object distance value; and the trapped person influence value calculation formula processes these four values to form the trapped person influence value, which improves the accuracy of the acquired trapped person influence value.
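The trapped person influence formula can also be evaluated directly; the equal weights and unit scale factors below are placeholder assumptions (the application only requires g + h + i = 1).

```python
def trapped_person_influence(C1, C2, C3, C4,
                             g=1/3, h=1/3, i=1/3, r=1.0, s=1.0, t=1.0):
    """Trapped person influence value: C = g*(r*C2/C1) + h*(s*C3) + i*(t*C4).

    C1: trapped person number value
    C2: trapped person distance value (spread between trapped persons)
    C3: trapped person route distance value
    C4: trapped person object distance value
    """
    assert abs(g + h + i - 1.0) < 1e-9, "weights must sum to 1"
    return g * (r * C2 / C1) + h * (s * C3) + i * (t * C4)
```

Note that C2/C1 shrinks as the head count C1 grows for a fixed spread C2, so the first term falls as trapped persons cluster more densely.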
Optionally, the analyzing the initial travel route information, the travel route difficulty level value and the fire scene environmental impact value according to the preset travel route selection analysis method to form the travel route final information includes:
acquiring a rescue personnel number value;
analyzing and calculating the travelling route difficulty level value and the fire scene environment influence value according to a preset travelling route influence value calculation formula to form a travelling route influence value;
forward sorting is carried out based on the travel route influence values, and the travel initial route information corresponding to the top N travel route influence values in the forward sorting is selected as the travel route final information, where N is the rescue personnel number value and the forward sorting is defined as sorting the travel route influence values from largest to smallest;
the travel route influence value calculation formula is: z = j·a + k·b + l·c;
z is a travel route influence value;
j is the weight value of the difficulty level value of the travelling route;
a is the difficulty level value of the travelling route;
k is a weight value of an environmental temperature influence value;
b is an ambient temperature influence value;
l is the weight value of the influence value of the trapped person;
c is the influence value of trapped people;
j+k+l=1, and l > k > j.
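The scoring-and-selection step described above can be sketched as follows; the route data layout and the concrete weight values are illustrative assumptions (the patent only requires j + k + l = 1 and l > k > j):

```python
def select_final_routes(routes, n_rescuers, j=0.1, k=0.3, l=0.6):
    """Score each candidate route with z = j*a + k*b + l*c and keep the
    top-N routes (N = rescue personnel number value), sorted from
    largest to smallest z ("forward sorting" in the text).

    Each route is a dict with keys:
      'a' - travel route difficulty level value
      'b' - ambient temperature influence value
      'c' - trapped person influence value
    """
    assert abs(j + k + l - 1.0) < 1e-9 and l > k > j
    scored = [(j * rt['a'] + k * rt['b'] + l * rt['c'], rt) for rt in routes]
    scored.sort(key=lambda pair: pair[0], reverse=True)  # forward sorting
    return [rt for _, rt in scored[:n_rescuers]]
```

For example, with three candidate routes and two rescue personnel, the two routes with the largest z values are returned as the travel route final information.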
By adopting the above technical scheme, the rescue personnel number value is acquired, and the travel route difficulty level value and the fire scene environment influence value are analyzed and calculated through the travel route influence value calculation formula to form the travel route influence value. The travel route influence values are then forward sorted, and the travel initial route information corresponding to the top travel route influence values, as many as the rescue personnel number value, is selected as the travel route final information, so that the accuracy of the acquired travel route final information is improved.
In a second aspect, the application provides an intelligent visual internal attack search and rescue system, which adopts the following technical scheme:
an intelligent visual internal attack search and rescue system, comprising:
the acquisition module is used for acquiring the current position point of the rescue personnel, the fire scene environment detection information, the infrared distance detection value of the unmanned aerial vehicle and the distance detection value of the unmanned mother-son machine;
a memory for storing a program of the intelligent visualization internal attack search and rescue control method according to any one of the first aspects;
a processor, wherein the program in the memory is capable of being loaded by the processor to implement the intelligent visual internal attack search and rescue control method according to any one of the first aspect.
By adopting the above technical scheme, the acquisition module acquires the current position point of the rescue personnel, the fire scene environment detection information, the unmanned son machine infrared distance detection value and the unmanned mother-son machine distance detection value, and the processor loads and executes the program in the memory, so that rescue personnel entering an indoor fire scene can conveniently know the environment and the travel route in the fire scene, thereby improving the rescue efficiency for trapped persons.
In summary, the present application includes at least one of the following beneficial technical effects:
1. The current position point of the rescue personnel is acquired, the house structure information is acquired, and the current position point and the house structure information are analyzed and processed through the house travel route analysis method to form the travel initial route information. The fire scene environment detection information is then detected and acquired along the travel initial route through a plurality of groups of unmanned aerial vehicle units; the travel initial route information and the fire scene environment detection information are analyzed and processed through the route difficulty level analysis method to form the travel route difficulty level value; the fire scene environment detection information is analyzed and processed through the fire scene environment influence analysis method to form the fire scene environment influence value; and the travel initial route information, the travel route difficulty level value and the fire scene environment influence value are analyzed and processed through the travel route selection analysis method to form the travel route final information. Finally, the travel route final information and the fire scene environment detection information are output, so that rescue personnel entering an indoor fire scene can conveniently know the environment and the travel route in the fire scene, and the rescue efficiency for trapped persons is improved;
2. The special road section distance value and the room entrance/exit position points are retrieved through the travel initial route information, and the fire scene environment detection information corresponding to the room entrance/exit position points is retrieved and used as the entrance/exit environment detection information. The entrance/exit environment detection information is analyzed and processed through the entrance/exit influence analysis method to form the entrance/exit influence value; the fire scene environment detection information is analyzed and processed through the brightness position extraction method to form the special brightness position points; and the distance value between the special brightness position points and the travel initial route information is analyzed and calculated and used as the special brightness distance value. The special road section distance value, the entrance/exit influence value and the special brightness distance value are then analyzed and calculated through the travel route difficulty level value calculation formula to form the travel route difficulty level value, so that the acquired difficulty level value reflects all three factors and its accuracy is improved;
3. The unmanned aerial vehicle unit flight strategy information and the unmanned aerial vehicle infrared distance detection value are retrieved through the entrance/exit environment detection information. The entrance/exit initial detection information is analyzed and acquired through the matching result of the flight strategy information and the flight reference strategy information; the entrance/exit detection adjustment information is analyzed and acquired through the unmanned aerial vehicle infrared distance detection value; and the entrance/exit detection correction information is analyzed and acquired through the entrance/exit initial detection information and the entrance/exit detection adjustment information. The entrance/exit detection correction information is then analyzed and processed through the entrance/exit deviation influence analysis method to form the entrance/exit deviation influence value, which is used as the entrance/exit influence value, so that the accuracy of the acquired entrance/exit influence value is improved.
Drawings
FIG. 1 is a flow chart of a method for intelligent visualization internal attack search and rescue control according to an embodiment of the present application.
Fig. 2 is a flowchart of a method for analyzing and processing travel initial route information and fire scene environment detection information to form a travel route difficulty level value according to a preset route difficulty level analysis method in an embodiment of the present application.
Fig. 3 is a flowchart of a method for analyzing and processing the entrance environment detection information to form an entrance influence value according to a preset entrance influence analysis method in an embodiment of the present application.
Fig. 4 is a flowchart of a method performed when acquiring flight strategy information of an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 5 is a flowchart of a method for analyzing the entrance/exit detection correction information to form an entrance/exit deviation influence value according to a preset entrance/exit deviation influence analysis method according to an embodiment of the present application.
Fig. 6 is a flowchart of a method for analyzing fire environment detection information to form fire environment influence values according to a preset fire environment influence analysis method according to an embodiment of the present application.
Fig. 7 is a flowchart of a method for analyzing environmental temperature detection information to form an environmental temperature influence value according to a preset environmental temperature influence analysis method in an embodiment of the present application.
FIG. 8 is a flowchart of a method for analyzing trapped person detection information and travel initial route information to form a trapped person influence value according to a preset trapped person influence analysis method according to an embodiment of the present application.
Fig. 9 is a flowchart of a method for analyzing and processing initial travel route information, a travel route difficulty level value and a fire scene environmental impact value according to a preset travel route selection analysis method to form final travel route information according to an embodiment of the present application.
FIG. 10 is a system flow diagram of intelligent visualization internal attack search and rescue according to an embodiment of the present application.
Reference numerals illustrate: 1. an acquisition module; 2. a memory; 3. a processor.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to fig. 1 to 10 and the embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The embodiment of the application discloses an intelligent visual internal attack search and rescue control method.
Referring to fig. 1, an intelligent visual internal attack search and rescue control method includes:
step S100, obtaining the current position point of the rescue personnel.
The current position point of the rescue personnel refers to the position point, at the current time, of the personnel who have entered the fire scene to rescue trapped persons. The current position point of the rescue personnel is detected and obtained through a GPS positioning device preset on the rescue personnel.
Step S200, house structure information corresponding to the current position point of the rescue personnel is called according to the current position point of the rescue personnel.
The house structure information refers to layout structure information of each room inside a house where a rescue worker is located, and the house structure information is inquired and obtained from a database storing the house structure information.
The house structure information is retrieved through the current position point of the rescue personnel, so that the house structure information can be conveniently used subsequently.
Step S300, analyzing and processing the current position point of the rescue personnel and the house structure information according to a preset house travel route analysis method to form travel initial route information.
The house travel route analysis method is an analysis method for analyzing and acquiring a travel route in a house, and the house travel route analysis method is obtained by inquiring a database storing house travel route analysis methods. The travel initial route information refers to initial route information when traveling in a house.
The current position point of the rescue personnel and the house structure information are analyzed and processed by the house travel route analysis method, so that the travel initial route information is formed and can be conveniently used subsequently. The house travel route analysis method specifically comprises: retrieving the entrance/exit position points and the shape information of each room in the house through the house structure information; determining an observation reference position section according to the shape information of the room; when the entrance/exit position point is not located in the observation reference position section, determining the observation position point of the room according to the shape information of the room and the entrance/exit position point; and connecting the observation position points of the rooms with the entrance/exit position points according to the adjacent connection principle to form the travel initial route information.
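The route-assembly step of the house travel route analysis method can be sketched as follows; the room ordering, the point representation and the rule for when an observation point is present are illustrative assumptions, since the patent does not specify them:

```python
def build_initial_route(start_point, rooms):
    """Connect the rescue personnel's current position with each room's
    entrance/exit position point and (when needed) an in-room
    observation position point, visiting rooms in adjacency order
    ("adjacent connection principle").

    Each room is a dict:
      'entrance'    - (x, y) entrance/exit position point
      'observation' - (x, y) observation position point, or None when
                      the entrance already lies in the observation
                      reference position section
    Returns the travel initial route as an ordered list of points.
    """
    route = [start_point]
    for room in rooms:  # rooms are assumed pre-sorted by adjacency
        route.append(room['entrance'])
        if room['observation'] is not None:
            route.append(room['observation'])
    return route
```

The resulting polyline of position points is what the later steps treat as the travel initial route information.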
Step S400, detecting and acquiring fire scene environment detection information along the lines of a plurality of groups of unmanned aerial vehicle based on the traveling initial route information.
The fire scene environment detection information is detection information for detecting the environment inside a fire scene, and the fire scene environment detection information flies along a travel initial route corresponding to the travel initial route information through a plurality of groups of unmanned aerial vehicle groups and is detected and acquired by sensors preset on the unmanned aerial vehicle groups. The fire scene environment detection information comprises detection information such as picture detection information, temperature detection information, illumination intensity detection information and the like in the fire scene.
The unmanned aerial vehicle sets fly and detect along the traveling initial route corresponding to the traveling initial route information, so that the fire scene environment detection information is acquired, and the subsequent use of the fire scene environment detection information is facilitated.
Step S500, according to the preset route difficulty level analysis method, the traveling initial route information and the fire scene environment detection information are analyzed and processed to form a traveling route difficulty level value.
The route difficulty level analysis method is an analysis method for analyzing the difficulty level of a travel route, and is obtained by inquiring a database storing the route difficulty level analysis method. The travel route difficulty level value refers to a level value indicating the difficulty level of the travel route.
The travel initial route information and the fire scene environment detection information are analyzed and processed through the route difficulty level analysis method, so that a travel route difficulty level value is formed, and the travel route difficulty level value is convenient to use subsequently.
Step S600, analyzing and processing the fire scene environment detection information according to a preset fire scene environment influence analysis method to form a fire scene environment influence value.
The fire scene environment influence analysis method is used for analyzing the influence degree of the environment in the fire scene on the travelling route, and is obtained by inquiring a database storing the fire scene environment influence analysis method. The fire scene environment influence value refers to a degree value indicating the degree of influence of the environment in the fire scene on the travel route.
The fire scene environment detection information is analyzed and processed through the fire scene environment influence analysis method, so that a fire scene environment influence value is formed, and the subsequent use of the fire scene environment influence value is facilitated.
Step S700, according to the preset travel route selection analysis method, the travel initial route information, the travel route difficulty level value and the fire scene environment influence value are analyzed to form the travel route final information, and the travel route final information and the fire scene environment detection information are output.
The route selection analysis method refers to an analysis method for selecting and analyzing a route, and the route selection analysis method is obtained by searching a database storing the route selection analysis method. The travel route final information refers to final route information for traveling in the house.
The method comprises the steps of analyzing and processing the information of an initial travel route, the difficulty level value of the travel route and the environment influence value of a fire scene through a travel route selection analysis method, so that final information of the travel route is formed, and final information of the travel route and fire scene environment detection information are output, so that rescue workers can know the environment in the fire scene and the travel route, and further the rescue efficiency of the rescue workers entering the indoor fire scene to trapped people is improved.
In step S500 shown in fig. 1, in order to further secure the rationality of the travel route difficulty level value, further individual analysis and calculation of the travel route difficulty level value are required, and specifically, the steps shown in fig. 2 will be described in detail.
Referring to fig. 2, the method for analyzing the travel initial route information and the fire scene environment detection information according to the preset route difficulty level analysis method to form a travel route difficulty level value includes the following steps:
step S510, the special road distance value and the room entrance and exit position point corresponding to the travel initial route information are called according to the travel initial route information.
The special road section distance value is used for indicating the distance value occupied by a special road section such as a stair in a travelling route, and the special road section distance value is inquired and obtained from a database storing the special road section distance value. The room entrance/exit position point is a position point indicating a room entrance/exit that needs to be passed through in the travel route.
The distance value of the special road section and the position points of the entrance and the exit of the room are called through the information of the initial travel route, so that the distance value of the special road section and the position points of the entrance and the exit of the room are convenient to use subsequently.
Step S520, retrieving fire scene environment detection information corresponding to the room entrance/exit position points based on the fire scene environment detection information, and using the retrieved fire scene environment detection information as the entrance/exit environment detection information.
The entrance/exit environment detection information is detection information for detecting an environment in which an entrance/exit of a room is located. The fire scene environment detection information corresponding to the room entrance and exit position points is retrieved, and the fire scene environment detection information corresponding to the room entrance and exit position points is used as the entrance and exit environment detection information, so that the follow-up use of the entrance and exit environment detection information is facilitated.
In step S530, according to the preset entrance/exit influence analysis method, the entrance/exit environment detection information is analyzed to form an entrance influence value.
The entrance/exit influence analysis method is an analysis method for analyzing the degree of influence of the entrance on the travel route, and is obtained by searching a database storing the entrance/exit influence analysis method. The entrance influence value is a value indicating the degree to which the entrance has an influence on the travel route.
And the inlet and outlet environment detection information is analyzed and processed through an inlet and outlet influence analysis method, so that an inlet and outlet influence value is formed, and the subsequent use of the inlet and outlet influence value is facilitated.
Step S540, according to the preset bright position extraction method, analyzing and processing the fire scene environment detection information to form a special bright position point.
The bright position extraction method refers to an analysis method for extracting the bright position, and the bright position extraction method is obtained by inquiring a database storing the bright position extraction method. The special light position point means a position point for indicating that the environment in the fire has special light.
The fire scene environment detection information is analyzed and processed through the brightness position extraction method, so that the special brightness position points are formed and can be conveniently used subsequently. The brightness position extraction method specifically comprises: comparing the illumination intensity detection information in the fire scene environment detection information with a preset illumination intensity reference interval, so as to obtain the picture detection information corresponding to the illumination intensity detection information that is not located in the illumination intensity reference interval and use it as picture illumination anomaly detection information; and then confirming a special brightness position point when the same special brightness location corresponds to a plurality of pieces of picture illumination anomaly detection information detected during the flight of the unmanned aerial vehicle unit.
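The brightness position extraction method just described can be sketched as follows; the interval bounds, the sample encoding and the hit threshold used to "confirm" a point are illustrative assumptions:

```python
from collections import Counter

def extract_special_brightness(samples, low=100.0, high=500.0, min_hits=3):
    """Flag detections whose illumination intensity lies outside the
    reference interval [low, high] as picture illumination anomaly
    detection information, then confirm a special brightness position
    point when the same location is flagged at least `min_hits` times
    during the flight.

    `samples` is a list of (position_point, illumination_intensity)
    pairs collected along the unmanned aerial vehicle unit's flight.
    """
    hits = Counter(pos for pos, lux in samples if not (low <= lux <= high))
    return [pos for pos, n in hits.items() if n >= min_hits]
```

A location that appears only once outside the interval is treated as noise and not confirmed, matching the "plurality of detections" requirement in the text.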
Step S550, according to the special brightness position point and the traveling initial route information, analyzing and calculating a distance value between the special brightness position point and the traveling initial route information and taking the distance value as the special brightness distance value.
Wherein, the special light distance value refers to the shortest distance value between the position point of the detected abnormal illumination intensity and the initial route when the house is in the process of traveling.
And analyzing and calculating the distance value between the special bright position point and the traveling initial route information through the special bright position point and the traveling initial route information, and taking the distance value between the special bright position point and the traveling initial route information as the special bright distance value, so that the special bright distance value is convenient to use subsequently.
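Treating the travel initial route as a polyline of position points, the special brightness distance value of step S550 is a shortest point-to-polyline distance; the following is a geometric sketch under that assumption:

```python
import math

def special_brightness_distance(point, route):
    """Shortest distance from a special brightness position point to the
    travel initial route, where `route` is a polyline given as a list
    of (x, y) position points."""
    def seg_dist(p, a, b):
        # distance from p to the segment a-b
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))  # clamp projection onto the segment
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(seg_dist(point, a, b) for a, b in zip(route, route[1:]))
```

This is the "shortest distance value between the position point of the detected abnormal illumination intensity and the initial route" described above.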
Step S560, according to a preset travel route difficulty level value calculation formula, the special road section distance value, the entrance/exit influence value and the special brightness distance value are analyzed and calculated to form the travel route difficulty level value.
The travel route difficulty level value refers to a comprehensive influence value of various factors when an initial route during travel in a house is influenced. The travel route difficulty level value calculation formula is a calculation formula for analyzing and obtaining a travel route difficulty level value, and the travel route difficulty level value calculation formula is obtained by inquiring a database storing the travel route difficulty level value calculation formula.
The preset travel route difficulty level value calculation formula is as follows: a=a [ (A1-A0)/L ] +ba2+c (uA 3);
wherein a is a travelling route difficulty level value, a is a weight of a special road section distance value, A1 is a special road section distance value, A0 is a preset special road section reference distance value, L is a preset travelling initial route total length, b is a weight of an entrance influence value, A2 is an entrance influence value, c is a weight of a special light distance value, u is a scale factor of a special light distance value, A3 is a special light distance value, and a+b+c=1.
For example, a=0.3, a1=10, a0= 5,L =50, b=0.4, a2=0.6, c=0.3, u=0.1, a3=2, then a=a [ (A1-a0)/L ] +ba2+c (uA 3) =0.3 [ (10-5)/50 ] +0.24+0.06=0.33.
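The worked example above can be checked with a short sketch; the weights default to the example's values and are not fixed by the patent in general:

```python
def route_difficulty(A1, A0, L, A2, A3, a=0.3, b=0.4, c=0.3, u=0.1):
    """Travel route difficulty level value
    A = a*[(A1 - A0)/L] + b*A2 + c*(u*A3), with a + b + c = 1.

    A1: special road section distance value, A0: its reference value,
    L: travel initial route total length, A2: entrance/exit influence
    value, A3: special brightness distance value.
    """
    assert abs(a + b + c - 1.0) < 1e-9, "weights must sum to 1"
    return a * ((A1 - A0) / L) + b * A2 + c * (u * A3)
```

Plugging in the example's numbers (A1 = 10, A0 = 5, L = 50, A2 = 0.6, A3 = 2) reproduces A = 0.33.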
In step S530 shown in fig. 2, in order to further secure the rationality of the entrance influence value, further individual analysis calculation of the entrance influence value is required, and specifically, the steps shown in fig. 3 will be described in detail.
Referring to fig. 3, according to a preset doorway influence analysis method, performing analysis processing on doorway environment detection information to form a doorway influence value includes the following steps:
and step S531, according to the entrance and exit environment detection information, retrieving unmanned aerial vehicle flight strategy information and an unmanned aerial vehicle infrared distance detection value corresponding to the entrance and exit environment detection information.
The unmanned aerial vehicle unit flight strategy information refers to the flight strategy information when the unmanned aerial vehicle unit flies to the entrance/exit, and is inquired and obtained from a database storing the unmanned aerial vehicle flight strategy information. A group of unmanned aerial vehicle units is defined as a single unmanned mother machine together with a plurality of unmanned son machines; sensors such as image and temperature sensors are preset on the unmanned mother machine, infrared detection sensors are preset on the unmanned son machines, and the mother machine and the son machines are in communication connection through Bluetooth or a similar communication mode. The unmanned aerial vehicle infrared distance detection value refers to the distance detection value obtained by infrared distance detection when the unmanned aerial vehicle flies to the entrance/exit, and is detected and obtained through an infrared distance detection device preset on the unmanned aerial vehicle.
The unmanned aerial vehicle flight strategy information and the unmanned aerial vehicle infrared distance detection value corresponding to the entrance and exit environment detection information are called, so that the unmanned aerial vehicle flight strategy information and the unmanned aerial vehicle infrared distance detection value can be conveniently used subsequently.
Step S532, according to the matching result of the unmanned aerial vehicle unit flight strategy information and the preset flight reference strategy information, the initial detection information of the entrance and the exit is analyzed and obtained.
The flight reference strategy information refers to reference flight strategy information when the unmanned aerial vehicle unit flies, and the flight reference strategy information carries out multiple test flights on the unmanned aerial vehicle unit and records and stores entrance and exit information under the flight strategy. The entrance initial detection information is detection information after initial detection of the entrance.
Matching the flight strategy information of the unmanned aerial vehicle unit with preset flight reference strategy information, and taking entrance and exit information corresponding to the flight reference strategy information result closest to the flight strategy information of the unmanned aerial vehicle unit as entrance and exit initial detection information.
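The nearest-match step just described can be sketched as follows; the numeric encoding of a flight strategy and the Euclidean closeness measure are assumptions for illustration, not details fixed by the patent:

```python
def match_entrance_info(unit_strategy, reference_table):
    """Return the entrance/exit information recorded for the flight
    reference strategy closest to the unit's actual flight strategy.

    `unit_strategy` is a numeric feature tuple (e.g. speed, heading,
    altitude) and `reference_table` maps reference strategy tuples,
    recorded during test flights, to entrance/exit information.
    """
    def dist(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
    best = min(reference_table, key=lambda ref: dist(unit_strategy, ref))
    return reference_table[best]
```

The returned entrance/exit information serves as the entrance/exit initial detection information of step S532.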
In step S533, according to the corresponding relationship between the infrared distance detection value of the unmanned aerial vehicle and the preset entrance detection adjustment information, the entrance detection adjustment information corresponding to the infrared distance detection value of the unmanned aerial vehicle is obtained by analysis.
The entrance detection adjustment information is adjustment information for adjusting initial detection information of the entrance, and the entrance detection adjustment information is obtained by inquiring a database storing the entrance detection adjustment information.
The entrance detection adjustment information is obtained through unmanned aerial vehicle infrared distance detection value analysis, so that the follow-up use of the entrance detection adjustment information is facilitated.
Step S534, analyzing and obtaining the entrance detection correction information corresponding to the entrance initial detection information and the entrance detection adjustment information according to the correspondence between the entrance initial detection information, the entrance detection adjustment information and the preset entrance detection correction information.
The entrance detection correction information is detection information obtained by correcting the initial detection information of the entrance, and the entrance detection correction information is obtained by searching a database storing the entrance detection correction information.
And the entrance detection correction information is obtained through the analysis of the entrance initial detection information and the entrance detection adjustment information, so that the follow-up use of the entrance detection correction information is facilitated.
In step S535, the entrance/exit detection correction information is analyzed according to the preset entrance/exit deviation influence analysis method to form an entrance/exit deviation influence value, and the entrance/exit deviation influence value is used as the entrance/exit influence value.
The entrance/exit deviation influence analysis method is an analysis method for analyzing the degree of influence caused by the deviation between the detected entrance/exit and the reference entrance/exit, and is obtained by inquiring a database storing the entrance/exit deviation influence analysis method. The entrance/exit deviation influence value is a degree value indicating the influence caused by the deviation between the detected entrance/exit and the reference entrance/exit.
And analyzing and processing the entrance detection correction information by using an entrance deviation influence analysis method to form an entrance deviation influence value, and taking the entrance deviation influence value as an entrance influence value to improve the accuracy of the acquired entrance influence value.
In step S531 shown in fig. 3, in order to further ensure the rationality of acquiring the flight policy information of the unmanned aerial vehicle, further individual analysis and calculation of the execution method when acquiring the flight policy information of the unmanned aerial vehicle are required, and specifically, the steps shown in fig. 4 will be described in detail.
Referring to fig. 4, the execution method when acquiring flight strategy information of the unmanned aerial vehicle comprises the following steps:
in step S5311, an unmanned plane infrared distance detection value is obtained.
The infrared distance detection value of the unmanned aerial vehicle is acquired, so that the subsequent use of the infrared distance detection value of the unmanned aerial vehicle is facilitated.
Step S5312, analyzing and calculating the difference between the unmanned aerial vehicle infrared distance detection value and the unmanned aerial vehicle infrared distance reference value according to the unmanned aerial vehicle infrared distance detection value and the preset unmanned aerial vehicle infrared distance reference value, and taking the difference as the unmanned aerial vehicle infrared distance deviation value.
The unmanned aerial vehicle infrared distance reference value refers to a reference distance value of unmanned aerial vehicle infrared distance detection, and the unmanned aerial vehicle infrared distance reference value is obtained by inquiring a database storing the unmanned aerial vehicle infrared distance reference value. The unmanned aerial vehicle infrared distance deviation value refers to a deviation value when the infrared detection distance of the unmanned aerial vehicle deviates from the reference distance.
The difference value between the infrared distance detection value of the unmanned aerial vehicle and the infrared distance reference value of the unmanned aerial vehicle is analyzed and calculated through the infrared distance detection value of the unmanned aerial vehicle and the preset infrared distance reference value of the unmanned aerial vehicle, and the difference value between the infrared distance detection value of the unmanned aerial vehicle and the infrared distance reference value of the unmanned aerial vehicle is used as the infrared distance deviation value of the unmanned aerial vehicle, so that the infrared distance deviation value of the unmanned aerial vehicle can be conveniently used subsequently.
Step S5313, analyzing and acquiring the unmanned aerial vehicle position adjustment information corresponding to the unmanned aerial vehicle infrared distance deviation value according to the corresponding relation between the unmanned aerial vehicle infrared distance deviation value and the preset unmanned aerial vehicle position adjustment information, and outputting the unmanned aerial vehicle position adjustment information.
The unmanned aerial vehicle position adjustment information refers to control information for controlling adjustment of the position of the unmanned aerial vehicle, and the unmanned aerial vehicle position adjustment information is obtained by inquiring a database storing the unmanned aerial vehicle position adjustment information.
The unmanned aerial vehicle position adjustment information is obtained through unmanned aerial vehicle infrared distance deviation value analysis, and the unmanned aerial vehicle position adjustment information is output, so that the position of the unmanned aerial vehicle is adjusted, and the position of the unmanned aerial vehicle is located at a position when the unmanned aerial vehicle infrared distance detection value is consistent with the unmanned aerial vehicle infrared distance reference value.
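Steps S5311 to S5313 above can be sketched as follows. This is a minimal illustrative sketch: the reference distance and the deviation-to-adjustment rule are assumptions for demonstration, whereas the patent obtains both from stored databases.

```python
# Illustrative sketch of steps S5311-S5313. The reference value and the
# deviation-to-adjustment rule below are assumptions for demonstration;
# the patent obtains them from stored databases.
IR_DISTANCE_REFERENCE = 5.0  # assumed reference distance, in metres

def position_adjustment(ir_distance_detected: float) -> str:
    # S5312: deviation = detection minus reference
    deviation = ir_distance_detected - IR_DISTANCE_REFERENCE
    # S5313: map the deviation to position adjustment information
    if abs(deviation) < 0.1:
        return "hold position"
    direction = "forward" if deviation > 0 else "backward"
    return f"move {direction} {abs(deviation):.1f} m"

print(position_adjustment(6.5))  # prints "move forward 1.5 m"
```

The goal of the adjustment loop is that the detected value converges to the reference value, at which point the drone holds position.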
Step S5314, obtaining a plurality of unmanned mother-son machine distance detection values between a single unmanned mother machine and a plurality of unmanned son machines.
The unmanned mother-son machine distance detection value refers to a distance detection value between the unmanned mother machine and an unmanned son machine, and is obtained by converting the time spent in the signal transmission process, as measured by a signal receiving device preset on the unmanned mother machine.
The distance detection values of the plurality of unmanned mother-son machines between the single unmanned mother machine and the plurality of unmanned son machines are obtained, so that the follow-up use of the distance detection values of the plurality of unmanned mother-son machines is facilitated.
In step S5315, according to the plurality of unmanned mother-son machine distance detection values, the average value of the plurality of unmanned mother-son machine distance detection values is analyzed and calculated, and the average value is used as the unmanned mother-son machine average distance value.
The unmanned mother-son machine average distance value refers to the average distance between the unmanned mother machine and the unmanned son machines. The average value of the plurality of unmanned mother-son machine distance detection values is analyzed and calculated through the plurality of unmanned mother-son machine distance detection values and used as the unmanned mother-son machine average distance value, so that the subsequent use of the unmanned mother-son machine average distance value is convenient.
In step S5316, according to the plurality of unmanned mother-son machine distance detection values and the unmanned mother-son machine average distance value, the differences between the plurality of unmanned mother-son machine distance detection values and the unmanned mother-son machine average distance value are analyzed and calculated, and are used as the unmanned mother-son machine distance adjustment values.
The distance adjustment value of the unmanned mother-son machine refers to a deviation value between a distance detection value and a distance average value between the unmanned mother machine and the unmanned son machine. The difference value between the plurality of unmanned mother-son machine distance detection values and the unmanned mother-son machine average distance value is analyzed and calculated, and the difference value between the plurality of unmanned mother-son machine distance detection values and the unmanned mother-son machine average distance value is used as an unmanned mother-son machine distance adjustment value, so that the follow-up use of the unmanned mother-son machine distance adjustment value is facilitated.
Step S5317, according to the corresponding relation between the unmanned mother-son machine distance adjustment value and the preset unmanned mother-son machine distance adjustment information, analyzing and obtaining the unmanned mother-son machine distance adjustment information corresponding to the unmanned mother-son machine distance adjustment value, and outputting the unmanned mother-son machine distance adjustment information.
The unmanned mother-son machine distance adjustment information is used for controlling the unmanned mother machine to adjust the distance between the unmanned mother machine and the unmanned son machine, and the unmanned mother-son machine distance adjustment information is obtained by inquiring a database storing the unmanned mother-son machine distance adjustment information.
The unmanned mother-son machine distance adjustment information is obtained through analysis of the unmanned mother-son machine distance adjustment value, and the unmanned mother-son machine distance adjustment information is output, so that the distance between the unmanned mother machine and the unmanned son machine is adjusted, the flight strategy of the unmanned mother machine and the unmanned son machine when passing through the entrance corresponds to the entrance, and the accuracy of the unmanned machine set flight strategy information obtained at the moment is improved.
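Steps S5314 to S5317 can be sketched as follows, assuming the distance detections are already available as numbers; mapping the adjustment value to concrete control information (S5317) is database-driven in the patent and is simplified here to the signed offset itself.

```python
# Illustrative sketch of steps S5315-S5316: average the mother-son distance
# detections, then derive a per-son-machine adjustment value as the
# difference between each detection and the mean.
def distance_adjustments(detections):
    mean = sum(detections) / len(detections)      # S5315
    return mean, [d - mean for d in detections]   # S5316

mean, adjustments = distance_adjustments([10.0, 12.0, 14.0])
# mean is 12.0; adjustments are [-2.0, 0.0, 2.0]
```

A son machine with a positive adjustment value is further away than the formation average and would be commanded to close the gap, and vice versa.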
In step S535 shown in fig. 3, in order to further ensure the rationality of the entrance deviation influence value, further individual analysis and calculation of the entrance deviation influence value is required; specifically, the steps shown in fig. 5 will be described in detail.
Referring to fig. 5, according to a preset entrance deviation influence analysis method, the method for analyzing the entrance detection correction information to form an entrance deviation influence value includes the following steps:
step S5351, retrieving the entrance area detection value and the entrance shape detection information corresponding to the entrance detection correction information based on the entrance detection correction information.
The entrance detection correction information includes entrance detection information related to an entrance such as an entrance area and an entrance shape, and the entrance area detection value is an area detection value indicating an entrance, and the entrance area detection value is obtained by searching a database storing the entrance area detection value. The entrance shape detection information is shape information indicating entrance detection, and the entrance shape detection information is obtained by searching a database storing the entrance shape detection information.
The entrance area detection value and the entrance shape detection information are called through the entrance detection correction information, so that the entrance area detection value and the entrance shape detection information are convenient to use subsequently.
Step S5352, analyzing and calculating the difference between the entrance area detection value and the entrance area reference value as the entrance area deviation value according to the entrance area detection value and the preset entrance area reference value.
The entrance area reference value is an area reference value indicating the entrance, and is obtained by searching a database storing the entrance area reference value. The entrance area deviation value is a deviation value indicating that the detected area of the entrance deviates from the reference area.
And analyzing and calculating the difference value between the entrance area detection value and the entrance area reference value through the entrance area detection value and the preset entrance area reference value, and taking the difference value between the entrance area detection value and the entrance area reference value as an entrance area deviation value, so that the follow-up use of the entrance area deviation value is convenient.
In step S5353, according to the correspondence between the entrance area deviation value and the preset entrance area influence value, the entrance area influence value corresponding to the entrance area deviation value is obtained by analysis.
The entrance area influence value is an influence degree value indicating that the area of an entrance influences the traveling of subsequent rescue workers, and is obtained by inquiring a database storing the entrance area influence value.
The entrance area influence value is obtained through entrance area deviation value analysis, so that the follow-up use of the entrance area influence value is facilitated.
Step S5354, analyzing and acquiring deviation information between the entrance shape detection information and the entrance shape reference information according to the entrance shape detection information and the preset entrance shape reference information, and taking the deviation information as entrance shape deviation information.
The entrance shape reference information is reference shape information indicating an entrance, and the entrance shape reference information is acquired by searching a database storing the entrance shape reference information. The entrance shape deviation information is deviation information indicating that there is a deviation between the detected shape of the entrance and the reference shape.
The deviation information between the entrance shape detection information and the entrance shape reference information is analyzed and obtained through the entrance shape detection information and the preset entrance shape reference information, and this deviation information is used as the entrance shape deviation information, so that the entrance shape deviation information can be conveniently used subsequently.
Step S5355, analyzing and obtaining the entrance shape influence value corresponding to the entrance shape deviation information according to the corresponding relation between the entrance shape deviation information and the preset entrance shape influence value.
The entrance shape influence value is a value indicating the influence degree of the entrance shape on the travel of the following rescue workers, and is obtained by inquiring from a database storing the entrance shape influence value.
The influence value of the shape of the entrance is obtained through the analysis of the deviation information of the shape of the entrance, so that the influence value of the shape of the entrance is convenient to use subsequently.
In step S5356, according to the correspondence between the entrance area influence value, the entrance shape influence value and the preset entrance comprehensive influence value, the entrance comprehensive influence value corresponding to the entrance area influence value and the entrance shape influence value is obtained by analysis, and the entrance comprehensive influence value is used as the entrance deviation influence value.
The entrance and exit comprehensive influence value is a comprehensive influence value indicating that an entrance affects the traveling of subsequent rescue workers, and the entrance and exit comprehensive influence value is inquired and obtained from a database storing the entrance and exit comprehensive influence value.
The comprehensive influence value of the entrance is obtained by analyzing the influence value of the area of the entrance and the influence value of the shape of the entrance, and the comprehensive influence value of the entrance is used as the influence value of the deviation of the entrance, so that the obtained influence value of the deviation of the entrance is influenced by the area of the entrance and the shape of the entrance, and the accuracy of the obtained influence value of the deviation of the entrance is improved.
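Steps S5351 to S5356 can be sketched as follows. The thresholds, influence values and the combination rule (a plain sum) are illustrative assumptions; the patent reads each correspondence from a stored database.

```python
# Illustrative sketch of steps S5351-S5356. All mappings below are assumed
# for demonstration; the patent obtains them from stored databases.
def area_influence(area_deviation: float) -> int:
    # S5353: assumed mapping from area deviation to influence value
    return 1 if abs(area_deviation) < 0.5 else 2

def shape_influence(shape_deviation: str) -> int:
    # S5355: assumed mapping from shape deviation to influence value
    return {"none": 0, "minor": 1, "major": 2}.get(shape_deviation, 2)

def entrance_deviation_influence(area_detected, area_reference, shape_deviation):
    area_dev = area_detected - area_reference  # S5352
    # S5356: combine area and shape influence into the comprehensive value
    return area_influence(area_dev) + shape_influence(shape_deviation)
```

For example, a small area deviation combined with a minor shape deviation would yield a lower comprehensive influence value than a large area deviation with a major shape deviation.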
In step S600 shown in fig. 1, in order to further ensure the rationality of the fire scene environment influence value, further individual analysis and calculation of the fire scene environment influence value is required; specifically, the detailed description will be given by the steps shown in fig. 6.
Referring to fig. 6, the analyzing process of the fire scene environment detection information according to the preset fire scene environment influence analyzing method to form the fire scene environment influence value includes the steps of:
step S610, retrieving the ambient temperature detection information and the trapped person detection information corresponding to the fire scene environment detection information according to the fire scene environment detection information.
The environmental temperature detection information is detection information for detecting the environmental temperature in the fire scene, and the environmental temperature detection information is obtained by inquiring from a database storing the environmental temperature detection information. The trapped person detection information is detection information for detecting trapped persons in a fire scene, and the trapped person detection information is obtained by inquiring from a database storing the trapped person detection information.
The environment temperature detection information and the trapped person detection information are called through the fire scene environment detection information, so that the environment temperature detection information and the trapped person detection information can be conveniently used later.
In step S620, according to the preset environmental temperature influence analysis method, the environmental temperature detection information is analyzed to form an environmental temperature influence value.
The environmental temperature influence analysis method is used for analyzing the influence degree of the environmental temperature in the fire scene on the travel of the rescue workers, and is obtained by inquiring a database storing the environmental temperature influence analysis method. The environmental temperature influence value is an influence degree value of influence of the environmental temperature in the fire scene on the travel of the rescue workers.
The environmental temperature detection information is analyzed and processed through the environmental temperature influence analysis method, so that an environmental temperature influence value is formed, and the environmental temperature influence value is convenient to use subsequently.
Step S630, analyzing the trapped person detection information and the traveling initial route information according to a preset trapped person influence analysis method to form a trapped person influence value.
The trapped person influence analysis method is an analysis method for analyzing the influence degree of trapped persons in a fire scene on the travelling of rescue workers, the trapped person influence analysis method is obtained by inquiring a database storing the trapped person influence analysis method, and the trapped person influence value is an influence degree value of the trapped persons in the fire scene on the travelling of the rescue workers.
The trapped person detection information and the traveling initial route information are analyzed and processed through the trapped person influence analysis method, so that a trapped person influence value is formed, and the subsequent use of the trapped person influence value is facilitated.
Step S640, according to the corresponding relation between the environmental temperature influence value, the trapped person influence value and the preset environmental comprehensive influence value, analyzing and obtaining the environmental comprehensive influence value corresponding to the environmental temperature influence value and the trapped person influence value, and taking the environmental comprehensive influence value as the fire scene environmental influence value.
The comprehensive environmental impact value refers to an impact degree value of impact of the environment in the fire scene on the advancing of the rescue workers, and is obtained by inquiring a database storing the comprehensive environmental impact value.
The environmental comprehensive influence value is obtained through analysis of the environmental temperature influence value and the influence value of trapped personnel, and is used as the fire scene environmental influence value, so that the obtained fire scene environmental influence value is influenced by the environmental temperature in a fire scene and the influence of the trapped personnel, and the accuracy of the obtained fire scene environmental influence value is improved.
In step S620 shown in fig. 6, in order to further ensure the rationality of the environmental temperature influence value, further individual analysis calculation of the environmental temperature influence value is required, specifically, the detailed description will be given by the steps shown in fig. 7.
Referring to fig. 7, the method for analyzing the environmental temperature detection information according to the preset environmental temperature influence analysis method to form an environmental temperature influence value includes the following steps:
step S621, a high temperature center position point, a high temperature center temperature value and a low temperature coverage area corresponding to the ambient temperature detection information are retrieved according to the ambient temperature detection information.
The high-temperature center position point refers to the position point where the highest temperature is located in the fire scene, and the high-temperature center position point is obtained by inquiring a database storing the high-temperature center position point. The high-temperature center temperature value refers to the temperature value of the highest temperature in the fire scene, and the high-temperature center temperature value is obtained by inquiring a database storing the high-temperature center temperature value. The low-temperature coverage area refers to the area covered by the fire at a low temperature, and is obtained by inquiring a database storing the low-temperature coverage area. The high-temperature center temperature value, the high-temperature center position point and the low-temperature coverage area are all detected and acquired by an infrared thermal imaging device preset on the unmanned aerial vehicle and then stored.
The high-temperature center position point, the high-temperature center temperature value and the low-temperature coverage area are retrieved through the environmental temperature detection information, so that the high-temperature center position point, the high-temperature center temperature value and the low-temperature coverage area are conveniently used in the follow-up process.
In step S622, the furthest distance between the high-temperature center position point and the low-temperature coverage area is analyzed and calculated according to the high-temperature center position point and the low-temperature coverage area, and is taken as the high-temperature coverage distance actual value.
Wherein, the high-temperature coverage distance actual value refers to the actual distance value of high-temperature coverage in the fire. The furthest distance value between the high-temperature center position point and the low-temperature coverage area is analyzed and calculated through the high-temperature center position point and the low-temperature coverage area, and the furthest distance value between the high-temperature center position point and the low-temperature coverage area is taken as the high-temperature coverage distance actual value, thereby being convenient for the subsequent use of the high-temperature coverage distance actual value.
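Step S622 can be sketched as follows, assuming the low-temperature coverage area is represented by a set of 2-D boundary points; the furthest Euclidean distance from the high-temperature center position point to those points is taken as the actual coverage distance. The point representation and the coordinates are illustrative assumptions.

```python
# Illustrative sketch of step S622: the high-temperature coverage distance
# actual value is the furthest Euclidean distance from the high-temperature
# center position point to the low-temperature coverage area boundary.
import math

def hot_coverage_actual_distance(center, low_temp_boundary):
    return max(math.dist(center, p) for p in low_temp_boundary)

d = hot_coverage_actual_distance((0.0, 0.0), [(3.0, 4.0), (1.0, 1.0), (0.0, 2.0)])
# d is 5.0
```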
In step S623, the high-temperature coverage distance reference value corresponding to the high-temperature center temperature value is retrieved according to the high-temperature center temperature value.
The high-temperature coverage distance reference value refers to the reference distance value that can be covered by the highest temperature in the fire scene, and the high-temperature coverage distance reference value is obtained by inquiring from a database storing the high-temperature coverage distance reference value.
The high-temperature coverage distance reference value is retrieved through the high-temperature center temperature value, so that the high-temperature coverage distance reference value can be conveniently used later.
In step S624, the difference between the high-temperature coverage distance reference value and the high-temperature coverage distance actual value is analyzed and calculated according to the high-temperature coverage distance reference value and the high-temperature coverage distance actual value, and is used as the temperature change moving distance value.
The temperature change moving distance value refers to the distance value moved when the temperature in the fire scene changes. The difference between the high-temperature coverage distance reference value and the high-temperature coverage distance actual value is analyzed and calculated through the high-temperature coverage distance reference value and the high-temperature coverage distance actual value, and the difference between the high-temperature coverage distance reference value and the high-temperature coverage distance actual value is used as the temperature change moving distance value, so that the subsequent use of the temperature change moving distance value is convenient.
Step S625, performing analysis processing on the low-temperature coverage area according to the preset object matching method to form low-temperature area object type information and low-temperature area object position points.
The object matching method is an analysis method for carrying out matching recognition on objects in a fire scene, and is obtained by inquiring a database storing the object matching method. The low-temperature area object type information refers to type information of an object in a low-temperature area in a fire scene, and the low-temperature area object position point refers to a position point of the object in the low-temperature area in the fire scene.
The low-temperature coverage area is analyzed and processed through the object matching method, so that low-temperature area object type information and low-temperature area object position points are formed, and the low-temperature area object type information and the low-temperature area object position points are conveniently used subsequently. The object matching method specifically comprises the steps of identifying the object type of an image corresponding to a low-temperature coverage area, acquiring object type information of the low-temperature area, determining the position of an object in an identified fire scene through images corresponding to a plurality of low-temperature coverage areas, and outputting object position points of the low-temperature area.
In step S626, according to the correspondence between the low-temperature area object type information and the preset object type influence value, the object type influence value corresponding to the low-temperature area object type information is obtained.
The object type influence value refers to an influence degree value of influence of objects in a low-temperature area in a fire scene on the advance of rescue workers, and the object type influence value is inquired and obtained from a database storing the object type influence value.
The object type influence value is obtained through analysis of the object type information in the low-temperature area, so that the object type influence value can be conveniently used subsequently.
Step S627, according to the object position point in the low temperature area and the high temperature center position point, analyzing and calculating the distance value between the object position point in the low temperature area and the high temperature center position point and using the distance value as the object high temperature center distance value.
The object high-temperature center distance value refers to a distance value between an object in a low-temperature area in a fire scene and a high-temperature center, the distance value between the object position point in the low-temperature area and the high-temperature center position point is analyzed and calculated through the object position point in the low-temperature area and the high-temperature center position point, and the distance value between the object position point in the low-temperature area and the high-temperature center position point is taken as the object high-temperature center distance value, so that the object high-temperature center distance value is convenient to use subsequently.
In step S628, the difference between the object high temperature center distance value and the high temperature coverage distance reference value is analyzed and calculated as the object position deviation value according to the object high temperature center distance value and the high temperature coverage distance reference value.
The object position deviation value refers to a deviation value when deviation exists between the distance between an object in a low-temperature area in a fire scene and a high-temperature center and a reference distance which can be covered by the high-temperature center, the difference value between the object high-temperature center distance value and the high-temperature covering distance reference value is analyzed and calculated through the object high-temperature center distance value and the high-temperature covering distance reference value, and the difference value between the object high-temperature center distance value and the high-temperature covering distance reference value is used as the object position deviation value, so that the object position deviation value can be conveniently used subsequently.
In step S629, the temperature change moving distance value, the object type influence value and the object position deviation value are analyzed according to the preset environmental temperature influence value calculation formula to form an environmental temperature influence value.
The environmental temperature influence value refers to a comprehensive degree-of-influence value of the influence of the temperature and the objects in the fire scene on the travel of rescue workers; the environmental temperature influence value calculation formula refers to a calculation formula for analyzing and calculating the environmental temperature influence value, and is obtained by inquiring from a database storing the environmental temperature influence value calculation formula.
The calculation formula of the environmental temperature influence value is: B = d(vB1) + eB2 + f(wB3), where B is the environmental temperature influence value, d is the weight of the temperature change moving distance value, v is the proportionality factor of the temperature change moving distance value, B1 is the temperature change moving distance value, e is the weight of the object type influence value, B2 is the object type influence value, f is the weight of the object position deviation value, w is the proportionality factor of the object position deviation value, B3 is the object position deviation value, and d + e + f = 1.
For example, when d = 0.3, v = 0.1, B1 = 2, e = 0.5, B2 = 0.6, f = 0.2, w = 0.1 and B3 = 3, then B = d(vB1) + eB2 + f(wB3) = 0.06 + 0.30 + 0.06 = 0.42.
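The environmental temperature influence formula can be rendered directly in code; the inputs below reproduce the worked example in the text.

```python
# Direct rendering of B = d(vB1) + eB2 + f(wB3); the weights must sum to 1.
def ambient_temperature_influence(d, v, B1, e, B2, f, w, B3):
    assert abs(d + e + f - 1.0) < 1e-9, "weights d, e, f must sum to 1"
    return d * (v * B1) + e * B2 + f * (w * B3)

# Inputs from the worked example in the text:
B = ambient_temperature_influence(d=0.3, v=0.1, B1=2, e=0.5, B2=0.6, f=0.2, w=0.1, B3=3)
# B is approximately 0.42
```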
In step S630 shown in fig. 6, in order to further ensure the rationality of the influence value of the trapped person, further individual analysis calculation of the influence value of the trapped person is required, specifically, the detailed description will be given by the steps shown in fig. 8.
Referring to fig. 8, the method for analyzing the trapped person detection information and the traveling initial route information according to the preset trapped person influence analysis method to form the trapped person influence value includes the following steps:
step S631, retrieving the trapped person value and the trapped person position point corresponding to the trapped person detection information according to the trapped person detection information.
The trapped person value refers to the number of trapped persons in a fire scene, and the trapped person value is obtained by inquiring a database storing the trapped person value. The trapped person position points refer to the position points of trapped persons, and the trapped person position points are obtained by inquiring a database storing the trapped person position points. The trapped person detection information includes detection information related to the number of trapped persons, the position, and the like.
The trapped person numerical value and the trapped person position point are inquired and obtained through the trapped person detection information, so that the subsequent use of the trapped person numerical value and the trapped person position point is convenient.
In step S632, the distance value between the trapped person position points is analyzed and calculated according to the trapped person position points and is used as the trapped person distance value.
The trapped person distance value refers to the distance value between trapped persons. The distance value between the trapped person position points is analyzed and calculated through the trapped person position points, and the distance value between the trapped person position points is used as the trapped person distance value, so that the subsequent use of the trapped person distance value is facilitated.
Step S633, analyzing and calculating the distance value between the trapped person position point and the traveling initial route information according to the trapped person position point and the traveling initial route information, and taking the distance value as the trapped person route distance value.
The trapped person route distance value refers to a distance value between each trapped person and an initial route traveled by the rescue personnel, the distance value between the trapped person position point and the initial route traveled information is analyzed and calculated through the trapped person position point and the initial route traveled information, and the distance value between the trapped person position point and the initial route traveled information is used as the trapped person route distance value, so that the subsequent use of the trapped person route distance value is facilitated.
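Step S633 can be sketched as follows, assuming the traveling initial route information is a polyline of waypoints and the trapped person route distance value is the minimum Euclidean distance from the person's position point to any route segment. The route representation and the coordinates are illustrative assumptions.

```python
# Illustrative sketch of step S633: minimum distance from a trapped person's
# position point to a polyline route.
import math

def point_segment_distance(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:          # degenerate segment: a single point
        return math.dist(p, a)
    # clamp the projection of p onto the segment to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def route_distance(person, route):
    return min(point_segment_distance(person, route[k], route[k + 1])
               for k in range(len(route) - 1))

d = route_distance((0.0, 3.0), [(-5.0, 0.0), (5.0, 0.0), (5.0, 10.0)])
# d is 3.0
```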
In step S634, according to the trapped person position points and the low temperature area object position points, the distance value between them is analyzed and calculated and taken as the trapped person object distance value.
The trapped person object distance value refers to the distance between each trapped person and the objects in the low temperature area of the fire scene; calculating it from the trapped person position points and the low temperature area object position points facilitates its subsequent use.
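Steps S632 to S634 reduce to planar distance computations. A minimal Python sketch, assuming each position point is a 2D coordinate pair and the route is given as a polyline of waypoints (both assumptions; the patent does not fix a coordinate representation):

```python
import math

def pairwise_distances(points):
    """Step S632: distance value between each pair of trapped-person position points."""
    return [math.dist(p, q) for i, p in enumerate(points) for q in points[i + 1:]]

def point_to_polyline(point, polyline):
    """Steps S633/S634: shortest distance from a position point to a route or
    object path, with the route approximated by its straight segments."""
    def seg_dist(p, a, b):
        ax, ay = a
        bx, by = b
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:  # degenerate segment
            return math.dist(p, a)
        # project the point onto the segment, clamping the parameter to [0, 1]
        t = ((p[0] - ax) * dx + (p[1] - ay) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))
        return math.dist(p, (ax + t * dx, ay + t * dy))
    return min(seg_dist(point, a, b) for a, b in zip(polyline, polyline[1:]))
```

For example, `point_to_polyline((5, 2), [(0, 0), (10, 0)])` returns 2.0, the perpendicular distance from the person at (5, 2) to a straight route along the x-axis.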
In step S635, the trapped person value, the trapped person distance value, the trapped person route distance value and the trapped person object distance value are analyzed and calculated according to a preset trapped person influence value calculation formula to form the trapped person influence value.
The trapped person influence value refers to the comprehensive degree of influence of the trapped persons on the advance of the rescue personnel, and its calculation formula is obtained by querying the database in which the formula is stored. The trapped person influence value calculation formula is: C = g·(r·C2/C1) + h·(s·C3) + i·(t·C4), where C is the trapped person influence value; C1 is the number of trapped persons; C2 is the trapped person distance value; the term r·C2/C1 measures trapped person density, with g its weight and r its distance scale factor; C3 is the trapped person route distance value, with h its weight and s its scale factor; C4 is the trapped person object distance value, with i its weight and t its scale factor; and g + h + i = 1.
For example, when g = 0.4, r = 10, C2 = 0.1, C1 = 2, h = 0.1, s = 0.1, C3 = 4, i = 0.5, t = 0.1 and C4 = 2, then C = g·(r·C2/C1) + h·(s·C3) + i·(t·C4) = 0.34.
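The worked example above can be reproduced directly. A minimal Python sketch of the trapped person influence value formula (the function and keyword-argument names are illustrative, not from the patent):

```python
def trapped_person_influence(C1, C2, C3, C4, *, g, h, i, r, s, t):
    """Trapped-person influence value C = g*(r*C2/C1) + h*(s*C3) + i*(t*C4).
    g, h, i are weights with g + h + i = 1; r, s, t are scale factors."""
    assert abs(g + h + i - 1.0) < 1e-9, "weights must sum to 1"
    return g * (r * C2 / C1) + h * (s * C3) + i * (t * C4)

# Example from the description:
# C = 0.4*(10*0.1/2) + 0.1*(0.1*4) + 0.5*(0.1*2) = 0.2 + 0.04 + 0.1 = 0.34
C = trapped_person_influence(2, 0.1, 4, 2, g=0.4, h=0.1, i=0.5, r=10, s=0.1, t=0.1)
```

Dividing the distance term by the head count C1 makes the first term grow with crowding, which is why the patent calls it a density term.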
In step S700 shown in fig. 1, in order to further ensure that the travel route final information is reasonable, it must be further analyzed and calculated separately; this is described in detail through the steps shown in fig. 9.
Referring to fig. 9, analyzing and processing the travel initial route information, the travel route difficulty level value and the fire scene environment influence value according to a preset travel route selection analysis method to form the travel route final information includes the following steps:
step S710, the number of rescue personnel is obtained.
The number of rescue personnel indicates how many rescue workers enter the fire scene at the current time. It can be obtained by analysis after detection by positioning detection devices preset on the rescue workers, or by input from an external commander.
Step S720, the travel route difficulty level value and the fire scene environment influence value are analyzed and calculated according to a preset travel route influence value calculation formula to form the travel route influence value.
The travel route influence value refers to the comprehensive degree of influence of the various factors on the advance of the rescue personnel, and its calculation formula is obtained by querying the database in which the formula is stored. The travel route influence value calculation formula is: Z = j·A + k·B + l·C, where Z is the travel route influence value; A is the travel route difficulty level value and j its weight; B is the environmental temperature influence value and k its weight; C is the trapped person influence value and l its weight; j + k + l = 1, and l > k > j.
For example, when j = 0.2, A = 0.33, k = 0.3, B = 0.42, l = 0.5 and C = 0.34, then Z = j·A + k·B + l·C = 0.362.
Step S730, forward sorting is performed on the travel route influence values, and the travel initial route information corresponding to the top travel route influence values, as many as the number of rescue personnel, is selected as the travel route final information.
The forward sorting is defined as sorting the travel route influence values in descending order. Because l > k > j, this selection raises the influence of the trapped person influence value on the travel route final information and lowers that of the travel route difficulty level value and the environmental temperature influence value, improving the accuracy of the travel route final information.
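Steps S720 and S730 together can be sketched as follows in Python (a minimal illustration; the route labels and function names are assumptions):

```python
def travel_route_influence(A, B, C, *, j, k, l):
    """Step S720: Z = j*A + k*B + l*C, with j + k + l = 1 and l > k > j,
    so the trapped-person term dominates the weighting."""
    assert abs(j + k + l - 1.0) < 1e-9 and l > k > j
    return j * A + k * B + l * C

def select_final_routes(routes, influence_values, n_rescuers):
    """Step S730: 'forward sorting' = descending order of influence value;
    keep the first n_rescuers routes as the travel-route final information."""
    ranked = sorted(zip(influence_values, routes),
                    key=lambda pair: pair[0], reverse=True)
    return [route for _, route in ranked[:n_rescuers]]

# Example from the description: Z = 0.2*0.33 + 0.3*0.42 + 0.5*0.34 = 0.362
Z = travel_route_influence(0.33, 0.42, 0.34, j=0.2, k=0.3, l=0.5)
final = select_final_routes(["route-1", "route-2", "route-3"], [0.20, Z, 0.30], 2)
```

With two rescuers, the two highest-scoring initial routes are kept, one per rescuer, matching the selection rule of step S730.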
Referring to fig. 10, based on the same inventive concept, an embodiment of the present invention provides an intelligent visual internal attack search and rescue system, including:
the acquisition module 1 is used for acquiring the current position point of a rescue worker, the fire scene environment detection information, the unmanned son machine infrared distance detection value and the unmanned mother-son machine distance detection value;
a memory 2, configured to store a program of the intelligent visual internal attack search and rescue control method according to any one of fig. 1 to 9;
the processor 3, configured to load and execute the program in the memory so as to implement the intelligent visual internal attack search and rescue control method according to any one of fig. 1 to 9.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division into the functional modules described above is illustrated; in practical application, the functions may be allocated to different functional modules as needed, i.e. the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above. For the specific working processes of the systems, devices and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The foregoing description of the preferred embodiments of the present application is not intended to limit its scope. Unless expressly stated otherwise, any feature disclosed in this specification (including the abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose; that is, each feature is only one example of a generic series of equivalent or similar features.

Claims (9)

1. An intelligent visual internal attack search and rescue control method is characterized by comprising the following steps:
acquiring the current position point of a rescue worker;
according to the current position point of the rescue personnel, house structure information corresponding to the current position point of the rescue personnel is called;
analyzing and processing the current position point of the rescue personnel and house structure information according to a preset house travel route analysis method to form travel initial route information;
detecting and acquiring fire scene environment detection information along the traveling initial route through a plurality of unmanned aerial vehicle groups based on the traveling initial route information;
analyzing and processing the traveling initial route information and the fire scene environment detection information according to a preset route difficulty level analysis method to form a traveling route difficulty level value;
analyzing and processing the fire scene environment detection information according to a preset fire scene environment influence analysis method to form a fire scene environment influence value;
according to a preset travel route selection analysis method, the travel initial route information, the travel route difficulty level value and the fire scene environment influence value are analyzed and processed to form travel route final information, and the travel route final information and fire scene environment detection information are output;
the method for analyzing the travel initial route information and the fire scene environment detection information according to the preset route difficulty level analysis method to form a travel route difficulty level value comprises the following steps:
According to the traveling initial route information, a special road section distance value corresponding to the traveling initial route information and a room entrance and exit position point are called;
acquiring, from the fire scene environment detection information, the detection information corresponding to the room entrance and exit position points, and taking it as the entrance and exit environment detection information;
analyzing and processing the entrance environment detection information according to a preset entrance influence analysis method to form an entrance influence value;
analyzing and processing the fire scene environment detection information according to a preset light position extraction method to form special light position points;
according to the special light position points and the traveling initial route information, analyzing and calculating the distance value between the special light position points and the traveling initial route and taking it as the special light distance value;
according to a preset travel route difficulty level value calculation formula, analyzing and calculating the special road section distance value, the entrance influence value and the special light distance value to form the travel route difficulty level value, wherein the preset travel route difficulty level value calculation formula is: A = a·[(A1 - A0)/L] + b·A2 + c·(u·A3);
wherein A is the travel route difficulty level value;
a is the weight of the special road section distance value;
A1 is the special road section distance value;
A0 is a preset special road section reference distance value;
L is a preset total length of the travel initial route;
b is the weight of the entrance influence value;
A2 is the entrance influence value;
c is the weight of the special light distance value;
u is a scale factor of the special light distance value;
A3 is the special light distance value;
a + b + c = 1.
2. the intelligent visualized internal attack search and rescue control method according to claim 1, wherein the analyzing the entrance environment detection information to form the entrance influence value according to the preset entrance influence analysis method comprises:
according to the entrance and exit environment detection information, acquiring the unmanned aerial vehicle group flight strategy information and the unmanned son machine infrared distance detection value corresponding to the entrance and exit environment detection information, an unmanned aerial vehicle group being defined as a single unmanned mother machine and a plurality of unmanned son machines;
analyzing and acquiring initial detection information of an entrance and an exit according to a matching result of unmanned aerial vehicle unit flight strategy information and preset flight reference strategy information;
according to the corresponding relation between the infrared distance detection value of the unmanned aerial vehicle and the preset entrance detection adjustment information, the entrance detection adjustment information corresponding to the infrared distance detection value of the unmanned aerial vehicle is obtained through analysis;
According to the corresponding relation between the entrance initial detection information, the entrance detection adjustment information and the preset entrance detection correction information, analyzing and obtaining entrance detection correction information corresponding to the entrance initial detection information and the entrance detection adjustment information;
according to a preset entrance deviation influence analysis method, the entrance detection correction information is analyzed to form an entrance deviation influence value, and the entrance deviation influence value is used as an entrance influence value.
3. The intelligent visual internal attack search and rescue control method according to claim 2, wherein the execution method when acquiring the flight strategy information of the unmanned aerial vehicle comprises the following steps:
acquiring an infrared distance detection value of the unmanned aerial vehicle;
according to the infrared distance detection value of the unmanned aerial vehicle and a preset infrared distance reference value of the unmanned aerial vehicle, analyzing and calculating a difference value between the infrared distance detection value of the unmanned aerial vehicle and the infrared distance reference value of the unmanned aerial vehicle, and taking the difference value as an infrared distance deviation value of the unmanned aerial vehicle;
according to the corresponding relation between the unmanned aerial vehicle infrared distance deviation value and the preset unmanned aerial vehicle position adjustment information, analyzing and obtaining the unmanned aerial vehicle position adjustment information corresponding to the unmanned aerial vehicle infrared distance deviation value, and outputting the unmanned aerial vehicle position adjustment information;
Acquiring a plurality of unmanned mother-son machine distance detection values between a single unmanned mother machine and a plurality of unmanned son machines;
according to the distance detection values of the plurality of unmanned mother-son machines, an average value among the distance detection values of the plurality of unmanned mother-son machines is analyzed and calculated and used as an average distance value of the unmanned mother-son machines;
according to the distance detection values of the plurality of unmanned mother-son machines and the average distance value of the unmanned mother-son machines, analyzing and calculating the difference between the distance detection values of the plurality of unmanned mother-son machines and the average distance value of the unmanned mother-son machines and taking the difference as an adjustment value of the distance between the unmanned mother-son machines and the average distance value of the unmanned mother-son machines;
according to the corresponding relation between the unmanned mother-son machine distance adjustment value and the preset unmanned mother-son machine distance adjustment information, analyzing and obtaining the unmanned mother-son machine distance adjustment information corresponding to the unmanned mother-son machine distance adjustment value, and outputting the unmanned mother-son machine distance adjustment information.
4. The intelligent visualized internal attack search and rescue control method according to claim 2, wherein the analyzing the entrance detection correction information to form the entrance deviation influence value according to the preset entrance deviation influence analysis method includes:
according to the entrance detection correction information, an entrance area detection value corresponding to the entrance detection correction information and entrance shape detection information are called;
According to the entrance area detection value and a preset entrance area reference value, analyzing and calculating a difference value between the entrance area detection value and the entrance area reference value and taking the difference value as an entrance area deviation value;
according to the corresponding relation between the entrance area deviation value and the preset entrance area influence value, analyzing and obtaining an entrance area influence value corresponding to the entrance area deviation value;
according to the entrance shape detection information and preset entrance shape reference information, analyzing and obtaining deviation information between the entrance shape detection information and the entrance shape reference information and taking the deviation information as entrance shape deviation information;
according to the corresponding relation between the entrance shape deviation information and the preset entrance shape influence value, analyzing and obtaining an entrance shape influence value corresponding to the entrance shape deviation information;
and analyzing and acquiring the entrance comprehensive influence value corresponding to the entrance area influence value and the entrance shape influence value according to the corresponding relation between the entrance area influence value, the entrance shape influence value and the preset entrance comprehensive influence value, and taking the entrance comprehensive influence value as an entrance deviation influence value.
5. The intelligent visual internal attack search and rescue control method according to claim 1, wherein the analyzing the fire scene environment detection information to form the fire scene environment influence value according to the preset fire scene environment influence analysis method comprises:
retrieving, according to the fire scene environment detection information, the environmental temperature detection information and the trapped person detection information corresponding to the fire scene environment detection information;
according to a preset environmental temperature influence analysis method, analyzing and processing the environmental temperature detection information to form an environmental temperature influence value;
analyzing the trapped person detection information and the traveling initial route information according to a preset trapped person influence analysis method to form a trapped person influence value;
according to the corresponding relation between the environmental temperature influence value, the trapped person influence value and the preset environmental comprehensive influence value, analyzing and obtaining the environmental comprehensive influence value corresponding to the environmental temperature influence value and the trapped person influence value, and taking the environmental comprehensive influence value as the fire scene environmental influence value.
6. The intelligent visual internal attack search and rescue control method according to claim 5, wherein the analyzing the environmental temperature detection information to form the environmental temperature influence value according to the preset environmental temperature influence analysis method comprises:
according to the environmental temperature detection information, a high-temperature central position point, a high-temperature central temperature value and a low-temperature coverage area corresponding to the environmental temperature detection information are called;
according to the high-temperature central position point and the low-temperature coverage area, analyzing and calculating the furthest distance value between the high-temperature central position point and the low-temperature coverage area and taking it as the high-temperature coverage distance actual value;
retrieving, according to the high-temperature center temperature value, a high-temperature coverage distance reference value corresponding to the high-temperature center temperature value;
according to the high-temperature coverage distance reference value and the high-temperature coverage distance actual value, analyzing and calculating the difference between them and taking it as the temperature change moving distance value;
analyzing and processing the low-temperature coverage area according to a preset object matching method to form low-temperature area object type information and low-temperature area object position points;
according to the corresponding relation between the low-temperature area object type information and the preset object type influence value, analyzing and obtaining an object type influence value corresponding to the low-temperature area object type information;
according to the object position points in the low-temperature area and the high-temperature center position points, analyzing and calculating distance values between the object position points in the low-temperature area and the high-temperature center position points and taking the distance values as object high-temperature center distance values;
according to the object high-temperature center distance value and the high-temperature covering distance reference value, analyzing and calculating the difference value between the object high-temperature center distance value and the high-temperature covering distance reference value and taking the difference value as an object position deviation value;
according to a preset environmental temperature influence value calculation formula, analyzing and processing the temperature change moving distance value, the object type influence value and the object position deviation value to form the environmental temperature influence value, wherein the environmental temperature influence value calculation formula is: B = d·(v·B1) + e·B2 + f·(w·B3);
wherein B is the environmental temperature influence value;
d is the weight of the temperature change moving distance value;
v is a scale factor of the temperature change moving distance value;
B1 is the temperature change moving distance value;
e is the weight of the object type influence value;
B2 is the object type influence value;
f is the weight of the object position deviation value;
w is a scale factor of the object position deviation value;
B3 is the object position deviation value;
d + e + f = 1.
7. the intelligent visual internal attack search and rescue control method according to claim 6, wherein the analyzing the trapped person detection information and the traveling initial route information according to the preset trapped person influence analysis method to form the trapped person influence value comprises:
retrieving, according to the trapped person detection information, the trapped person value and the trapped person position points corresponding to the trapped person detection information;
according to the position points of each trapped person, analyzing and calculating the distance value between the position points of each trapped person and taking the distance value as the distance value between the trapped persons;
According to the trapped person position points and the traveling initial route information, analyzing and calculating distance values between the trapped person position points and the traveling initial route information and taking the distance values as trapped person route distance values;
according to the trapped person position points and the low-temperature area object position points, analyzing and calculating distance values between the trapped person position points and the low-temperature area object position points and taking the distance values as trapped person object distance values;
analyzing and calculating the trapped person value, the trapped person distance value, the trapped person route distance value and the trapped person object distance value according to a preset trapped person influence value calculation formula to form the trapped person influence value, wherein the trapped person influence value calculation formula is: C = g·(r·C2/C1) + h·(s·C3) + i·(t·C4);
wherein C is the trapped person influence value;
g is the weight of the trapped person density;
r is a distance scale factor of the trapped person density;
C1 is the number of trapped persons;
C2 is the trapped person distance value;
h is the weight of the trapped person route distance value;
s is a scale factor of the trapped person route distance value;
C3 is the trapped person route distance value;
i is the weight of the trapped person object distance value;
t is a scale factor of the trapped person object distance value;
C4 is the trapped person object distance value;
g + h + i = 1.
8. The intelligent visual internal attack search and rescue control method according to claim 7, wherein analyzing and processing the travel initial route information, the travel route difficulty level value and the fire scene environment influence value according to a preset travel route selection analysis method to form the travel route final information comprises:
acquiring a personnel number of a rescue worker;
analyzing and calculating the travelling route difficulty level value and the fire scene environment influence value according to a preset travelling route influence value calculation formula to form a travelling route influence value;
performing forward sorting on the travel route influence values, and selecting the travel initial route information corresponding to the top travel route influence values, as many as the number of rescue personnel, as the travel route final information, the forward sorting being defined as sorting the travel route influence values in descending order;
the travel route influence value calculation formula is: Z = j·A + k·B + l·C;
wherein Z is the travel route influence value;
j is the weight of the travel route difficulty level value;
A is the travel route difficulty level value;
k is the weight of the environmental temperature influence value;
B is the environmental temperature influence value;
l is the weight of the trapped person influence value;
C is the trapped person influence value;
j + k + l = 1, and l > k > j.
9. An intelligent visual internal attack search and rescue system, which is characterized by comprising:
the acquisition module (1) is used for acquiring the current position point of the rescue personnel, the fire scene environment detection information, the unmanned son machine infrared distance detection value and the unmanned mother-son machine distance detection value;
a memory (2) for storing a program of the intelligent visual internal attack search and rescue control method according to any one of claims 1 to 8;
a processor (3), configured to load and execute the program in the memory so as to implement the intelligent visual internal attack search and rescue control method according to any one of claims 1 to 8.
CN202311587697.2A 2023-11-27 2023-11-27 Intelligent visual internal attack search and rescue control method and system Active CN117289724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311587697.2A CN117289724B (en) 2023-11-27 2023-11-27 Intelligent visual internal attack search and rescue control method and system


Publications (2)

Publication Number Publication Date
CN117289724A true CN117289724A (en) 2023-12-26
CN117289724B CN117289724B (en) 2024-04-16

Family

ID=89252152


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1724603A2 (en) * 2005-05-17 2006-11-22 Hitachi, Ltd. System, method and computer program product for user interface operations for ad-hoc sensor node tracking
CN103164626A (en) * 2013-03-22 2013-06-19 合肥九信安全科技有限公司 Fire disaster high-risk unit fire risk assessment method based on control force analysis
GB202015243D0 (en) * 2020-09-25 2020-11-11 Connected Innovations Ltd Fire safety system and method
CN113639755A (en) * 2021-08-20 2021-11-12 江苏科技大学苏州理工学院 Fire scene escape-rescue combined system based on deep reinforcement learning
CN114912800A (en) * 2022-05-11 2022-08-16 北京达美盛软件股份有限公司 Early warning escape system and method
CN116527846A (en) * 2023-04-28 2023-08-01 王乐廷 Fire scene real-time monitoring command system based on digital cloud technology
CN116776528A (en) * 2023-04-03 2023-09-19 中国消防救援学院 Emergency rescue method and system based on computer simulation exercise
CN116884167A (en) * 2023-09-08 2023-10-13 山东舒尔智能工程有限公司 Intelligent fire control video monitoring and alarm linkage control system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant