CN112214025A - Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof - Google Patents


Info

Publication number
CN112214025A
CN112214025A
Authority
CN
China
Prior art keywords
intelligent
fire
image
robot
reconnaissance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011146352.XA
Other languages
Chinese (zh)
Inventor
邱雁
何宇
金航杰
Current Assignee
Zhejiang Jiangfeng Technology Co ltd
Original Assignee
Zhejiang Jiangfeng Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Jiangfeng Technology Co ltd filed Critical Zhejiang Jiangfeng Technology Co ltd
Priority to CN202011146352.XA priority Critical patent/CN112214025A/en
Publication of CN112214025A publication Critical patent/CN112214025A/en
Pending legal-status Critical Current

Classifications

    • A62C37/00 Control of fire-fighting equipment
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles, including:
        • G05D1/0214 defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
        • G05D1/0221 trajectory definition involving a learning process
        • G05D1/0223 trajectory definition involving speed control of the vehicle
        • G05D1/0238 optical position detecting means using obstacle or wall sensors
        • G05D1/024 obstacle or wall sensors in combination with a laser
        • G05D1/0242 optical detection using non-visible light signals, e.g. IR or UV signals
        • G05D1/0253 video camera with image processing, extracting relative motion information from successive images, e.g. visual odometry, optical flow
        • G05D1/0255 acoustic signals, e.g. ultrasonic signals
        • G05D1/0257 radar
        • G05D1/0276 signals provided by a source external to the vehicle
        • G05D1/0278 satellite positioning signals, e.g. GPS
    • G06T5/20 Image enhancement or restoration using local operators
    • G06T5/70 Denoising; Smoothing
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T2207/20056 Discrete and fast Fourier transform [DFT, FFT]


Abstract

The invention relates to an intelligent reconnaissance fire-extinguishing system that addresses the problems of the prior art. The system comprises field equipment, communication equipment, and terminal equipment. The communication equipment comprises a portable base station and a cloud server; the terminal equipment comprises a PC terminal, a command center, and a mobile phone terminal. The field equipment communicates with the cloud server through the portable base station, and the cloud server is communicatively connected with the PC terminal, the command center, and the mobile phone terminal. The field equipment comprises an intelligent reconnaissance unmanned aerial vehicle, an intelligent reconnaissance robot, an intelligent fire-extinguishing robot, a field command platform, and a portable reconnaissance and distribution control console, all of which are in communication connection with the portable base station. The intelligent reconnaissance robot and the intelligent fire-extinguishing robot are each provided with a GPS outdoor positioning device and an IMU indoor positioning device.

Description

Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof
Technical Field
The invention belongs to a fire extinguishing system and an application method thereof, and relates to an intelligent reconnaissance fire extinguishing system and a fire extinguishing control method thereof.
Background
In recent years, fire accidents have occurred frequently. From 2015 to 2019, 1.401 million fires were reported nationwide (including 344 major fires), causing 7,456 deaths, 4,693 injuries, and direct property losses of 18.557 billion yuan. At present, the protection of firefighters has serious shortcomings; in particular, determining the temperature of dangerous goods at a fire site is highly problematic, so danger arises easily and firefighters are harmed.
Chinese patent application No. CN201020236640.X discloses an intrinsically safe infrared thermal imager for mining, which comprises: an optical system for receiving the infrared radiation of the measured object; an infrared detector for converting the infrared radiation received by the optical system into an electrical signal; and signal processing and system control equipment for converting the electrical signal generated by the infrared detector into an infrared video image. The system also comprises a visible light source for illumination so that the optical system can take visible-light pictures. This thermal imager can capture an infrared thermal image and a visible-light picture at the same time, and by comparing the two it can intuitively and accurately reveal abnormal conditions of the measured object, greatly reducing the probability of judgment errors. However, even with such techniques, the problems of high labor intensity in manual inspection and high danger in manual fire extinguishing remain.
Disclosure of Invention
The invention addresses the problems of high labor intensity during manual inspection and high danger during manual fire extinguishing in the prior art by providing an intelligent reconnaissance fire extinguishing system and a fire extinguishing control method thereof.
The technical scheme adopted by the invention to solve the technical problems is as follows: an intelligent reconnaissance fire extinguishing system comprises field equipment, communication equipment, and terminal equipment. The communication equipment comprises a portable base station and a cloud server; the terminal equipment comprises a PC terminal, a command center, and a mobile phone terminal. The field equipment communicates with the cloud server through the portable base station, and the cloud server is communicatively connected with the PC terminal, the command center, and the mobile phone terminal. The field equipment comprises an intelligent reconnaissance unmanned aerial vehicle, an intelligent reconnaissance robot, an intelligent fire-extinguishing robot, a field command platform, and a portable reconnaissance and distribution control console, all of which are in communication connection with the portable base station. The intelligent reconnaissance robot and the intelligent fire-extinguishing robot are each provided with a GPS outdoor positioning device and an IMU indoor positioning device. By combining the intelligent reconnaissance unmanned aerial vehicle, the intelligent reconnaissance robot, and the intelligent fire-extinguishing robot to assist firefighters in patrol and fire extinguishing, the system realizes patrol and fire-extinguishing work that is as safe and efficient as possible.
Preferably, the field equipment further comprises at least one of a handheld intelligent fire-fighting thermal infrared imager or a wearable intelligent fire-fighting thermal infrared imager. In the invention, the handheld or wearable imager is carried by a firefighter, who receives command instructions over the wireless communication equipment to extinguish fire on site, and can also cooperate with a robot for combined fire extinguishing.
Preferably, the handheld intelligent fire-fighting thermal infrared imager or the wearable intelligent fire-fighting thermal infrared imager comprises a casing and a circuit unit, and the circuit unit is connected with the casing through a silicone part.
Preferably, the casing is made of PET material; a convection groove for heat dissipation is arranged between the casing and the circuit unit; the surface of the casing is sand-blasted; and the sand-blasted surface is coated with a layer of 300 °C silicone high-temperature-resistant paint. High-temperature-resistant paint: a layer of 300 °C silicone high-temperature-resistant paint is applied to the sand-blasted surface, which in practice keeps the surface temperature to about 260 °C. In tests, after half an hour in a high-temperature box at 300 °C, the surface of a zinc-plated plate (50 x 50 x 2 mm) coated with the high-temperature-resistant paint remained essentially free of deformation, while the surface of an uncoated zinc-plated plate deformed. Sand-blasting treatment: the outer surface is sand-blasted to further reduce the direct radiation effect of high temperature on the surface of the injection-molded part and to reinforce the adhesion of the high-temperature-resistant paint, so that the temperature of the injection-molded surface is in practice controlled to about 210 °C, covering the part with a thick heat-insulating coat. PET material: PET is a milky-white or pale-yellow, highly crystalline polymer with a smooth, glossy surface. It is a thermoplastic engineering resin with good stability and excellent performance, and its outstanding property is the ability to withstand high temperatures for long periods.
Its high heat resistance, together with excellent flame resistance and UL laboratory certification, allows PEI resins to meet the stringent requirements of high-temperature applications and allows the temperature of the PET inner surface to be controlled to about 110 °C.
Preferably, the circuit unit of the handheld intelligent fire-fighting thermal infrared imager comprises an STC single-chip microcomputer, a display circuit, a battery power monitoring circuit, a power supply circuit, a character superposition chip, and a temperature measurement chip. The STC single-chip microcomputer, the display circuit, the character superposition chip, and the temperature measurement chip are all powered by the power supply circuit; the battery power monitoring circuit is connected to port P3.5 of the STC single-chip microcomputer; and the display circuit, the character superposition chip, and the temperature measurement chip are all electrically connected to the STC single-chip microcomputer. The character superposition chip allows the thermal imager to overlay numeric annotations directly onto the image of an otherwise ordinary thermal imager, further improving the safety of using the invention.
An intelligent reconnaissance fire-extinguishing control method, suitable for the above intelligent reconnaissance fire-extinguishing system, comprises the following steps: step one, the field equipment collects field data; step two, the cloud server processes the field data; step three, the terminal equipment receives data and issues command instructions; step four, the field equipment executes the corresponding fire-fighting work according to the instructions. The method is characterized in that, in step four, the robots in the field equipment operate in a leader-based swarm-robot formation and navigation control mode, and on-site positioning of the robots uses an indoor positioning method combining wireless positioning with an IMU (inertial measurement unit).
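The wireless-plus-IMU indoor positioning idea can be illustrated with a minimal sketch: dead-reckon from IMU accelerations between wireless fixes, and blend the estimate toward each fix when one arrives. The function names, the correction gain, and the use of a simple complementary blend (rather than a full Kalman filter) are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def fused_track(accels, dt, wireless_fixes, alpha=0.3):
    """Sketch of wireless + IMU indoor positioning fusion.

    accels: per-step 2-D accelerations from the IMU
    dt: sampling interval in seconds
    wireless_fixes: dict mapping step index -> measured (x, y) position
    alpha: correction gain applied when a wireless fix arrives
    Dead-reckons between fixes, blends toward each fix on arrival.
    """
    pos = np.zeros(2)
    vel = np.zeros(2)
    track = []
    for i, a in enumerate(accels):
        vel = vel + np.asarray(a, dtype=float) * dt   # IMU integration
        pos = pos + vel * dt
        if i in wireless_fixes:                       # wireless correction
            fix = np.asarray(wireless_fixes[i], dtype=float)
            pos = (1 - alpha) * pos + alpha * fix
        track.append(pos.copy())
    return np.array(track)
```

In practice the blend would run at the wireless beacon rate while the IMU integration runs much faster, which is why the two sources complement each other indoors.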
Preferably, in step three, VR command is implemented by an active shutter time-sharing display method, which comprises the following substeps: the model vertex coordinates, patch information, and camera position of the scene space are computed and transferred into the video memory of the graphics card, which performs the linear matrix calculations; taking the camera of the original scene as the base point, the image is split into two frames rendered from viewpoints offset 3 cm to the left and to the right, forming two groups of continuously interleaved pictures corresponding to the left and right eyes, so as to simulate the views seen by a person's two eyes;
the left and right lens shutters of synchronized shutter-type 3D glasses are then switched so that the left and right eyes each see the corresponding picture, forming a VR image for the commander to use.
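The two-viewpoint rendering step above can be sketched as follows: the scene camera is displaced ±3 cm along its right axis to produce the left-eye and right-eye view positions. The function name and the explicit right-axis argument are illustrative assumptions for this sketch.

```python
import numpy as np

def stereo_eye_positions(cam_pos, right_axis, half_offset=0.03):
    """Offset the scene camera +/- 3 cm along its right axis to get
    the left-eye and right-eye viewpoints for time-shared rendering."""
    right_axis = np.asarray(right_axis, dtype=float)
    right_axis = right_axis / np.linalg.norm(right_axis)  # unit right vector
    cam_pos = np.asarray(cam_pos, dtype=float)
    left_eye = cam_pos - half_offset * right_axis
    right_eye = cam_pos + half_offset * right_axis
    return left_eye, right_eye
```

Each eye position would then be used to build its own view matrix, and the two rendered frames are alternated in sync with the shutter glasses.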
Preferably, the cloud server restores and repairs images using an inverse filter and a Wiener filter, comprising the following substeps:
restoration substep one: extract the original blurred image;
restoration substep two: judge and select the point spread function (PSF); if the blur is defocus blur, extract the blur radius using an edge extraction operator;
restoration substep three: perform a two-dimensional Fourier transform on the PSF to obtain the spectrum of the Wiener filter;
restoration substep four: expand the original image to a size suitable for the Fourier transform;
restoration substep five: perform a two-dimensional Fourier transform on the expanded image to obtain its spectrum;
restoration substep six: multiply the spectrum of the Fourier-transformed image by the spectrum of the Wiener filter to obtain the spectrum of the restored image;
restoration substep seven: perform an inverse Fourier transform on the spectrum of the restored image to obtain the expanded restored image;
restoration substep eight: apply low-pass filtering for noise removal and equalization to the expanded restored image;
restoration substep nine: extract the original-sized image from the upper-left corner of the expanded restored image, finally obtaining the restored and repaired version of the original blurred image.
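Restoration substeps three through seven can be sketched in a few lines of NumPy. This is a generic frequency-domain Wiener deconvolution under an assumed constant noise-to-signal ratio k; the padding convention and the value of k are illustrative choices, not values from the patent.

```python
import numpy as np

def wiener_restore(blurred, psf, k=0.01):
    """Frequency-domain Wiener restoration (sketch of substeps three to seven).

    blurred: 2-D grayscale image; psf: smaller 2-D point spread function;
    k: assumed noise-to-signal power ratio (regularization constant).
    """
    # Substep four: pad the PSF up to the image size
    psf_pad = np.zeros_like(blurred)
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf
    # Substeps three and five: 2-D Fourier transforms
    H = np.fft.fft2(psf_pad)            # spectrum of the PSF
    G = np.fft.fft2(blurred)            # spectrum of the blurred image
    # Wiener filter spectrum: conj(H) / (|H|^2 + k)
    W = np.conj(H) / (np.abs(H) ** 2 + k)
    # Substeps six and seven: multiply spectra, then inverse transform
    return np.real(np.fft.ifft2(G * W))
```

The regularization constant k prevents division by near-zero values of the PSF spectrum, which is what distinguishes the Wiener filter from a plain inverse filter.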
Preferably, in step four, the robots in the field equipment operate in a leader-based swarm-robot formation and navigation control mode, comprising the following substeps:
robot control substep one: establish a leader-based consensus formation for the intelligent reconnaissance robots and intelligent fire-extinguishing robots;
robot control substep two: realize leader-based formation control, in which consensus on the leader's alignment behavior drives the heading angles of the swarm robots to agree so that the swarm moves in the same direction, while aggregation and dispersion behaviors drive the relative distances of the swarm robots to agree so that the swarm keeps a given formation;
robot control substep three: divide formation control into a target layer, a leading layer, and a following layer; target information from the target layer is sent to the leaders of the leading layer, and each leader moves toward the target point using a position-consensus method; meanwhile, each leader sends its own state information to its corresponding follower group in the following layer, and under the formation control method the followers keep a given formation shape while moving toward the target.
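The leader-follower layering above can be illustrated with a minimal discrete-time sketch: the leader steps toward the target, and each follower steps toward the leader's position plus its assigned formation offset. The gains, names, and single-integrator dynamics are illustrative assumptions, not details from the patent.

```python
import numpy as np

def formation_step(leader, target, followers, offsets, gain=0.2):
    """One discrete-time update of leader-based formation control.

    leader, target: 2-D positions; followers: (n, 2) array of positions;
    offsets: (n, 2) desired displacement of each follower from the leader.
    """
    leader = leader + gain * (target - leader)            # leader seeks target
    desired = leader + offsets                            # formation slots
    followers = followers + gain * (desired - followers)  # followers track slots
    return leader, followers
```

Iterating this update drives the leader to the target point and the followers into their formation slots, which is the qualitative behavior the three-layer scheme describes.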
The substantial effect of the invention is as follows: by combining an intelligent reconnaissance unmanned aerial vehicle, an intelligent reconnaissance robot, and an intelligent fire-extinguishing robot to assist firefighters in patrol and fire extinguishing, the system realizes patrol and fire-extinguishing work that is as safe and efficient as possible.
Drawings
FIG. 1 is a schematic structural diagram of the present invention;
FIG. 2 is a schematic circuit diagram of an STC single chip microcomputer part of the handheld intelligent fire-fighting thermal infrared imager in the invention;
FIG. 3 is a schematic circuit diagram of a display circuit interface portion of the handheld intelligent fire-fighting thermal infrared imager according to the present invention;
FIG. 4 is a schematic circuit diagram of a battery power monitoring portion of the handheld intelligent fire-fighting thermal infrared imager according to the present invention;
FIG. 5 is a schematic circuit diagram of a power circuit portion of the handheld intelligent fire-fighting thermal infrared imager of the present invention;
FIG. 6 is a schematic circuit diagram of a character superposition chip part of the handheld intelligent fire-fighting thermal infrared imager according to the present invention;
Detailed Description
The technical solution of the present invention will be further specifically described below by way of specific examples.
Example 1:
An intelligent reconnaissance fire extinguishing system (see FIG. 1) comprises field equipment, communication equipment, and terminal equipment. The communication equipment comprises a portable base station and a cloud server; the terminal equipment comprises a PC terminal, a command center, and a mobile phone terminal. The field equipment communicates with the cloud server through the portable base station, and the cloud server is communicatively connected with the PC terminal, the command center, and the mobile phone terminal. The field equipment comprises an intelligent reconnaissance unmanned aerial vehicle, an intelligent reconnaissance robot, an intelligent fire-extinguishing robot, a field command platform, and a portable reconnaissance and distribution control console, all of which are in communication connection with the portable base station. The intelligent reconnaissance robot and the intelligent fire-extinguishing robot are each provided with a GPS outdoor positioning device and an IMU indoor positioning device. The field equipment also comprises at least one of a handheld intelligent fire-fighting thermal infrared imager or a wearable intelligent fire-fighting thermal infrared imager. Each imager comprises a casing and a circuit unit, and the circuit unit is connected with the casing through a silicone part.
The casing is made of PET material; a convection groove for heat dissipation is arranged between the casing and the circuit unit; the surface of the casing is sand-blasted; and the sand-blasted surface is coated with a layer of 300 °C silicone high-temperature-resistant paint. High-temperature-resistant paint: a layer of 300 °C silicone high-temperature-resistant paint is applied to the sand-blasted surface, which in practice keeps the surface temperature to about 260 °C. In tests, after half an hour in a high-temperature box at 300 °C, the surface of a zinc-plated plate (50 x 50 x 2 mm) coated with the high-temperature-resistant paint remained essentially free of deformation, while the surface of an uncoated zinc-plated plate deformed. Sand-blasting treatment: the outer surface is sand-blasted to further reduce the direct radiation effect of high temperature on the surface of the injection-molded part and to reinforce the adhesion of the high-temperature-resistant paint, so that the temperature of the injection-molded surface is in practice controlled to about 210 °C, covering the part with a thick heat-insulating coat. PET material: PET is a milky-white or pale-yellow, highly crystalline polymer with a smooth, glossy surface. It is a thermoplastic engineering resin with good stability and excellent performance, and its outstanding property is the ability to withstand high temperatures for long periods.
Its high heat resistance, together with excellent flame resistance and UL laboratory certification, allows PEI resins to meet the stringent requirements of high-temperature applications and allows the temperature of the PET inner surface to be controlled to about 110 °C. The circuit unit of the handheld intelligent fire-fighting thermal infrared imager comprises an STC single-chip microcomputer, a display circuit, a battery power monitoring circuit, a power supply circuit, a character superposition chip, and a temperature measurement chip. The STC single-chip microcomputer, the display circuit, the character superposition chip, and the temperature measurement chip are all powered by the power supply circuit; the battery power monitoring circuit is connected to port P3.5 of the STC single-chip microcomputer; and the display circuit, the character superposition chip, and the temperature measurement chip are all electrically connected to the STC single-chip microcomputer.
The STC single-chip microcomputer is an STC12C5204AD chip, and the character superposition chip is a MAX7456EUI+ chip. The DVDD port of the character superposition chip is connected to the power supply circuit and grounded through capacitor C16; the CLKI and XFB ports are connected to a crystal oscillator; the CS, SDIN, SCLK, and SDOUT ports are each connected to the corresponding input/output port of the STC single-chip microcomputer; the LOS port is grounded through resistor R16; the VOUT port is connected through capacitor C15 to the first end of resistor R12, whose second end is connected to the display circuit; the SAG port is connected through capacitor C17 to the first end of resistor R12; the PVDD port is connected to the power supply circuit and grounded through capacitor C18; the VIN port is grounded, a PGND; the second end of capacitor C19 is grounded through resistor R13 and connected to the display circuit; the AVDD port is connected to the power supply circuit and grounded through capacitor C20; and the AGND port is grounded.
In order to improve the high-temperature resistance of this embodiment, a silica gel piece is used for bonding: silica gel is heat- and cold-resistant, with a service temperature of 100-300 ℃, and has excellent weather resistance, ozone resistance and good insulation. After this treatment, the ambient temperature of the core module of the liquid crystal display unit is kept below 60 ℃ and that of the core module of the acquisition and processing unit below 80 ℃. Meanwhile, a convection groove can be designed between the PET material and the silica gel piece to increase the convection speed of hot air, accelerate heat dissipation and reduce heat accumulation, so that the temperature acting on the silica gel surface is actually controlled to about 100 ℃.
In order to further improve the safety and quality of the product, metal oxide film resistors are selected in this embodiment. A metal oxide film resistor is made with a high-temperature combustion technique: a solution of tin and tin compounds is sprayed in a constant-temperature furnace at about 500 ℃ and coated onto a high-thermal-conductivity ceramic rod, burning a metal oxide film onto the ceramic matrix, after which a non-combustible coating is sprayed on the outer layer. Such resistors remain stable at high temperature; they are typically strongly resistant to acids, alkalis and salt mist and are suited to working in severe environments, while also offering low noise, stability and good high-frequency characteristics.
Capacitor selection: in a circuit, the capacitor blocks DC, couples, bypasses, filters, stores energy and adjusts time constants, and is an indispensable basic device. However, a common aluminum electrolytic capacitor heated under high temperature and high pressure expands to form gas and bursts at a certain pressure. To operate safely and reliably in this special environment, this embodiment adopts tantalum polymer capacitors from the U.S. company KEMET, one of the world's largest makers of tantalum polymer capacitors and the inventor of the tantalum capacitor; its initial development was motivated by the need to replace unreliable aluminum capacitors in fighter fire-control radar. KEMET's tantalum polymer capacitors have excellent thermal stability: manufacturer tests show a service life exceeding 2000 hours at a working temperature of 150 ℃ and more than 20,000 hours at 125 ℃, with simulated operation running 1,000,000 hours before one failure occurred.
The intelligent reconnaissance robot in this embodiment carries various sensors and can collect fire-scene data including toxic gas content, inflammable and explosive gas content, oxygen content, smoke concentration, temperature, humidity, infrared images and visible-light images. It has communication capability and sends information to the real-time data processing center at regular intervals. The robot combines GPS outdoor positioning with IMU indoor positioning to meet the requirements of complex fire-fighting environments, and reports its own position information, including horizontal position, vertical position and speed, to the data processing center in real time. Through multiple laser radars and 360° ultrasonic sensors it can easily avoid unknown obstacles and walk safely among mobile devices and personnel; it can rotate 360° in place, move forward or backward without turning around, and pass freely through narrow spaces. It reports its working capacity, including remaining power and damage condition, to the data processing center in real time, can receive command platform instructions, and can navigate in real time according to the data processing center/command platform map. It can be controlled from a PAD, mobile phone or computer, and supports multiple operation modes, preset task sequencing, virtual sensors, and temporary calling and simultaneous control of 1-99 units.
The basic performance of the intelligent reconnaissance unmanned aerial vehicle in this embodiment should reach: wind resistance of force 8; flight range greater than 10 kilometers; maximum flight altitude above 1000 meters; maximum climbing speed above 5 meters per second; maximum flight speed above 15 meters per second; load capacity of 10 kilograms; and endurance of not less than 50 minutes. Working humidity: 10-90%, non-condensing, with rain resistance. Its safety functions include return on loss of control, low-battery alarm, low-battery return, and automatic obstacle avoidance. Based on the same UAV platform, it can carry various sensors and collect fire-scene data including toxic gas content, inflammable and explosive gas content, oxygen content, smoke concentration, temperature, humidity, infrared images and visible-light images; it has communication capability and sends information to the data processing center in real time at regular intervals.
The intelligent fire-extinguishing robot in this embodiment can receive instructions from the data processing center, automatically move to a high-temperature position at the fire scene and carry out fire-extinguishing operations. It calculates the preliminary flame position from infrared and visible-light video image recognition, drives the fire monitor accordingly, and performs secondary image recognition and accurate positioning so that the water column's drop point coincides with the flame position; the deviation between the two can also be judged by manually observing the video and corrected under manual control until they finally coincide. The cloud intelligent data processing center in this embodiment should possess a fire-alarm acceptance and identification function: it can receive fire alarm information transmitted from the fire alarm system, automatically call up fire-protection geographic information such as the geographic topographic map of the disaster area, building BIM information, fire-protection facilities and the water source distribution map, and identify the fire alarm category. It should also possess an aid-decision function: macroscopic scheduling and command decisions are made using artificial intelligence technology and comprehensive database information, focusing mainly on complex environments such as sites with inflammable and explosive chemical dangerous goods, high-rise buildings, large-scale traffic accidents and natural disasters; a combined action scheme is compiled in real time according to factors such as disaster type, meteorological and geographic environment and fire-fighting strength, and information such as action commands, action schemes, fire-fighting force deployment diagrams and operation plans is sent to the relevant commanders in multimedia form.
It also has fire-extinguishing data processing functions: statistical charts such as histograms and pie charts can be plotted, and a special electronic map can be generated to assist commanders in the statistical analysis of various data according to geographic conditions. It can record, sort, query and statistically analyze items such as fire alarm category, time period, action sequence, team category, area, and the fire-extinguishing scheme used and its effect, for later summarization. A fire-scene reinforcement scheduling function is also provided: repeated alarms, fire-scene reinforcement requests, cooperative acceptance and other situations are automatically processed and recorded; fire-scene condition feedback reports can be received and recorded at any time, and reinforcement scheduling and the organization and coordination of rescue materials are carried out promptly through the wired/wireless communication system according to the fire-scene situation and actual needs, ensuring smooth scheduling and command. Information query and management functions: a Web remote query module ensures convenient information query anytime and anywhere; query conditions are set flexibly and conveniently, supporting both accurate and fuzzy queries. The system lets commanders at every level and related departments know and master the fire alarm situation at any time, so as to query the fire alarm situation (fire location, dispatch situation and related support), defense strength (vehicles, personnel, time, etc.), water source facilities, the related units, and so on.
The system can manage fire roads, fire water sources, fire-fighting operation spaces, danger characteristics, fire-fighting methods, fire vehicles (including types, quantities, tactical-technical performance and operating methods), fire-extinguishing agents (including types, quantities and usage and calculation methods), unit building structures, fire-resistance grades, internal fire-fighting facilities, fire-protection networks and the like. If necessary, a remote terminal can provide information support to the site via Web access to the server.
The cloud intelligent data processing center in this embodiment also provides a simulated drill and recording function: the drill system is combined with the fire alarm and command platforms, operated modularly and used in combination, facilitating daily combat-readiness training. It can provide simulation drill functions for operators, commanders and staff officers respectively. The simulation offers multiple difficulty selections, automatic scoring, whole-course simulation and whole-course monitoring before and during sessions, and provides various analysis tools to improve the training effect. Fire-fighting command simulation training provides commanders with simulation analysis aids such as fire-fighting command plotting, cooperative fire-fighting command, pre-battle fire-fighting exercises and fire-fighting assessment.
An intelligent reconnaissance fire-extinguishing control method, suitable for the above intelligent reconnaissance fire-extinguishing system, comprises the following steps: first, the field devices collect field data; second, the cloud server processes the field data; third, the terminal devices receive and issue command instructions; fourth, firefighters equipped with wearable intelligent fire-fighting thermal infrared imagers cooperate with the field devices to execute the corresponding fire-fighting work according to the commands. The method is characterized in that in the fourth step, the robots among the field devices operate in a leader-based group robot formation and navigation control mode, and the robots' field positioning is based on an indoor positioning method combining wireless positioning with an IMU (inertial measurement unit).
In the second step, the image provided in this embodiment needs to be restored. The picture is obtained by field shooting, and the restoration technique addresses motion-blurred and defocus-blurred images. In the two common cases, the speed and height are unchanged at the moment of exposure: motion blur is mainly caused by uniform linear motion, so the degree of blur is equal across the image; defocus blur is generated on the original image by an auxiliary light source, its influence decreasing as it radiates outward from the center of the light source, so the degree of blur varies across the image in a disk shape. Image restoration techniques propose effective processing methods such as inverse filtering, wiener filtering and constrained iterative methods according to the quantity and characteristics of the prior information about the ideal image. The wiener filter has better denoising performance, minimizing the mean square error between the original image and the restored image.
In this embodiment, an inverse filter and a wiener filter are mainly used to process the image. In a linear, shift-invariant motion blur system, the blurred image g(x, y) can be represented as the two-dimensional convolution of the original image f(x, y) with the point spread function (PSF) h(x, y):
g(x, y) = f(x, y) * h(x, y) + n(x, y) (1)
where * denotes two-dimensional convolution and n(x, y) is additive noise. The Fourier transform of equation (1) is:
G(u,v)=F(u,v)H(u,v)+N(u,v) (2)
where G(u, v), H(u, v) and N(u, v) are the Fourier transforms of g(x, y), h(x, y) and n(x, y) respectively, and H(u, v) is called the modulation transfer function (MTF) of the optical imaging system. If the influence of noise is neglected, an approximation of F(u, v) can be calculated directly, that is:
F̂(u, v) = G(u, v) / H(u, v) (3)
equation (3) represents the inverse filter recovery method. Obviously, because of the noise effect, the calculation error of the inverse filtering is:
E(u, v) = F̂(u, v) − F(u, v) = N(u, v) / H(u, v) (4)
As can be seen from the above formula, if H(u, v) is zero or small, the calculation error becomes large and the recovery result poor. In a degraded system, the noise N(u, v) generally varies slowly, close to a constant, while the transfer function H(u, v) decays to zero near the high frequencies, so the inverse-filtering error E(u, v) is over-amplified at these frequencies, affecting the recovery quality. In practical applications, to solve the noise-amplification problem at the zeros of H(u, v) and reduce the calculation error, the inverse filter is usually modified as follows:
F̂(u, v) = [H*(u, v) / (|H(u, v)|² + r)] G(u, v) (5)
where H*(u, v) is the complex conjugate of H(u, v) and r is the signal-to-noise ratio term. Equation (5) is called wiener filtering, i.e. a minimum mean square error filter based on statistical characteristics. It performs better than inverse filtering when the image is affected by noise, and the stronger the noise, the more obvious the advantage. The value of r generally ranges from 0.0001 to 0.01: typically, when r > 0.01 the ghost effect occurs, and when r < 0.0001 the ringing effect becomes prominent, so selecting an appropriate r eliminates or suppresses ghosting and ringing in the central area of the image. The transfer function H_w(u, v) of the wiener filter is expressed as:
H_w(u, v) = H*(u, v) / (|H(u, v)|² + r)
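The wiener filter of equation (5) can be sketched directly with NumPy FFTs. This is an illustrative example, not the patent's implementation: the image size, the box-shaped defocus PSF and the choice r = 0.0001 (the low end of the range above) are assumptions.

```python
import numpy as np

def wiener_deconvolve(g, h, r):
    """Apply the wiener filter H*(u,v) / (|H(u,v)|^2 + r) of equation (5)
    in the frequency domain and return the restored image."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(h, s=g.shape)              # zero-pad the PSF to the image size
    Hw = np.conj(H) / (np.abs(H) ** 2 + r)     # wiener transfer function
    return np.real(np.fft.ifft2(G * Hw))

# Toy demonstration: blur a random image with a 5x5 box PSF (circular
# convolution), then restore it with r inside the 0.0001-0.01 range.
rng = np.random.default_rng(0)
f = rng.random((64, 64))
h = np.zeros((64, 64)); h[:5, :5] = 1.0 / 25.0
g = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(h)))
f_hat = wiener_deconvolve(g, h, r=1e-4)
print(np.abs(f_hat - f).mean(), np.abs(g - f).mean())  # restored error is far smaller
```

With a noise-free blur, the remaining restoration error is concentrated at the frequencies where |H(u, v)| is small, which is exactly where the r term prevents the division blow-up of the plain inverse filter.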
the wiener filtering algorithm is a filter which is realized in a frequency domain and meets the condition of the minimum mean square error between an original image and a recovered image, so that the wiener filtering algorithm has some limitations and is more obvious in the Gibbs effect. The gibbs effect is a phenomenon in which original image information is covered due to oscillation between adjacent pixel points, so that an image shows interference fringes where a change in gray scale is severe. When discrete Fourier transform is carried out, the image data is required to have periodicity, and in practice, the image almost cannot meet the condition, pixel points must be supplemented at the edge of the image to meet the requirement of the Fourier transform, if a wiener filter is directly used, the value of an expansion part of the pixel points is defaulted to 0, a large edge error can be brought, the solution can adopt a degraded edge, the edge is reduced, and the gray level of the edge of the image is smoothly transited to the edge meeting the size of the Fourier transform image. Setting the original image, the smoothed edge image and the smoothing according to the inverse process of the change of the original image as follows:
[Three equation images in the source define the original image, the smoothed-edge image and the smoothing rule; they are not recoverable from the text.]
the image is expanded to the size required by Fourier transformation according to the rule and is stopped, so that the original image can be expanded to meet the size required by Fourier transformation, a pseudo edge with violent gray scale change is not formed at the edge of the image, the shape of the original image is kept, and the occurrence of Gibbs effect can be inhibited.
Selecting the wiener filter and an appropriate signal-to-noise ratio r solves the influence of noise in the blurred image and suppresses the ringing effect; different point spread function models are selected for different blur categories, such as the defocus-blur or motion-blur point spread function models; and the Gibbs effect is suppressed by the periodic edge-filling method at the image edges. With these key technologies solved, the complete image restoration process can be expressed as follows: the cloud server restores the image using an inverse filter and a wiener filter, comprising the following sub-steps:
a first restoring and repairing substep, extracting an original blurred image;
a second restoring and repairing substep, judging and selecting the point spread function PSF; if the blur is defocus blur, extracting the blur radius with an edge extraction operator;
a third recovery and repair substep, performing two-dimensional Fourier transform on the point spread function PSF to obtain the frequency spectrum of the wiener filter;
a fourth recovery and restoration substep, namely expanding the original image to obtain an image meeting the size required by Fourier transform;
a fifth recovery and repair substep, performing two-dimensional Fourier transform on the expanded image to obtain the frequency spectrum of the image;
a sixth recovery and restoration substep, namely multiplying the frequency spectrum of the image after Fourier transform by the frequency spectrum of the wiener filter to obtain the frequency spectrum of the recovered image;
a restoration repair sub-step seven of performing inverse Fourier transform on the frequency spectrum of the restored image to obtain an expanded restored image;
a restoring and repairing substep eight, which is to carry out low-pass filtering noise removal and equalization processing on the expanded restored image;
and a ninth restoring and repairing substep, extracting the original image from the upper left corner of the expanded restored image, finally obtaining the restored image of the original blurred image. Through this image enhancement and restoration system, the image displayed at the backend can be well restored; backend personnel can transmit instructions to the firefighters based on the image, safeguarding the firefighters' lives.
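The nine sub-steps can be sketched end to end as follows. The padding scheme is an assumption: the patent's smoothed-edge expansion is given only as equation images, so a linear ramp toward the image mean stands in for it, and the restored image is cropped from the padded interior rather than the upper-left corner.

```python
import numpy as np

def restore(blurred, psf, r=1e-3, pad=16):
    """Sketch of the restoration sub-steps: expand the image with smoothly
    decaying edges, filter in the frequency domain, and crop back."""
    # sub-step 4: expand the original image; the ramp avoids a pseudo edge
    g = np.pad(blurred, pad, mode="linear_ramp", end_values=blurred.mean())
    # sub-steps 3 and 5: spectra of the wiener filter and the expanded image
    H = np.fft.fft2(psf, s=g.shape)
    Hw = np.conj(H) / (np.abs(H) ** 2 + r)
    G = np.fft.fft2(g)
    # sub-steps 6-7: multiply the spectra and inverse-transform
    restored = np.real(np.fft.ifft2(G * Hw))
    # sub-step 9 (adapted): extract the original-size image from the padded result
    return restored[pad:pad + blurred.shape[0], pad:pad + blurred.shape[1]]

out = restore(np.ones((32, 32)), np.ones((3, 3)) / 9.0)
print(out.shape)  # -> (32, 32)
```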
In the third step, the method for realizing VR command by the active shutter type time-sharing display method comprises the following steps:
a first substep of the active shutter type time-sharing display method: transmitting the model vertex coordinate positions, patch information and camera position in the scene space into the video memory of the graphics card through calculation, calling the graphics card to perform linear matrix calculations, dividing the image into two frames with the camera in the original scene as the base point, and offsetting them 3 cm to the left and right for rendering, forming two groups of continuously interleaved pictures corresponding to the left and right eyes, so as to simulate the views seen by the two eyes of a human;
a second substep of the active shutter type time-sharing display method: synchronously controlling the left and right lens shutters of the shutter-type 3D glasses so that the left and right eyes see their corresponding pictures, thereby forming a VR image for the user to command with.
In the fourth step, the robots in the field devices are operated in a leader-based group robot formation and navigation control mode, and the method comprises the following substeps:
a first robot control substep, establishing a leader-based consistency formation for the multiple intelligent reconnaissance robots and intelligent fire-extinguishing robots;
a second robot control substep, realizing leader-based formation control: leader-based alignment-behavior consistency controls the direction angles of the group robots to become consistent, so that the group robots move in the same direction, while the aggregation and dispersion behaviors control the relative distances of the group robots to become consistent, so that the group robots maintain a certain formation;
a third robot control substep, dividing formation control into a target layer, a leader layer and a follower layer: the target information of the target layer is sent to the leaders of the leader layer, and each leader moves toward the target point by a position consistency method; meanwhile, each leader sends its own state information to its corresponding follower group in the follower layer, and leader and followers keep a certain formation while moving toward the target. More specifically:
The system comprises a plurality of robots whose network topology graph is G = {V, E}, where V = {1, 2, 3, …, n} is the vertex set of the network topology graph and E ⊆ V × V is its edge set. If the graph is undirected, the node pairs are unordered, i.e. (i, j) ∈ E ⟺ (j, i) ∈ E. G is also called the information interaction graph.
The adjacency matrix A(G) corresponding to the interaction graph G is an n × n matrix A(G) = [a_ij] ∈ R^(n×n), where
a_ij = 1 if e_ij ∈ E, and a_ij = 0 otherwise,
e_ij being the edge connecting nodes i and j. The information interaction matrix corresponding to the interaction graph G is C(G) = A(G) + D(G) = [c_ij], where D(G) is an n × n diagonal matrix with non-positive diagonal elements, i.e.
d_ii = −Σ_{j=1}^{n} a_ij, d_ij = 0 for i ≠ j.
All elements of C(G) off the main diagonal are non-negative real numbers, and the elements of each row sum to zero.
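These definitions can be checked with a short sketch (the example graph and the function name are illustrative, not from the patent):

```python
import numpy as np

def interaction_matrix(edges, n):
    """Build A(G), D(G) and C(G) = A(G) + D(G) for an undirected graph
    on n nodes, following the definitions above."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0       # a_ij = 1 iff e_ij is an edge
    D = np.diag(-A.sum(axis=1))       # non-positive diagonal elements
    return A, D, A + D

# 4-node ring: off-diagonal entries of C(G) are non-negative,
# and the elements of each row sum to zero.
A, D, C = interaction_matrix([(0, 1), (1, 2), (2, 3), (3, 0)], 4)
print(C.sum(axis=1))  # each row of C(G) sums to zero
```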
Here, a group robot system comprising n + 1 robots is considered, in which the robot numbered 0 is the leader and the robots numbered 1 to n are followers. In the information interaction graph G = {V, E} of the n + 1 robots, V = {0, 1, 2, …, n}, and the information interaction matrix is the (n + 1) × (n + 1) matrix
C(G) = [ 0  0_{1×n} ; b  C̄ ]
where b = (b_1, …, b_n)^T with b_i = a_i0, the leader's row is zero because the leader receives no information from the followers, and C̄ is the information interaction matrix among the n followers.
The mathematical model of the follower robot is:
dξ_i(t)/dt = ζ_i(t), i = 1, 2, …, n (1)
where t is time, and ξ_i(t) ∈ R and ζ_i(t) ∈ R respectively represent the state information quantity and the control input quantity of the i-th follower robot.
The mathematical model of the leader robot is:
dξ_0(t)/dt = ζ_0(t) (2)
where ξ_0(t) ∈ R and ζ_0(t) ∈ R respectively represent the state information quantity and the control input quantity of the leader robot.
For any initial condition, if
lim_{t→∞} ‖ξ_i(t) − ξ_0(t)‖ = 0, i = 1, 2, …, n,
that is, if the state information quantities of all followers finally converge to the state information quantity of the leader, the group robot system is said to achieve leader-based consistency.
The leader-based consistency method is established as:
ζ_i(t) = −Σ_{j=1}^{n} w_ij(t) a_ij [ξ_i(t) − ξ_j(t)] − w_i0(t) b_i [ξ_i(t) − ξ_0(t)] (3)
where b_i = a_i0 ≥ 0, and w_ij(t) and w_i0(t) are weighting values at time t.
The basic static continuous-time consistency method (equation (3)) asymptotically achieves consistency if and only if its corresponding interaction graph G has a directed spanning tree.
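A minimal numerical sketch of the leader-based consistency method: three followers in a chain, only the first hearing the leader (so the interaction graph has a directed spanning tree), with unit weights w_ij(t) = w_i0(t) = 1. All values are illustrative assumptions.

```python
import numpy as np

A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])      # follower-follower adjacency a_ij (chain)
b = np.array([1.0, 0.0, 0.0])     # b_i = a_i0: only follower 1 hears the leader
xi0 = 5.0                         # leader state information quantity (constant)
xi = np.array([0.0, 2.0, -3.0])   # follower initial states
dt = 0.01
for _ in range(20000):
    # zeta_i = -sum_j a_ij (xi_i - xi_j) - b_i (xi_i - xi_0), unit weights
    zeta = -(A * (xi[:, None] - xi[None, :])).sum(axis=1) - b * (xi - xi0)
    xi = xi + dt * zeta
print(xi)  # all three follower states converge to the leader's 5.0
```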
A typical mobile robot motion model is:
dx_i/dt = v_i cos θ_i, dy_i/dt = v_i sin θ_i, dθ_i/dt = ω_i (4)
where x_i and y_i are respectively the abscissa and ordinate of robot i; θ_i, v_i and ω_i are respectively the direction angle, linear velocity and angular velocity of robot i, with v_i and ω_i the control inputs.
The alignment action is used for controlling the directional angles of the group robots to be consistent, so that the group robots move towards the same direction. The leader-based alignment behavior consistency approach can be expressed as:
ω_i(t) = −Σ_{j=1}^{n} w_ij(t) a_ij [θ_i(t) − θ_j(t)] − w_i0(t) b_i [θ_i(t) − θ_0(t)] (5)
the gathering and scattering behaviors are used for controlling the relative distances of the swarm robots to be consistent, so that the swarm robots keep a certain formation. The leader-based aggregation and dispersion behavior consistency approach can be expressed as:
v_i^x(t) = −η { Σ_{j=1}^{n} a_ij [x_i(t) − x_j(t) − d_x] + b_i [x_i(t) − x_0(t) − d_x] } (6)
v_i^y(t) = −η { Σ_{j=1}^{n} a_ij [y_i(t) − y_j(t) − d_y] + b_i [y_i(t) − y_0(t) − d_y] } (7)
where v_i^x(t) and v_i^y(t) are the speeds of the i-th follower robot along the x-axis and y-axis directions at time t; d_x and d_y are the set distances along the x-axis and y-axis directions in the formation control; and 0 < η < 1.
Let r_i = [x_i, y_i]^T, r_j = [x_j, y_j]^T and r_ij = r_i − r_j = [x_ij, y_ij]^T; the two component equations above can then be written as a single vector equation in r_i. [The compact vector form and its auxiliary definition appear only as equation images in the source.]
On the basis of the leader-based formation control method, this embodiment provides a layered formation navigation control method for group robots. The control system is divided into a target layer, a leader layer and a follower layer, where the target layer contains a target point, the leader layer contains one or more leaders, the follower layer contains one or more unconnected follower groups, and each leader has a follower group corresponding to it. The specific control strategy is as follows: the target information of the target layer is sent to the leaders of the leader layer, and each leader moves toward the target point by a position consistency method; meanwhile, each leader sends its own state information to its corresponding follower group in the follower layer, and leader and followers keep a certain formation while moving toward the target by the formation control method presented above, so that different groups do not influence one another.
A group robot system composed of N robots in the leader and follower layers is denoted g = {r_1, r_2, …, r_N}; it contains m disconnected subsets, each comprising one leader and n followers. The robots of subset k are denoted g_k = {r_k0, r_k1, …, r_kn}, 1 ≤ k ≤ m, with leader r_k0. A position consistency algorithm sends the target information to the leader r_k0; the navigation control method is:
v_k0^x(t) = −w(t) [x_k0(t) − α], v_k0^y(t) = −w(t) [y_k0(t) − β] (8)
where (α, β) are the coordinates of the target point; v_k0^x(t) and v_k0^y(t) are the speeds of the leader r_k0 along the x-axis and y-axis directions at time t; x_k0(t) and y_k0(t) are the leader's abscissa and ordinate at time t; and w(t) is a weighting value at time t.
When the information interaction graph of the group robots is a directed spanning tree, all robots finally converge to the state information of the root-node robot. Sending the target point's position information to the leader, the leader's state information finally converges to that of the target point, i.e. the leader moves to the target position. While the leader moves to the target point, the leader and its corresponding followers perform formation movement using the leader-based formation control method and reach the target point and its vicinity, completing formation navigation. Similarly, the leaders and followers of the other subsets adopt the same control method, completing formation navigation control of the whole group.
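The layered navigation strategy can be sketched numerically for one subset: the leader is driven to the target (α, β) by position consistency, while two followers hold fixed formation offsets from it. The gains, target coordinates and offsets below are illustrative assumptions.

```python
import numpy as np

target = np.array([8.0, 6.0])                     # target point (alpha, beta)
leader = np.array([0.0, 0.0])                     # leader r_k0
followers = np.array([[-3.0, 0.5], [2.0, -2.0]])  # follower initial positions
offsets = np.array([[-1.0, -1.0], [1.0, -1.0]])   # desired (d_x, d_y) per follower
w, eta, dt = 1.0, 0.5, 0.01                       # weighting, 0 < eta < 1, time step
for _ in range(5000):
    leader = leader + dt * w * (target - leader)                       # position consistency
    followers = followers + dt * eta * (leader + offsets - followers)  # hold the formation
print(leader, followers)  # leader at the target, followers holding the formation
```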
The above-described embodiments are only preferred embodiments of the present invention, and are not intended to limit the present invention in any way, and other variations and modifications may be made without departing from the spirit of the invention as set forth in the claims.

Claims (9)

1. An intelligent reconnaissance fire extinguishing system, characterized in that: it comprises field devices, communication facilities and terminal devices; the communication facilities comprise a portable base station and a cloud server; the terminal devices comprise a PC terminal, a command center and a mobile phone terminal; the field devices communicate with the cloud server through the portable base station, and the cloud server is in communication connection with the PC terminal, the command center and the mobile phone terminal; the field devices comprise an intelligent reconnaissance unmanned aerial vehicle, an intelligent reconnaissance robot, an intelligent fire-extinguishing robot, a field command platform and a portable reconnaissance and deployment control platform, all of which are in communication connection with the portable base station; and the intelligent reconnaissance robot and the intelligent fire-extinguishing robot are each configured with outdoor positioning and IMU indoor positioning devices.
2. The intelligent reconnaissance fire suppression system of claim 1, wherein: the field device also comprises at least one of a handheld intelligent fire-fighting thermal infrared imager or a wearable intelligent fire-fighting thermal infrared imager.
3. The intelligent reconnaissance fire suppression system of claim 2, wherein: the handheld intelligent fire-fighting thermal infrared imager or the wearable intelligent fire-fighting thermal infrared imager respectively comprises a shell and a circuit unit, and the circuit unit is connected with the shell through a silica gel piece.
4. The intelligent reconnaissance fire suppression system of claim 3, wherein: the casing is made of PET material; a convection groove for heat dissipation is provided between the casing and the circuit unit; the outer surface of the casing is provided with a sand-blasted layer, and the surface of the sand-blasted layer is coated with a layer of 300 ℃-resistant organosilicon high-temperature paint.
5. The intelligent reconnaissance fire suppression system of claim 2, wherein: the circuit unit of the handheld intelligent fire-fighting thermal infrared imager comprises an STC single chip microcomputer, a display circuit, a battery electric quantity monitoring circuit, a power circuit, a character superposition chip and a temperature measurement chip, wherein the STC single chip microcomputer, the display circuit, the character superposition chip and the temperature measurement chip are all powered by the power circuit, the battery electric quantity monitoring circuit is connected with a P3.5 port of the STC single chip microcomputer, and the display circuit, the character superposition chip and the temperature measurement chip are all electrically connected with the STC single chip microcomputer.
6. An intelligent reconnaissance fire extinguishing control method, suitable for the intelligent reconnaissance fire extinguishing system according to claim 2, comprising the following steps: first, the field devices collect field data; second, the cloud server processes the field data; third, the terminal devices receive and issue command instructions; fourth, the field devices execute the corresponding fire-fighting work according to the commands; characterized in that: in the fourth step, the robots among the field devices operate in a leader-based group robot formation and navigation control mode, and the robots' field positioning is based on an indoor positioning method combining wireless positioning with an IMU (inertial measurement unit).
7. The intelligent reconnaissance fire extinguishing system of claim 6, wherein: in step three, VR command is realized by an active shutter time-sharing display method comprising the following substeps:
a first display substep: the vertex coordinates of the models, the patch information and the camera position in scene space are computed and loaded into the video memory of the graphics card, and the graphics card is invoked to perform the linear matrix calculations; taking the camera in the original scene as the base point, the image is split into two frame streams offset 3 cm to the left and to the right for rendering, forming two groups of continuously interleaved pictures corresponding to the left and right eyes, so as to simulate the views seen by a person's two eyes;
a second display substep: the left and right lens shutters of the shutter-type 3D glasses are switched synchronously so that each eye sees its corresponding picture, thereby forming a VR image for the commander.
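As an illustration of the two substeps above (not code from the patent), the following sketch derives left- and right-eye camera positions from a single scene camera using the 3 cm offset named in the claim, and alternates eyes per frame as shutter glasses do. The function names and the even/odd eye convention are assumptions:

```python
import numpy as np

def stereo_cameras(cam_pos, cam_right, half_ipd=0.03):
    """Split one scene camera into left/right eye cameras, each offset
    3 cm (half_ipd) along the camera's normalized right vector."""
    cam_pos = np.asarray(cam_pos, dtype=float)
    right = np.asarray(cam_right, dtype=float)
    right = right / np.linalg.norm(right)
    left_eye = cam_pos - half_ipd * right
    right_eye = cam_pos + half_ipd * right
    return left_eye, right_eye

def frame_for_eye(frame_index):
    """Time-sharing: shutter glasses show alternate eyes on even/odd frames."""
    return "left" if frame_index % 2 == 0 else "right"
```

The display then renders the scene twice per stereo pair, once from each eye position, while the glasses' shutters are driven in sync with `frame_for_eye`.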
8. The intelligent reconnaissance fire extinguishing system of claim 6, wherein: the cloud server restores images using an inverse filter and a Wiener filter, comprising the following substeps:
restoration substep one: extract the original blurred image;
restoration substep two: determine and select the point spread function (PSF); if the blur is defocus blur, extract the blur radius with an edge-extraction operator;
restoration substep three: perform a two-dimensional Fourier transform on the PSF to obtain the spectrum of the Wiener filter;
restoration substep four: pad the original image to the size required by the Fourier transform;
restoration substep five: perform a two-dimensional Fourier transform on the padded image to obtain the spectrum of the image;
restoration substep six: multiply the image spectrum by the Wiener filter spectrum to obtain the spectrum of the restored image;
restoration substep seven: perform an inverse Fourier transform on the spectrum of the restored image to obtain the padded restored image;
restoration substep eight: apply low-pass filtering to the padded restored image to remove noise, and equalize it;
restoration substep nine: extract the original-size image from the upper-left corner of the padded restored image, finally obtaining the restored version of the original blurred image.
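Substeps three through nine amount to standard frequency-domain Wiener deconvolution. A minimal NumPy sketch follows, purely as an illustration of the pipeline, not the patent's implementation: the noise-to-signal constant `k` is an assumption, and the explicit low-pass/equalization of substep eight is folded into the `k` regularization here:

```python
import numpy as np

def wiener_restore(blurred, psf, k=0.01):
    """Frequency-domain Wiener restoration following the claim's substeps:
    pad, transform image and PSF, multiply by the Wiener filter spectrum,
    inverse-transform, then crop the original size from the top-left."""
    rows = blurred.shape[0] + psf.shape[0] - 1
    cols = blurred.shape[1] + psf.shape[1] - 1
    H = np.fft.fft2(psf, s=(rows, cols))        # substep 3: PSF spectrum
    G = np.fft.fft2(blurred, s=(rows, cols))    # substeps 4-5: padded image spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + k)       # Wiener filter spectrum (k ~ noise/signal)
    restored = np.real(np.fft.ifft2(G * W))     # substeps 6-7: back to spatial domain
    return restored[:blurred.shape[0], :blurred.shape[1]]  # substep 9: top-left crop
```

With `k = 0` this degenerates to the pure inverse filter the claim also names, which is only safe when the PSF spectrum has no near-zero values.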
9. The intelligent reconnaissance fire extinguishing system of claim 6, wherein: in step four, the robots in the field equipment operate under the leader-based group-robot formation and navigation control mode, comprising the following substeps: robot control substep one: establish a leader-based consensus formation for the several intelligent reconnaissance robots and intelligent fire extinguishing robots;
robot control substep two: realize leader-based formation control, wherein consensus on alignment behavior with the leader controls the heading angles of the group robots so that they move in the same direction, while aggregation and dispersion behaviors control the relative distances between the group robots so that they maintain a given formation; robot control substep three: divide formation control into a target layer, a leading layer and a following layer; the target layer sends target information to the leaders in the leading layer, and each leader moves toward the target point by a position-consensus method; meanwhile, each leader sends its own state information to its corresponding follower group in the following layer, and the followers maintain a fixed formation shape relative to the leader while moving toward the target.
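The layered scheme above can be sketched as a discrete consensus update in which the leader converges on the target point and each follower converges on the leader's position plus a fixed formation offset. This is a minimal illustration under stated assumptions (the gain, the single-leader layout and all names are not from the patent):

```python
import numpy as np

def formation_step(leader, followers, target, offsets, gain=0.2):
    """One consensus update across the three layers described in the claim:
    target layer -> leader (position consensus toward the target point),
    leader -> followers (each tracks leader + its formation offset)."""
    leader = leader + gain * (target - leader)
    followers = [f + gain * ((leader + d) - f)
                 for f, d in zip(followers, offsets)]
    return leader, followers
```

Iterating this update drives the leader to the target and each follower to its assigned slot in the formation, which is the qualitative behavior the claim describes.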
CN202011146352.XA 2020-10-23 2020-10-23 Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof Pending CN112214025A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011146352.XA CN112214025A (en) 2020-10-23 2020-10-23 Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011146352.XA CN112214025A (en) 2020-10-23 2020-10-23 Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof

Publications (1)

Publication Number Publication Date
CN112214025A true CN112214025A (en) 2021-01-12

Family

ID=74054978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011146352.XA Pending CN112214025A (en) 2020-10-23 2020-10-23 Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof

Country Status (1)

Country Link
CN (1) CN112214025A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101867797A (en) * 2010-07-09 2010-10-20 公安部上海消防研究所 Helmet type infrared detection and image transmission processing system
CN104113600A (en) * 2014-07-29 2014-10-22 江西华宇软件股份有限公司 Fire scene emergency command system platform
CN206993295U (en) * 2017-06-05 2018-02-09 浙江雷邦光电技术有限公司 A kind of fire command system based on audio, video data transmission
CN110180112A (en) * 2019-06-05 2019-08-30 山东国兴智能科技股份有限公司 A kind of unmanned plane and fire-fighting robot coordinated investigation extinguishing operation method
CN110180114A (en) * 2019-06-05 2019-08-30 山东国兴智能科技股份有限公司 Fire-fighting robot co-located, scouting, fire source identification and aiming extinguishing method
CN110673603A (en) * 2019-10-31 2020-01-10 郑州轻工业学院 Fire scene autonomous navigation reconnaissance robot


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
王枚: "Research on restoration methods for motion-blurred and defocus-blurred images and their applications", Laser & Infrared (《激光与红外》) *
罗永强: "Handbook of Fire-Fighting Equipment and Applications" (《消防装备与应用手册》), 30 April 2013 *
薛鸿民: "Fundamentals of Computing for University Liberal Arts Students" (《大学文科计算机基础》), 28 February 2019 *
赵坚勇: "Flat-Panel Display and 3D Display Technology" (《平板显示与3D显示技术》), 31 January 2012 *
陈浩: "Leader-based swarm robot formation and navigation control", Journal of Wuhan University of Science and Technology (《武汉科技大学学报》) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI748849B (en) * 2021-01-20 2021-12-01 實踐大學 Stunt control device for unmanned aerial vehicle formation flying smoke pulling
CN112774073A (en) * 2021-02-05 2021-05-11 燕山大学 Unmanned aerial vehicle guided multi-machine cooperation fire extinguishing method and fire extinguishing system thereof
CN115129085A (en) * 2022-07-25 2022-09-30 中国安全生产科学研究院 Method for cooperatively executing tasks by multiple group robots
CN115129085B (en) * 2022-07-25 2024-04-30 中国安全生产科学研究院 Method for cooperatively executing tasks by multiple groups of robots
CN117138291A (en) * 2023-10-27 2023-12-01 江苏庆亚电子科技有限公司 Fire-fighting robot fire-extinguishing method and fire-extinguishing system
CN117138291B (en) * 2023-10-27 2024-02-06 江苏庆亚电子科技有限公司 Fire-fighting robot fire-extinguishing method and fire-extinguishing system

Similar Documents

Publication Publication Date Title
CN112214025A (en) Intelligent reconnaissance fire extinguishing system and fire extinguishing control method thereof
CN108646589B (en) Combat simulation training system and method for attacking unmanned aerial vehicle formation
Skorput et al. The use of Unmanned Aerial Vehicles for forest fire monitoring
CN105913604A (en) Fire occurrence determining method and device based on unmanned aerial vehicle
CN108259625A (en) A kind of escape and rescue method based on Buildings Modeling and personnel&#39;s running fix
CN109646853A (en) A kind of autonomous fire fighting robot device and monitoring system
CN107292989A (en) High voltage power transmission cruising inspection system based on 3DGIS technologies
CN110598655B (en) Artificial intelligent cloud computing multispectral smoke high-temperature spark fire monitoring method
CN110365946A (en) Generation method, device, storage medium and the electronic device of the condition of a disaster counte-rplan
CN106054928A (en) All-region fire generation determination method based on unmanned plane network
CN116308944B (en) Emergency rescue-oriented digital battlefield actual combat control platform and architecture
CN105611253A (en) Situation awareness system based on intelligent video analysis technology
CN113688921A (en) Fire operation identification method based on graph convolution network and target detection
CN106647813A (en) Intelligent vehicle-mounted dual-light inspection system expert diagnosing and abnormity processing method
CN111840855A (en) All-round intelligent emergency rescue linkage command system
Beachly et al. UAS-Rx interface for mission planning, fire tracking, fire ignition, and real-time updating
CN109785574B (en) Fire detection method based on deep learning
CN105761275A (en) Fire-fighting early warning aircraft with binocular visual structure
CN116091723B (en) Fire emergency rescue live-action three-dimensional modeling method and system based on unmanned aerial vehicle
CN107291092B (en) WiFi-supported air-ground cooperative unmanned aerial vehicle system
CN210428556U (en) Multipurpose electronic security inspection system based on unmanned aerial vehicle technology
CN114863352B (en) Personnel group behavior monitoring method based on video analysis
CN110675522A (en) Multipurpose electronic security inspection system based on unmanned aerial vehicle technology
CN115146933A (en) Processing method, system, equipment and storage medium for dangerous source explosion accident
KR100390600B1 (en) Apparatus for monitoring woodfire and position pursuit and a method for operating the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210112
