WO2021262020A1 - Autonomous portable firefighting system and respective method of operation - Google Patents

Autonomous portable firefighting system and respective method of operation

Info

Publication number
WO2021262020A1
Authority
WO
WIPO (PCT)
Prior art keywords
nozzle
unit
module
pixel
control unit
Prior art date
Application number
PCT/PT2020/050026
Other languages
French (fr)
Inventor
Aníbal TRAÇA CARVALHO DE ALMEIDA
António Paulo MENDES BREDA DIAS COIMBRA
Carlos Xavier PAIS VIEGAS
Luis Miguel DA SILVA FERREIRA
Miguel COSTA SILVA ANTUNES
Original Assignee
Instituto De Sistemas E Robótica Da Universidade De Coimbra
Universidade De Coimbra
Adai – Associação Para O Desenvolvimento Da Aerodinâmica Industrial
Priority date
Filing date
Publication date
Application filed by Instituto De Sistemas E Robótica Da Universidade De Coimbra, Universidade De Coimbra, Adai – Associação Para O Desenvolvimento Da Aerodinâmica Industrial filed Critical Instituto De Sistemas E Robótica Da Universidade De Coimbra
Priority to PCT/PT2020/050026 priority Critical patent/WO2021262020A1/en
Publication of WO2021262020A1 publication Critical patent/WO2021262020A1/en

Classifications

    • A HUMAN NECESSITIES
    • A62 LIFE-SAVING; FIRE-FIGHTING
    • A62C FIRE-FIGHTING
    • A62C3/00 Fire prevention, containment or extinguishing specially adapted for particular objects or places
    • A62C3/02 Fire prevention, containment or extinguishing specially adapted for particular objects or places for area conflagrations, e.g. forest fires, subterranean fires
    • A62C3/0271 Detection of area conflagration fires
    • A62C3/0292 Fire prevention, containment or extinguishing for area conflagrations by spraying extinguishants directly into the fire
    • A62C31/00 Delivery of fire-extinguishing material
    • A62C31/28 Accessories for delivery devices, e.g. supports
    • A62C37/00 Control of fire-fighting equipment
    • A62C37/36 Control of fire-fighting equipment, an actuating signal being generated by a sensor separate from an outlet device
    • A62C37/38 Control of fire-fighting equipment, by both sensor and actuator, e.g. valve, being in the danger zone
    • A62C37/40 Control of fire-fighting equipment, with electric connection between sensor and actuator

Definitions

  • Figure 8 - representation of an implementation scenario of the invention, wherein the autonomous portable firefighting system developed is installed in a trailer for an independent mobile system.
  • the autonomous portable firefighting system of the present invention is a mobile and easily transportable firefighting system, operable in a fully autonomous fashion.
  • the autonomous system is comprised by a nozzle unit, a sensory unit (6) and a control unit (3), which are connected together for the purpose of data exchange and control. Additionally, the system may also be comprised by a communication unit configured to establish a wireless bidirectional data communication protocol with a remote server or a remote monitoring device.
  • the nozzle unit comprises a nozzle (1) from where the water is projected. It may also comprise an operation module and an orientation module (2).
  • the operation module can be inserted into the nozzle itself (1), and is programmed to control a water-flow solenoid valve for controlling the water flow, the jet velocity and the shape of the water jet (12). Said shape can be a hollow jet or a hollow cone.
  • the orientation module (2), in its turn, is programmed to control the variation of vertical and horizontal angles of the nozzle (1) and, consequently, of the water projection coming out of the nozzle (1). It can be a two-axis guidance mechanism that consists, for example, of servos or electric actuators to control each of the axes of rotation.
  • the nozzle unit may also be comprised by a video camera (7), adapted to collect video data from the scene.
  • the main function of the sensory unit (6) is to collect data corresponding to the nozzle's surrounding environment, which is used by a 3D environmental characterization module to generate a virtual three-dimensional space record.
  • the sensory unit (6) comprises sensory means, specifically, a plurality of sensors of several types. Such sensors may be adapted for image capture, temperature and wind speed and direction measurements, infrared radiation detection, heat flow measurement and for multi or hyperspectral imaging.
  • the 3D environmental characterization module is a processing module adapted to execute intelligent algorithms for generating a virtual three-dimensional space record that characterizes the surroundings of the nozzle (1) based on the sensory information collected by the sensory means of the sensory unit (6).
  • the control unit (3) comprises a detection module that is a processing module configured to process the virtual three-dimensional space record generated in the sensory unit (6), for detecting and locating a predetermined target.
  • a target can be a fire front (13) or a heat source.
  • the location information acquired by the detection module relates to a coordinate that maps the detected target within the three-dimensional space record.
  • a control algorithm executed in the processing means of the control unit (3) is adapted to control the nozzle unit, in particular the operation module and the orientation module (2), completely autonomously, to start the projection of water (12) in the direction of the target (13), by automatically adjusting and correcting the nozzle (1) orientation according to environmental conditions.
  • the control algorithm executed in the control unit (3) for operating the system autonomously may use as input an infrared image collected by an infrared sensor of the sensory unit (6).
  • This sensor is coupled to the nozzle (1), moving solidarily with it.
  • two infrared sensor arrays can be used, being located on opposite sides of the nozzle's body, such as at the top and bottom or at the left and right sides of the nozzle.
  • four infrared sensor arrays can be used, being located at the top, bottom, left and right sides of the nozzle.
  • the infrared image will be processed by the 3D environmental characterization module in order to generate the three-dimensional space record defining the surroundings of the nozzle (1).
  • each pixel value corresponds to a temperature of the scene.
  • the algorithm output is the nozzle (1) orientation and water jet (12) actuation.
  • the water actuation is initiated when the fire is within the actuation range of the nozzle.
  • the actuation-range parameter can be programmed by the control unit (3) as a function of at least one, or a combination, of the following information: the water pressure, water flow and water shape coming out of the nozzle, the orientation of the nozzle and its height from the ground, the configuration of the soil surface and surrounding environment (particularly considering the existence of obstacles such as vegetation), and the wind speed and direction.
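As an illustration of how an actuation range could be derived from some of these parameters, the sketch below uses an ideal ballistic model (drag, jet break-up, wind and terrain are ignored); the function and its parameters are illustrative assumptions, not part of the patent.

```python
import math

def actuation_range(jet_velocity, elevation_deg, nozzle_height, g=9.81):
    """Ideal ballistic range of the water jet, in metres.

    jet_velocity   exit speed of the jet (m/s)
    elevation_deg  vertical nozzle angle above the horizontal (degrees)
    nozzle_height  nozzle height above ground level (m)
    """
    a = math.radians(elevation_deg)
    vx = jet_velocity * math.cos(a)   # horizontal velocity component
    vy = jet_velocity * math.sin(a)   # vertical velocity component
    # time of flight until the jet falls back to ground level
    t = (vy + math.sqrt(vy ** 2 + 2 * g * nozzle_height)) / g
    return vx * t
```

With no height offset and a 45-degree elevation this reduces to the textbook range v²/g, which gives a quick sanity check on the model.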
  • the water actuation is started when the number of pixels with an intensity higher than a predetermined value of "FIRE_TEMPERATURE_THRESHOLD" in the three-dimensional space record is higher than the predetermined value "FIRE_PIXEL_NUMBER_THRESHOLD". These pixels are called “HOTEST_PIXELS”.
  • the pixel corresponding to the point where the water jet (12) contacts the scene is identified as the center of an area of pixels with higher temperature drop to a temperature below a predetermined value of "WATER_TEMPERATURE_THRESHOLD”. This pixel is identified by the name of "WATER_HIT_PIXEL".
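The threshold logic of the two preceding paragraphs can be sketched in a few lines; the threshold values, the pure-Python record format (a 2-D list of per-pixel temperatures) and the function names are illustrative assumptions, not taken from the patent.

```python
# Example threshold values (assumptions, not from the patent).
FIRE_TEMPERATURE_THRESHOLD = 150.0   # degrees C
FIRE_PIXEL_NUMBER_THRESHOLD = 5
WATER_TEMPERATURE_THRESHOLD = 60.0   # degrees C

def hottest_pixels(record):
    """Pixels above the fire threshold (the "HOTEST_PIXELS" of the text)."""
    return [(r, c)
            for r, row in enumerate(record)
            for c, temp in enumerate(row)
            if temp > FIRE_TEMPERATURE_THRESHOLD]

def should_start_water(record):
    """Water actuation starts when enough pixels exceed the fire threshold."""
    return len(hottest_pixels(record)) > FIRE_PIXEL_NUMBER_THRESHOLD

def water_hit_pixel(previous, current):
    """Pixel with the largest temperature drop to below the water
    threshold between two consecutive records (the "WATER_HIT_PIXEL")."""
    best, best_drop = None, 0.0
    for r, row in enumerate(current):
        for c, temp in enumerate(row):
            drop = previous[r][c] - temp
            if temp < WATER_TEMPERATURE_THRESHOLD and drop > best_drop:
                best, best_drop = (r, c), drop
    return best
```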
  • the system operating in this autonomous fashion may have several tactics of fire extinction.
  • the tactic can be automatically determined by the algorithm taking into account the number and the scattering of the "HOTEST_PIXELS", so that a scene characterization is made. In all of these tactics the nozzle is moved slowly.
  • One of the consequences of this is that, between two consecutive records, the "WATER_HIT_PIXEL" moves slowly, making it easy to track. This slow movement is also necessary to let the water cool and extinguish the flames at the water hit point.
  • One of the tactics of actuation for fire extinction is to move the water jet to the most intense heat source, i.e., the pixel with the highest temperature value among the set of "HOTEST_PIXELS". This is done by slowly actuating the orientation module (2) in such a way as to make the "WATER_HIT_PIXEL" coincide with the record's hot spot, that is, with the pixel of the record with the highest temperature value. This tactic is useful to prevent spot fires from developing into large fires.
  • Another of the tactics of actuation for fire extinction is to move the water jet (12) through the base of the fire front (13), which is identified by the contour line of the temperature gradient of the record. This is done by slowly actuating the orientation module (2) in such a way as to make the "WATER_HIT_PIXEL" pass through the pixels of the base of the flames. This tactic should be employed to suppress an approaching fire front.
  • Yet another of the tactics of actuation for fire extinction is to move the water jet (12), i.e., the "WATER_HIT_PIXEL", through the area of the pixels in the record with intensity higher than the set value of "FIRE_TEMPERATURE_THRESHOLD".
  • This tactic can be used for an approaching fire front and also during fire mop up operations.
  • the tactic is to move the water jet (12) vertically in the full tilt range repeatedly until the "WATER_HIT_PIXEL" is detected again. This situation may happen when the base of the flames is obstructed by an object or the terrain. This tactic should be performed with the nozzle (1) facing the most intense heat source.
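The aiming behaviour shared by these tactics (slowly driving the "WATER_HIT_PIXEL" toward a goal pixel, such as the hottest pixel or the next pixel along the fire-front base) can be sketched as a single control step; the pixel-to-axis mapping and the angular increment are assumptions for illustration.

```python
def aim_step(water_hit, goal, step_deg=0.5):
    """One slow correction step for the orientation module.

    water_hit and goal are (row, col) pixels in the record; rows are
    assumed to map to tilt and columns to pan (an assumption about the
    sensor mounting), and step_deg is a hypothetical small increment
    so that the "WATER_HIT_PIXEL" drifts slowly and stays trackable.
    Returns the (pan, tilt) angle deltas to apply.
    """
    d_tilt = step_deg if goal[0] > water_hit[0] else (
        -step_deg if goal[0] < water_hit[0] else 0.0)
    d_pan = step_deg if goal[1] > water_hit[1] else (
        -step_deg if goal[1] < water_hit[1] else 0.0)
    return d_pan, d_tilt
```

Calling this once per record reproduces the slow-tracking behaviour: the nozzle never jumps, it converges on the goal pixel one small increment per frame.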
  • the control unit (3) may consist of a printed circuit board with a microcontroller, for example a Raspberry Pi, which is responsible for controlling the various modules and units of the system, such as the orientation module (2) and the nozzle operating module of the nozzle unit, the sensory unit (6) and, additionally, the communication unit.
  • By operating in an autonomous mode, with the system controlled by the algorithm running in the control unit (3), optimal firefighting is achieved, including an optimized use of resources such as energy and water, by triggering the water flow only when required and by optimizing the direction and flow of the water jet according to the characteristics of the environment and the fire front.
  • the system also comprises a communication unit.
  • a communication unit uses medium/long-range wireless data transmission technology, such as radio or internet, to send information about system status and video feed.
  • the communication unit allows the system to receive command instructions, enabling the remote configuration of basic control features of the system, for example, in case of failure in autonomous operation.
  • Said control can be provided via a dedicated remote controller or a web application for a smartphone or tablet, which provides a user access to the control unit (3) for command purposes, in particular for actuating the nozzle unit.
  • the system now developed can also be configured to work according to operation schedules, acting in a preventive way in fighting fires: the system can be programmed to periodically wet the ground of a dwelling (15), distributing water over a wider area, increasing fuel moisture and decreasing its probability of ignition.
  • This can be a programmed routine to use on days with high temperatures, strong winds and low humidity, which are favorable conditions for a fire occurrence. It can also be used during the occurrence of a fire, and before the fire front (13) reaches a dwelling (15) perimeter for example. It can also be used in the aftermath of a fire, to prevent reignitions in locations where the fire was already put out.
  • the system can also be strategically distributed across several spots in the forest to act as fire ignition detection and suppression mechanism, to support the first responders and prevent the large spread of the fire and its rapid progression.
  • the system is also comprised by a power unit having integrated batteries (10), which can be rechargeable or disposable, adapted to supply power to the various mechatronics, control and communication units/modules. All electronic elements, including the batteries (10), are integrated into the system structure in order to be protected from the exterior high temperatures and high radiation levels, as well as from water.
  • the water supply to the system is done through a hose (5) which can be connected to the public water supply network, an emergency supply network, a tank or reservoir.
  • the tank may be coupled to a vehicle, as in the case of a fire brigade, placed in a trailer, or in a stand-alone robotic vehicle.
  • This hose (5) is in turn connected to the nozzle (1), for example, by means of an adapter (4).
  • This adapter (4) allows the coupling of hoses (5) of various sizes, from traditional irrigation hoses to the different hose connections used by firefighters, and also allows operation at various water-supply pressures: from the normal pressure of the public water supply system, around 3 bar, up to high pressures of 10 bar and more.
  • the entire system is designed to be easily transportable by one person, with reduced mass and size.
  • the various units/modules can be assembled and disassembled for easy storage and transportation of all equipment in a suitcase (17) or backpack (16).
  • a complete system with the support tripod is designed so that it can be carried in a backpack (16) and easily and quickly mounted on-site without the use of any tools, and only by one person, whether it is a fireman or a civilian.
  • the system may include a tripod structure (9) for ground mounting, or be placed on a mounting-base (8) on the ground or infrastructure to be protected.
  • the tripod structure (9) may be automated, being controlled by the control unit (3), which is further programmed to control the vertical height of the system's nozzle by varying the length of the tripod's legs. In this case, the system is fixed to the mounting tripod (9) which ensures stability.
  • the nozzle (1) may be oriented manually by the user, controlling the vertical and horizontal angles of the water jet (12), or it may be placed in a desired fixed position by varying the extension of each of the tripod legs.
  • the mounting-base (8) is adapted to provide the installation of the autonomous portable system in a platform or in a vehicle (14) for specific applications, such as for the protection of dwellings (15) or vehicles (14).
  • In a preferred embodiment of the autonomous portable firefighting system developed, it is comprised by:
  • a nozzle unit comprising a nozzle (1), an operation module and an orientation module (2); wherein the operation module is programmed to control the water flow, the jet velocity and the shape of the water jet (12) coming out of the nozzle (1); and the orientation module (2) is programmed to control the variation of vertical and horizontal angles of the nozzle (1);
  • a sensory unit (6) comprising sensory means configured to collect data corresponding to the nozzle's surrounding environment, being comprised by a 3D environmental characterization module configured to generate a virtual three-dimensional space record from the data collected by sensory means;
  • a control unit (3) comprising: a detection module, configured to detect and locate a target (13) in the record generated by the 3D environmental characterization module; and processing means programmed to execute a control algorithm configured to command the operation of the nozzle unit, based on data provided by the 3D environmental characterization module and by the detection module.
  • the sensory unit (6) comprises a plurality of sensors adapted to image capture, temperature and wind speed and direction measurement, infrared radiation detection, heat flow measurement and to multi or hyperspectral imaging.
  • the sensory unit (6) comprises at least one infrared sensor coupled to the nozzle (1), moving solidarily with it.
  • the infrared sensor moves solidarily with the nozzle only in the pan movement, which causes the sensor to operate in a levelled and stabilized reference in relation to the horizontal plane, allowing it to monitor the surrounding scenario and the existing fire fronts more effectively.
  • infrared sensor arrays can be used, being located on opposite sides of the nozzle's body, such as at the top and bottom or at the left and right sides of the nozzle.
  • four infrared sensor arrays can be used, being located at the top, bottom, left and right sides of the nozzle.
  • Said infrared sensors are configured to collect infrared images of a scene; such images being processed by the 3D environmental characterization module to generate the virtual three-dimensional space record; said record is the input of the control algorithm to be executed in the control unit (3), wherein said control algorithm is configured to program:
  • the orientation module (2), by setting a vertical angle parameter and a horizontal angle parameter;
  • the operation module, by activating the water flow, setting the water pressure value and the shape of the water jet (12), based on the target's (13) location detected by the detection module.
  • the virtual three-dimensional space record generated by the 3D environmental characterization module of the sensory unit (6) comprises data related to wind speed and direction; such data being used by the control algorithm to adapt the parameters of the orientation module (2) and of the operation module, to compensate for the wind speed and direction.
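One plausible way the control algorithm could use the wind data to adapt the orientation parameters is a first-order crosswind correction, sketched below; the correction model, gain and sign conventions are assumptions for illustration only, not the patent's method.

```python
import math

def wind_compensated_pan(pan_deg, wind_speed, wind_dir_deg,
                         jet_speed, gain=1.0):
    """Hypothetical first-order wind correction for the pan angle.

    wind_dir_deg is the direction the wind blows toward, in the same
    angular frame as pan_deg; gain lumps together flight time and jet
    geometry and would be tuned empirically. The pan angle is biased
    against the crosswind component of the measured wind.
    """
    # crosswind component relative to the current aiming direction
    rel = math.radians(wind_dir_deg - pan_deg)
    crosswind = wind_speed * math.sin(rel)
    # steer into the wind proportionally to crosswind / jet speed
    correction = -math.degrees(math.atan2(gain * crosswind, jet_speed))
    return pan_deg + correction
```

With no wind the pan angle is returned unchanged; a crosswind blowing the jet to one side biases the aim toward the opposite side.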
  • it further comprises an adapter (4) for coupling of hoses (5) of different sizes to the nozzle (1).
  • a tripod structure (9) for ground-mounting of the system.
  • Said tripod structure (9) is an automated structure, being operable by the control unit (3) which is further programmed to control the vertical height of the nozzle (1) in relation to the ground level, by varying the vertical lengths of the legs of the tripod structure (9).
  • it further comprises a mounting-base (8) suitable for installing the system in a platform or in a vehicle (14).
  • it further comprises a communication unit configured to establish a wireless bidirectional data communication protocol with a remote server or a remote processing device.
  • the present invention also relates to a method of operation of the autonomous portable firefighting system described above. In a preferred embodiment of the method, it comprises the following steps:
  • the control unit (3) is programmed to execute the following steps:
  — detect and locate a target (13) within the virtual three-dimensional space record generated by the 3D characterization module of the sensory unit (6) from the environmental data collected, by means of the detection module;
  • the target (13) is a fire front or a heat source.
  • the sensory unit (6) is configured to periodically collect infrared images from a scene, wherein each image is used by the 3D characterization module to generate a three-dimensional space record.
  • the control algorithm executed in the control unit (3) of the autonomous system comprises the following steps:
  • the predefined values - FIRE_TEMPERATURE_THRESHOLD, FIRE_PIXEL_NUMBER_THRESHOLD and WATER_TEMPERATURE_THRESHOLD - are parameters that can be initially configured by the user, or can be remotely programmed when the system is connected through the communication unit to a network.
  • the three-dimensional space record further comprises data related to the wind speed and direction; and wherein:
  • the actuation on the water flow, the configuration of the jet pressure value and the shape of the water jet (12) coming out of the nozzle (1), to command the operation module, are adapted to compensate the wind speed and direction.
  • the direction of the nozzle (1) is defined in relation to the HOTEST_PIXEL with the highest intensity value, in such a way as to make the WATER_HIT_PIXEL coincide with said HOTEST_PIXEL in the record.
  • the autonomous operation of the control unit starts depending on at least one sensorial parameter collected by the sensory unit (6).
  • the sensorial parameter is a temperature and/or a wind speed and/or a humidity level;
  • the autonomous operation of the control unit is initiated if the referred parameters are higher than predetermined values, considering the actuation range of the nozzle.
  • the autonomous operation of the control unit (3) is initiated by a user remotely connected to said control unit (3), when the system is connected through the communication unit to a network.
  • the autonomous operation of the control unit (3) is scheduled to run at a future point in time selected by the user.
  • the control unit (3) has a schedule entity configured to manage the autonomous operation of the control unit (3).
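The start conditions described in the last few paragraphs (sensed conditions exceeding limits, a user-selected schedule) can be sketched as follows; the limit values, and the rule that low rather than high humidity favours starting, are assumptions for illustration.

```python
from datetime import datetime

# Example trigger limits (assumptions, not values from the patent).
TEMP_LIMIT_C = 35.0
WIND_LIMIT_MS = 8.0
HUMIDITY_LIMIT_PCT = 30.0   # start when humidity falls BELOW this

def should_start(temperature, wind_speed, humidity,
                 now=None, scheduled_start=None):
    """Decide whether to start autonomous operation.

    Starts when a user-selected schedule time has been reached, or when
    the sensed conditions indicate fire-favourable weather: hot, windy
    and dry (high temperature, high wind speed, low humidity).
    """
    if scheduled_start is not None and now is not None and now >= scheduled_start:
        return True
    return (temperature > TEMP_LIMIT_C
            and wind_speed > WIND_LIMIT_MS
            and humidity < HUMIDITY_LIMIT_PCT)
```

A schedule entity in the control unit could call this once per sensing cycle with the latest readings from the sensory unit (6).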

Abstract

AUTONOMOUS PORTABLE FIREFIGHTING SYSTEM AND RESPECTIVE METHOD OF OPERATION The present invention relates to an autonomous portable firefighting system and respective method of operation, allowing a fully autonomous operation. The system comprises a control unit (3) for controlling an operation module and an orientation module (2) responsible for directing the water jet (12) coming out of the nozzle (1) to a fire front. For that purpose, the control unit executes a control algorithm that processes a virtual three-dimensional space record generated by a sensory unit (6), to detect and locate a specific target (13) such as a fire front or a heat source.

Description

DESCRIPTION
AUTONOMOUS PORTABLE FIREFIGHTING SYSTEM AND RESPECTIVE METHOD OF
OPERATION
FIELD OF THE INVENTION
The present invention is enclosed in the field of firefighting systems. More specifically, the present invention relates to an autonomous portable firefighting system.
PRIOR ART
The most serious consequences of forest fires are the loss of human lives and the destruction of assets and infrastructure. Current firefighting mechanisms almost always presuppose direct control and handling by an operator, which becomes exposed to the dangers of the fire and very unfavourable conditions such as high temperatures and radiation levels. It is, therefore, necessary to develop new mechanisms to remotely or fully automatically combat fire anywhere, eliminating the need to endanger the lives of firefighters and the population.
Solutions exist in the art concerning fixed and self-contained intelligent indoor firefighting systems such as Ruishi's ZTZ-123 RS30 [1] or the Automatic fire targeting and extinguishing system and method [2]. The referred systems do not allow transport and mounting elsewhere, and also do not have video cameras, so in remote operation mode, the operator must have a direct view of the equipment.
The system Blazequel's PYROsmart [3] is a fixed and self-controlled intelligent firefighting system with a video camera, which allows indoor fire detection and automatic or remote jet control and steering towards the flame, even without a direct view of the equipment. However, the intelligent control algorithm implemented does not consider the use of sensors for environment characterization, and is therefore ineffective in providing a correct and automatic adaptation of the system to indoor and outdoor environments.
The Akron Brass FireFox [4] is a remotely controlled fire-fighting equipment that allows installation in vehicles, thus having a higher degree of mobility and being capable of operating outdoors. However, it lacks fire detection and tracking capabilities, and also does not have an autonomous functioning.
Finally, the Apollo PE Monitor [5] and the Partner 2 [6] are remote-controlled portable firefighting systems, allowing their transportation and operation anywhere. However, they do not have systems for fire detection and tracking or autonomous operation. They also do not possess cameras for long-range remote combat with no direct view of the equipment.
All the existing solutions have the issue of not considering the environmental characterization of the area of operation for controlling the system and, therefore, of not being able to provide its correct adaptation to different indoor and outdoor conditions, which is of extreme importance for portable systems.
The present solution intends to innovatively overcome such issues.
SUMMARY OF THE INVENTION
It is therefore an object of the present invention to provide an autonomous portable firefighting system. Such system comprises a nozzle unit constituted by a nozzle, an operation module and an orientation module. More particularly, the operation module is programmed to control the water flow, the jet velocity and the shape of the water jet coming out of the nozzle, whereas the orientation module is programmed to control the variation of vertical and horizontal angles of the nozzle. In order that the system can operate autonomously, while ensuring the efficiency of its operation, it is comprised by a sensory unit and a control unit. The sensory unit is responsible for collecting sensory data, which is used by a 3D environmental characterization module to generate a virtual three-dimensional space record of the nozzle's surrounding environment. The control unit is configured to control the operation of the system based on the record generated by the 3D environmental characterization module, which is processed by a detection module to detect and locate a specific target in said record. The target can be a fire front or a heat source. The control unit is further comprised by processing means programmed to execute a control algorithm which is configured to command the operation of the nozzle unit, based on data provided by the 3D environmental characterization module and by the detection module.
In an advantageous configuration of the autonomous portable firefighting system of the present invention, it is comprised by a communication unit for establishing a wireless bidirectional data communication protocol with a remote server or a remote monitoring device, such as a smart phone, a tablet or a computer.
It is also an object of the present invention, a method for operating the system developed. Such method being comprised by the steps of:
— collecting data corresponding to the nozzle's surrounding environment, by means of the sensory unit (6); and
— programming the control unit (3) to operate the nozzle unit.
The advantageous configurations of the portable firefighting system developed represent major improvements over the state of the art, particularly relating to:
— Possibility of autonomous firefighting at long range using data collected from a video camera installed on the nozzle unit and from the sensory unit;
— Possibility to transport and quickly assemble the equipment anywhere, provided that water is supplied through a hose;
— The use of control algorithms to control the operation of the system allows for optimized firefighting and optimized use of firefighting resources, including energy and water, by activating the water flow only when required, and by optimizing the direction and flow of the water jet.
DESCRIPTION OF FIGURES
Figure 1 - representation of the autonomous portable firefighting system of the invention, in its generic concept, wherein the numerical references represent:
1 - nozzle;
2 - orientation module;
3 - control unit;
4 - hose adapter;
5 - water hose;
6 - sensory unit;
7 - video camera;
8 - mounting base.
Figure 2 - representation of an embodiment of the autonomous portable firefighting system developed, mounted on a tripod structure. The numerical references represent:
1 - nozzle;
2 - orientation module;
3 - control unit;
4 - hose adapter;
5 - water hose;
6 - sensory unit;
7 - video camera;
9 - tripod;
10 - battery.
Figure 3 - representation of an embodiment of the autonomous portable firefighting system developed, mounted on a tripod structure. The numerical references represent:
1 - nozzle;
2 - orientation module;
4 - hose adapter;
5 - water hose;
6 - sensory unit;
7 - video camera;
9 - tripod;
11 - remote controller.
Figure 4 - representation of an embodiment of the autonomous portable firefighting system. The numerical references represent:
1 - nozzle;
11 - remote controller;
12 - water jet;
13 - target - fire front.
Figure 5 - Representation of the various units/modules that can be assembled and disassembled for easy storage and transportation of all equipment in a suitcase or backpack. The numerical references represent:
1 - nozzle;
2 - orientation module;
4 - hose adapter;
6 - sensory unit;
7 - video camera;
9 - tripod;
11 - remote controller;
16 - backpack;
17 - suitcase.
Figure 6 - representation of an implementation scenario of the invention, wherein the autonomous portable firefighting system developed is installed in a fire truck and mounted on a tripod structure. The numerical references represent:
8 - mounting-base, installed on top of the fire truck;
9 - tripod;
14 - fire truck.
Figure 7 - representation of an implementation scenario of the invention, wherein the autonomous portable firefighting system developed is installed in a fire truck. The numerical references represent:
1 - nozzle;
2 - orientation module;
5 - water hose;
6 - sensory unit;
7 - video camera;
8 - mounting base;
18 - transport handle.
Figure 8 - representation of an implementation scenario of the invention, wherein the autonomous portable firefighting system developed is installed in a trailer for an independent mobile system. The numerical references represent:
1 - nozzle;
5 - water hose;
7 - video camera;
8 - mounting base;
19 - water tank;
20 - hydraulic pump;
21 - trailer.
Figure 9 - representation of another implementation scenario of the invention, wherein the autonomous portable firefighting system developed is installed to protect a dwelling. The numerical references represent:
8 - mounting base, installed on the property wall;
9 - tripod;
15 - dwelling.
DETAILED DESCRIPTION
The more general and advantageous configurations of the present invention are described in the Summary of the invention. Such configurations are detailed below in accordance with other advantageous and/or preferred embodiments of implementation of the present invention.
The autonomous portable firefighting system of the present invention is a mobile and easily transportable firefighting system, operable in a fully autonomous fashion.
For that purpose, the autonomous system comprises a nozzle unit, a sensory unit (6) and a control unit (3), which are connected together for the purpose of data exchange and control. Additionally, the system may also comprise a communication unit configured to establish a wireless bidirectional data communication protocol with a remote server or a remote monitoring device.
The nozzle unit comprises a nozzle (1) from which the water is projected. It may also comprise an operation module and an orientation module (2). The operation module can be integrated into the nozzle (1) itself, and is programmed to control a water-flow solenoid valve for controlling the water flow, the jet velocity and the shape of the water jet (12). Said shape can be a hollow jet or a hollow cone. The orientation module (2), in its turn, is programmed to control the variation of the vertical and horizontal angles of the nozzle (1) and, consequently, of the water projection coming out of the nozzle (1). It can be a two-axis guidance mechanism consisting, for example, of servos or electric actuators to control each of the axes of rotation. In an alternative embodiment, the nozzle unit may also comprise a video camera (7), adapted to collect video data from the scene.
As for the sensory unit (6), its main function is to collect data corresponding to the nozzle's surrounding environment, which is used by a 3D environmental characterization module to generate a virtual three-dimensional space record. To achieve that, the sensory unit (6) comprises sensory means, specifically a plurality of sensors of several types. Such sensors may be adapted for image capture, temperature and wind speed and direction measurements, infrared radiation detection, heat flow measurement and multi or hyperspectral imaging. The 3D environmental characterization module is a processing module adapted to execute intelligent algorithms for generating a virtual three-dimensional space record that characterizes the surroundings of the nozzle (1), based on the sensory information collected by the sensory means of the sensory unit (6). With this approach it is possible to characterize the area of operation - the scene -, for example regarding the wind speed and direction, which can be used to adapt the operation of the system, in particular by defining a proper orientation for the nozzle (1) and by configuring the parameters related to the operation of the water-flow solenoid valve, allowing it to work more efficiently.
The control unit (3) comprises a detection module, which is a processing module configured to process the virtual three-dimensional space record generated in the sensory unit (6), in order to detect and locate a predetermined target. In this context, a target can be a fire front (13) or a heat source. The location information acquired by the detection module relates to a coordinate that maps the detected target within the three-dimensional space record. Once this detection and location is made in the three-dimensional space, a control algorithm executed in the processing means of the control unit (3) is adapted to control the nozzle unit, in particular the operation module and the orientation module (2), completely autonomously, to start the projection of water (12) in the direction of the target (13), by automatically adjusting and correcting the nozzle (1) orientation according to the environmental conditions.
The control algorithm executed in the control unit (3) for operating the system autonomously may use as input an infrared image collected by an infrared sensor of the sensory unit (6). This sensor is coupled to the nozzle (1), moving solidarily with it. Alternatively, two infrared sensor arrays can be used, located on opposite sides of the nozzle's body, such as at the top and bottom or at the left and right sides of the nozzle. In another embodiment, four infrared sensor arrays can be used, located at the top, bottom, left and right sides of the nozzle. The infrared image is processed by the 3D environmental characterization module in order to generate the three-dimensional space record defining the surroundings of the nozzle (1). In the resulting record, each pixel value corresponds to a temperature of the scene. The algorithm output is the nozzle (1) orientation and the water jet (12) actuation. The water actuation is initiated when the fire is within the actuation range of the nozzle. The actuation range parameter can be programmed by the control unit (3) as a function of at least one, or a combination, of the following information: the water pressure, water flow and shape of the water jet coming out of the nozzle; the orientation of the nozzle and its height from the ground; the configuration of the soil surface and surrounding environment, particularly considering the existence of obstacles such as vegetation; and the wind speed and direction. In particular, the water actuation is started when the number of pixels in the three-dimensional space record with an intensity higher than a predetermined value "FIRE_TEMPERATURE_THRESHOLD" is higher than the predetermined value "FIRE_PIXEL_NUMBER_THRESHOLD". These pixels are called "HOTEST_PIXELS".
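As an illustration only (not part of the claimed subject-matter), the actuation trigger described above can be sketched in Python. The threshold values and the function name are assumptions made for the sake of the example, and the record is modelled as a NumPy array of scene temperatures:

```python
import numpy as np

# Illustrative threshold values; in the described system these are
# predetermined, user-configurable parameters of the control unit (3).
FIRE_TEMPERATURE_THRESHOLD = 150.0   # degrees Celsius (assumed)
FIRE_PIXEL_NUMBER_THRESHOLD = 20     # minimum count of "HOTEST_PIXELS" (assumed)

def should_actuate(record):
    """Start the water actuation when the number of pixels hotter than
    FIRE_TEMPERATURE_THRESHOLD exceeds FIRE_PIXEL_NUMBER_THRESHOLD.

    `record` is a 2D array in which each pixel value is a scene temperature,
    as produced by the 3D environmental characterization module.
    """
    hotest_pixels = record > FIRE_TEMPERATURE_THRESHOLD
    return int(hotest_pixels.sum()) > FIRE_PIXEL_NUMBER_THRESHOLD

# Synthetic 64x64 record with a 6x6 hot region (36 pixels above threshold).
scene = np.full((64, 64), 25.0)
scene[10:16, 20:26] = 400.0
print(should_actuate(scene))  # True: 36 hot pixels exceed the threshold of 20
```

In the described system these thresholds would be configured by the user or programmed remotely through the communication unit.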
After the water solenoid valve is opened, the pixel corresponding to the point where the water jet (12) contacts the scene is identified as the center of an area of pixels with the highest temperature drop, to a temperature below a predetermined value "WATER_TEMPERATURE_THRESHOLD". This pixel is identified by the name "WATER_HIT_PIXEL". In the case of multiple infrared sensor arrays, it is easier to identify the water jet and the "WATER_HIT_PIXEL": a global image is obtained by merging the different images of each sensor array.
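A minimal sketch of how the "WATER_HIT_PIXEL" could be identified from two consecutive temperature records, assuming a NumPy representation and an illustrative "WATER_TEMPERATURE_THRESHOLD"; the drop-weighted centroid is one possible choice of "center", not mandated by the description:

```python
import numpy as np

WATER_TEMPERATURE_THRESHOLD = 60.0  # degrees Celsius (assumed value)

def find_water_hit_pixel(prev_record, curr_record):
    """Return the (row, col) of the "WATER_HIT_PIXEL", or None.

    The pixel is taken as the center of the area whose temperature dropped
    between two consecutive records to below the water temperature threshold;
    here the center is a drop-weighted centroid (one possible choice).
    """
    drop = prev_record - curr_record
    cooled = (drop > 0) & (curr_record < WATER_TEMPERATURE_THRESHOLD)
    if not cooled.any():
        return None
    rows, cols = np.nonzero(cooled)
    w = drop[rows, cols]
    r = int(round(float((rows * w).sum() / w.sum())))
    c = int(round(float((cols * w).sum() / w.sum())))
    return (r, c)

prev = np.full((32, 32), 300.0)
curr = prev.copy()
curr[12:15, 8:11] = 30.0  # the water jet cools a 3x3 patch of the scene
print(find_water_hit_pixel(prev, curr))  # (13, 9): center of the cooled patch
```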
The system operating in this autonomous fashion may apply several tactics of fire extinction. The tactic can be automatically determined by the algorithm taking into account the number and the scattering of the "HOTEST_PIXELS", so that a characterization of the scene is made. In all of these tactics the nozzle is moved slowly. As a consequence, between two consecutive records the "WATER_HIT_PIXEL" moves only slightly, making it easy to track. This slow movement is also necessary to let the water cool and extinguish the flames at the water hit point.
One of the tactics of actuation for fire extinction is to move the water jet to the most intense heat source, i.e., the pixel with the highest temperature value among the set of "HOTEST_PIXELS". This is done by slowly actuating the orientation module (2) in such a way as to make the "WATER_HIT_PIXEL" coincide with the record's hot spot, that is, with the pixel of the record having the highest temperature value. This tactic is useful to prevent spot fires from developing into large fires. Another tactic of actuation for fire extinction is to move the water jet (12) along the base of the fire front (13), which is identified by the contour line of the temperature gradient of the record. This is done by slowly actuating the orientation module (2) in such a way as to make the "WATER_HIT_PIXEL" pass through the pixels of the base of the flames. This tactic should be employed to suppress an approaching fire front.
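The first tactic amounts to a slow feedback loop that drives the "WATER_HIT_PIXEL" towards the record's hot spot. The following sketch is illustrative only: the pixel-to-angle gain and the per-step limit are assumed values chosen to keep the nozzle motion slow, as the tactic requires:

```python
def step_towards_hot_spot(water_hit, hot_spot, gain=0.1, max_step_deg=0.5):
    """One control step of the hot-spot tactic: nudge the nozzle orientation
    so that the "WATER_HIT_PIXEL" moves towards the hottest pixel.

    Pixel offsets are mapped to small tilt/pan corrections; `gain` and
    `max_step_deg` are illustrative assumptions. Returns
    (delta_tilt_deg, delta_pan_deg) for the orientation module (2).
    """
    def clamp(v):
        return max(-max_step_deg, min(max_step_deg, v))
    d_row = hot_spot[0] - water_hit[0]   # vertical offset in pixels
    d_col = hot_spot[1] - water_hit[1]   # horizontal offset in pixels
    return (clamp(gain * d_row), clamp(gain * d_col))

print(step_towards_hot_spot((13, 9), (13, 9)))  # (0.0, 0.0): already on target
print(step_towards_hot_spot((10, 9), (30, 9)))  # (0.5, 0.0): tilt step clamped
```

Because the correction is computed from the observed water hit point rather than from the commanded nozzle angles, such a loop inherently absorbs wind-induced deflection of the jet.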
Yet another tactic of actuation for fire extinction is to move the water jet (12), i.e., the "WATER_HIT_PIXEL", through the area of the pixels in the record with intensity higher than the set value of "FIRE_TEMPERATURE_THRESHOLD". This tactic can be used for an approaching fire front and also during fire mop-up operations. When the "WATER_HIT_PIXEL" is not identified and the water pressure is above a preset threshold, meaning that there is enough water flow, the tactic is to move the water jet (12) vertically over the full tilt range, repeatedly, until the "WATER_HIT_PIXEL" is detected again. This situation may happen when the base of the flames is obstructed by an object or by the terrain. This tactic should be performed with the nozzle (1) facing the most intense heat source.
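The recovery behaviour for a lost "WATER_HIT_PIXEL" can be sketched as a repeated vertical sweep. The tilt range, step size and detector callable below are illustrative assumptions, not values taken from the description:

```python
def vertical_sweep(tilt_min, tilt_max, step, detect):
    """Recovery tactic: sweep the nozzle over the full tilt range, back and
    forth, until `detect()` reports the "WATER_HIT_PIXEL" again.

    `detect` is a callable returning a (row, col) pixel or None; it stands in
    for the analysis of each new infrared record. Returns the pixel and the
    tilt angle at which it reappeared.
    """
    tilt, direction = tilt_min, 1
    while True:
        pixel = detect()
        if pixel is not None:
            return pixel, tilt
        tilt += direction * step
        if tilt >= tilt_max or tilt <= tilt_min:
            direction = -direction                     # reverse at range end
            tilt = max(tilt_min, min(tilt_max, tilt))  # stay within range

# Simulated detector: the hit point only becomes visible at a higher tilt,
# as when the base of the flames is obstructed by an object or the terrain.
observations = iter([None, None, None, (7, 4)])
print(vertical_sweep(0.0, 45.0, 15.0, lambda: next(observations)))
# ((7, 4), 45.0)
```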
Thanks to the 3D environmental characterization performed by the sensory unit (6), the system now presented automatically compensates for the effect of the wind on the water jet (12), by detecting the "WATER_HIT_PIXEL" rather than relying on the direction of the nozzle (1). The control unit (3) may consist of a printed circuit board with a microcontroller, for example a Raspberry Pi or an Arduino, which is responsible for controlling the various modules and units of the system, such as the orientation module (2) and the operation module of the nozzle unit, the sensory unit (6) and, additionally, the communication unit.
By operating in an autonomous mode, controlled by the algorithm running in the control unit (3), the system allows for optimal firefighting, including optimized use of resources such as energy and water, by triggering the water flow only when required and by optimizing the direction and flow of the water jet according to the characteristics of the environment and of the fire front.
In an alternative embodiment, the system also comprises a communication unit. Such unit uses medium/long-range wireless data transmission technology, such as radio or internet, to send information about the system status and a video feed. Additionally, the communication unit allows the system to receive command instructions, enabling the remote configuration of basic control features of the system, for example in case of failure of the autonomous operation. Said control can be provided via a dedicated remote controller or a web application for a smartphone or tablet, which provides a user access to the control unit (3) for command purposes, in particular for actuating the nozzle unit.
The system now developed can be configured to work according to operation schedules, acting in a preventive way in fighting fires. For instance, the system can be programmed to periodically wet the ground of a dwelling (15), distributing water over a wider area, increasing fuel moisture and decreasing its probability of ignition. This can be a programmed routine for days with high temperatures, strong winds and low humidity, which are favourable conditions for a fire occurrence. It can also be used during the occurrence of a fire, for example before the fire front (13) reaches a dwelling (15) perimeter, as well as in the aftermath of a fire, to prevent reignitions in locations where the fire was already put out. The system can also be strategically distributed across several spots in a forest to act as a fire ignition detection and suppression mechanism, supporting the first responders and preventing the large spread of the fire and its rapid progression.
The system also comprises a power unit having integrated batteries (10), which can be rechargeable or disposable, adapted to supply power to the various mechatronic, control and communication units/modules. All electronic elements, including the batteries (10), are integrated into the system structure in order to be protected from the exterior high temperatures and high radiation levels, as well as from water.
The water supply to the system is done through a hose (5), which can be connected to the public water supply network, an emergency supply network, a tank or a reservoir. For superior mobility, the tank may be coupled to a vehicle, as in the case of a fire brigade, placed in a trailer, or mounted on a stand-alone robotic vehicle. This hose (5) is in turn connected to the nozzle (1), for example by means of an adapter (4). This adapter (4) allows the coupling of hoses (5) of various sizes, from traditional irrigation hoses to the different hose connections used by firefighters, and also allows operation at various water supply pressures - from the normal pressure of the public water supply system, around 3 bar, up to high pressures of 10 bar and more.
The entire system is designed to be easily transportable by one person, with reduced mass and size. The various units/modules can be assembled and disassembled for easy storage and transportation of all equipment in a suitcase (17) or backpack (16). A complete system with the support tripod is designed so that it can be carried in a backpack (16) and easily and quickly mounted on-site without the use of any tools, and only by one person, whether it is a fireman or a civilian.
The system may include a tripod structure (9) for ground mounting, or be placed on a mounting-base (8) on the ground or on the infrastructure to be protected. The tripod structure (9) may be automated, being controlled by the control unit (3), which is further programmed to control the vertical height of the system's nozzle by varying the length of the tripod's legs. In this case, the system is fixed to the mounting tripod (9), which ensures stability. In the event of an electrical or mechanical failure, the nozzle (1) may be oriented manually by the user, controlling the vertical and horizontal angles of the water jet (12), or placed in a desired fixed position by varying the extension of each of the tripod legs.
On the other hand, the mounting-base (8) is adapted to provide the installation of the autonomous portable system in a platform or in a vehicle (14) for specific applications, such as for the protection of dwellings (15) or vehicles (14).
DESCRIPTION OF THE EMBODIMENTS
In a preferred embodiment of the autonomous portable firefighting system developed, it is comprised by:
— A nozzle unit comprising a nozzle (1), an operation module and an orientation module (2); wherein the operation module is programmed to control the water flow, the jet velocity and the shape of the water jet (12) coming out of the nozzle (1); and the orientation module (2) is programmed to control the variation of vertical and horizontal angles of the nozzle (1);
— A sensory unit (6) comprising sensory means configured to collect data corresponding to the nozzle's surrounding environment, and comprising a 3D environmental characterization module configured to generate a virtual three-dimensional space record from the data collected by the sensory means;
— A control unit (3) comprising: a detection module, configured to detect and locate a target (13) in the record generated by the 3D environmental characterization module; and processing means programmed to execute a control algorithm configured to command the operation of the nozzle unit, based on data provided by the 3D environmental characterization module and by the detection module.
In one alternative embodiment of the autonomous system, the sensory unit (6) comprises a plurality of sensors adapted to image capture, temperature and wind speed and direction measurement, infrared radiation detection, heat flow measurement and multi or hyperspectral imaging. Yet in another alternative embodiment of the system, the sensory unit (6) comprises at least one infrared sensor coupled to the nozzle (1), moving solidarily with it. Alternatively, the infrared sensor moves solidarily with the nozzle only in the pan movement, which causes the sensor to operate in a levelled and stabilized reference in relation to the horizontal plane, allowing the surrounding scenario and the existing fire fronts to be monitored more effectively. In another embodiment, two infrared sensor arrays can be used, located on opposite sides of the nozzle's body, such as at the top and bottom or at the left and right sides of the nozzle. Yet in another embodiment, four infrared sensor arrays can be used, located at the top, bottom, left and right sides of the nozzle. Said infrared sensors are configured to collect infrared images of a scene; such images are processed by the 3D environmental characterization module to generate the virtual three-dimensional space record; said record is the input of the control algorithm executed in the control unit (3), wherein said control algorithm is configured to program:
— the orientation module (2) by setting a vertical angle parameter and a horizontal angle parameter;
— the operation module by activating the water flow, setting the water pressure value and the shape of the water jet (12), based on the target's (13) location detected by the detection module.
Yet in another alternative embodiment of the system, the virtual three-dimensional space record generated by the 3D environmental characterization module of the sensory unit (6) comprises data related to wind speed and direction; such data is used by the control algorithm to adapt the parameters of the orientation module (2) and of the operation module, so as to compensate for the wind speed and direction.
Yet in another alternative embodiment of the system, it further comprises an adapter (4) for coupling of hoses (5) of different sizes to the nozzle (1). Yet in another embodiment of the system, it further comprises a video camera (7) installed in the nozzle unit.
Yet in another alternative embodiment of the system, it further comprises a tripod structure (9) for ground-mounting of the system. Said tripod structure (9) is an automated structure, being operable by the control unit (3) which is further programmed to control the vertical height of the nozzle (1) in relation to the ground level, by varying the vertical lengths of the legs of the tripod structure (9).
Yet in another alternative embodiment of the system, it further comprises a mounting-base (8) suitable for installing the system in a platform or in a vehicle (14).
Yet in another embodiment of the system, it further comprises a communication unit configured to establish a wireless bidirectional data communication protocol with a remote server or a remote processing device.
The present invention also relates to a method of operation of the autonomous portable firefighting system described above. In a preferred embodiment of the method, it comprises the following steps:
— collecting data corresponding to a nozzle's surrounding environment by means of the sensory unit (6);
— generating a virtual three-dimensional space record by the 3D characterization module of the sensory unit (6), from the environmental data collected;
— programming the control unit (3) to operate the nozzle unit.
In an alternative embodiment of the method, the control unit (3) is programmed to execute the following steps: — detect and locate a target (13) within the virtual three-dimensional space record generated by the 3D characterization module of the sensory unit (6) from the environmental data collected, by means of the detection module;
— Execute a control algorithm in the processing means, wherein the inputs of the control algorithm are:
— the location of the target (13), provided by the detection module; and
— the three-dimensional space record generated by the 3D characterization module; and the outputs of the control algorithm are:
— a vertical angle parameter and a horizontal angle parameter to program the orientation module (2), corresponding to a coordinate representing the location of the target (13);
— an activation or deactivation water flow signal, a water pressure value and a water jet (12) shape parameter, to program the operation module;
— program the orientation module (2) and the operation module of the nozzle unit.
In one embodiment of the method, the target (13) is a fire front or a heat source.
Yet in an alternative embodiment of the method, the sensory unit (6) is configured to periodically collect infrared images from a scene, wherein each image is used by the 3D characterization module to generate a three-dimensional space record.
Yet in an alternative embodiment of the method, the control algorithm executed in the control unit (3) of the autonomous system, comprises the following steps:
— determine an intensity value of each pixel in the record; — identify pixels - "HOTEST_PIXEL"- with intensity values higher than a predefined value - "FIRE_TEMPERATURE_THRESHOLD";
— direct the nozzle unit to a HOTEST_PIXEL by commanding the orientation module (2) and the operation module, when the number of HOTEST_PIXEL is higher than a predefined value - "FIRE_PIXEL_NUMBER_THRESHOLD";
— analyse subsequent records to detect a pixel corresponding to the point where the water jet (12) comes into contact with the scene; said pixel - "WATER_HIT_PIXEL" - being the centre of an area of pixels with higher intensity value drop; in case of multiple infrared sensor arrays, it is easier to identify the water jet and the "WATER_HIT_PIXEL", since a global image is obtained by merging the different images of each sensor array;
— re-direct the nozzle unit to another HOTEST_PIXEL in the record when the WATER_HIT_PIXEL has an intensity value less than a predefined value - "WATER_TEMPERATURE_THRESHOLD".
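Taken together, the steps above describe one iteration of the control loop. The following Python sketch is a non-authoritative illustration, with assumed threshold values and a NumPy temperature array standing in for the virtual three-dimensional space record:

```python
import numpy as np

# Assumed parameter values; the method leaves them user-configurable.
FIRE_TEMPERATURE_THRESHOLD = 150.0
FIRE_PIXEL_NUMBER_THRESHOLD = 5
WATER_TEMPERATURE_THRESHOLD = 60.0

def control_step(record, current_target):
    """One iteration over a temperature record.

    Returns (target_pixel, water_on): the pixel the orientation module (2)
    should be directed at, and whether the operation module should keep the
    water flowing.
    """
    hot = np.argwhere(record > FIRE_TEMPERATURE_THRESHOLD)
    if len(hot) <= FIRE_PIXEL_NUMBER_THRESHOLD:
        return None, False              # not enough hot pixels: valve closed
    if current_target is not None and record[current_target] >= WATER_TEMPERATURE_THRESHOLD:
        return current_target, True     # keep dousing the current target
    # Current target extinguished (or none yet): re-direct to a hottest pixel.
    target = max(map(tuple, hot), key=lambda p: record[p])
    return tuple(int(v) for v in target), True

scene = np.full((16, 16), 25.0)
scene[2:5, 2:5] = 300.0                 # 9 pixels above the fire threshold
print(control_step(scene, None))        # ((2, 2), True): direct to a hot pixel
```

Successive calls with updated records and the previously returned target reproduce the re-direction step: once the targeted pixel cools below "WATER_TEMPERATURE_THRESHOLD", the loop picks the next HOTEST_PIXEL.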
Yet in an alternative embodiment of the method, the predefined values - FIRE_TEMPERATURE_THRESHOLD, FIRE_PIXEL_NUMBER_THRESHOLD and WATER_TEMPERATURE_THRESHOLD - are parameters that can be initially configured by the user, or can be remotely programmed when the system is connected through the communication unit to a network.
Yet in an alternative embodiment of the method, the three-dimensional space record further comprises data related to the wind speed and direction; and wherein:
— the configuration of a horizontal angle and a vertical angle of the nozzle (1), to command the orientation module (2); and
— the actuation on the water flow, the configuration of the jet pressure value and the shape of the water jet (12) coming out of the nozzle (1), to command the operation module, are adapted to compensate for the wind speed and direction. Yet in an alternative embodiment of the method, the direction of the nozzle (1) is defined in relation to the HOTEST_PIXEL with the highest intensity value, in such a way as to make the WATER_HIT_PIXEL coincide with said HOTEST_PIXEL in the record.
Yet in an alternative embodiment of the method, the autonomous operation of the control unit is started depending on at least one sensorial parameter collected by the sensory unit (6). Such sensorial parameter is a temperature and/or a wind speed and/or a humidity level, and the autonomous operation of the control unit is initiated if the referred parameters are higher than predetermined values, considering the actuation range of the nozzle.
Yet in an alternative embodiment of the method, the autonomous operation of the control unit (3) is initiated by a user remotely connected to said control unit (3), when the system is connected through the communication unit to a network.
Yet in an alternative embodiment of the method, the autonomous operation of the control unit (3) is scheduled to run at a future point in time selected by the user.
Yet in an alternative embodiment of the method, the control unit (3) has a scheduling entity configured to manage the autonomous operation of the control unit (3).
As will be clear to one skilled in the art, the present invention should not be limited to the embodiments described herein, and a number of changes are possible which remain within the scope of the present invention. Of course, the preferred embodiments shown above are combinable, in the different possible forms, the repetition of all such combinations being herein avoided. REFERENCES
[1] ZTZ-123 RS30, Automatic Fire Fighting Water Monitor, Nanjing Ruishi Fire-Fighting Safety, Equipment Co., Ltd.
[2] US 20150021054 Al, McNamara, I.E. and Rapp, N.L. and Toombs, N.J. and Kim, J., Automatic fire targeting and extinguishing system and method
[3] PYROsmart, Automatic Water Cannon - Fire Extinguishing System, Blazequel
[4] FireFox, Vehicle Mounted Electric Firefighting Monitor, Akron Brass Company
[5] Apollo PE Monitor, Style 3419 Apollo Portable Electric monitor, Akron Brass Company
[6] Partner 2, Partner 2 portable monitor, LEADER, a company of the group Genetech

Claims

1. An autonomous portable firefighting system comprising:
— A nozzle unit comprising a nozzle (1), an operation module and an orientation module (2); wherein the operation module is programmed to control the water flow, the jet velocity and the shape of the water jet (12) coming out of the nozzle (1); and the orientation module (2) is programmed to control the variation of vertical and horizontal angles of the nozzle (1);
— A sensory unit (6) comprising sensory means configured to collect data corresponding to the nozzle's surrounding environment, and comprising a 3D environmental characterization module configured to generate a virtual three-dimensional space record from the data collected by the sensory means;
— A control unit (3) comprising: a detection module, configured to detect and locate a target (13) in the record generated by the 3D environmental characterization module; and processing means programmed to execute a control algorithm configured to command the operation of the nozzle unit, based on data provided by the 3D environmental characterization module and by the detection module;
— A power unit adapted to supply power to the nozzle unit, the sensory unit (6) and the control unit (3).
2. The system according to claim 1, wherein the sensory unit (6) comprises a plurality of sensors adapted to image capture, temperature and wind speed and direction measurement, infrared radiation detection, heat flow measurement and to multi or hyperspectral imaging.
3. The system according to any of the previous claims, wherein the sensory unit (6) comprises at least one infrared sensor coupled to the nozzle (1); said at least one infrared sensor is configured to collect infrared images of a scene; such images being processed by the 3D environmental characterization module to generate the virtual three-dimensional space record; said record being the input of the control algorithm being executed in the control unit (3), wherein said control algorithm is configured to program:
— the orientation module (2) by setting a vertical angle parameter and a horizontal angle parameter;
— the operation module by activating the water flow, setting the water pressure value and the shape of the water jet (12), based on the target's (13) location detected by the detection module.
4. The system according to claim 3, wherein the sensory unit (6) comprises two infrared sensor arrays located on opposite sides of the nozzle, such as at the top and bottom or at the left and right sides of the nozzle.
5. The system according to claim 3, wherein the sensory unit (6) comprises four infrared sensor arrays located at the top, bottom, left and right sides of the nozzle.
6. System according to any of the claims 3 to 5, wherein the at least one infrared sensor is installed in the nozzle in such a way to move solidary with it.
7. System according to any of the previous claims 3 to 6, wherein the at least one infrared sensor is installed in the nozzle in such a way to move solidary with it only in the pan movement.
8. The system according to any of the claims 2 to 7, wherein the virtual three-dimensional space record generated by the 3D environmental characterization module of the sensory unit (6) comprises data related to wind speed and direction; such data being used by the control algorithm to adapt the parameters of the orientation module (2) and of the operation module, to compensate the wind speed and direction.
9. The system according to any of the previous claims, further comprising a video camera (7) installed in the nozzle unit.
10. The system according to any of the previous claims, further comprising a tripod structure (9) for ground-mounting of the system.
11. The system according to claim 10, wherein the tripod structure (9) is an automated structure, being operable by the control unit (3) which is further programmed to control the vertical height of the nozzle (1) in relation to the ground level, by varying the vertical lengths of the legs of the tripod structure (9).
12. The system according to any of the previous claims, further comprising a mounting-base (8) suitable for installing the system in a platform or in a vehicle (14).
13. The system according to any of the previous claims, further comprising a communication unit configured to establish a wireless bidirectional data communication protocol with a remote server or a remote processing device.
14. Method of operation of the autonomous portable firefighting system of any of claims 1 to 13; the method comprising the following steps:
— collecting data corresponding to the nozzle's surrounding environment by means of the sensory unit (6);
— generating a virtual three-dimensional space record by the 3D characterization module of the sensory unit (6), from the environmental data collected;
— programming the control unit (3) to operate the nozzle unit.
15. Method according to claim 14, wherein the control unit (3) is programmed to execute the following steps:
— detect and locate a target (13) within the virtual three-dimensional space record generated by the 3D characterization module of the sensory unit (6), by means of the detection module;
— execute a control algorithm in the processing means, wherein the inputs of the control algorithm are:
— the location of the target (13), provided by the detection module; and
— the three-dimensional space record generated by the 3D characterization module;
and the outputs of the control algorithm are:
— a vertical angle parameter and a horizontal angle parameter to program the orientation module (2), corresponding to a coordinate representing the location of the target (13);
— an activation or deactivation water flow signal, a water pressure value and a water jet (12) shape parameter, to program the operation module;
— program the orientation module (2) and the operation module of the nozzle unit.
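Purely as an illustration of the information flow in claim 15 (none of the names, units or values below appear in the claims; they are assumptions for the sketch, and the three-dimensional space record input is omitted for brevity), the control algorithm's input/output contract could look like:

```python
import math
from dataclasses import dataclass

# Hypothetical command structure: the claims only require angle parameters for
# the orientation module (2) and flow/pressure/shape parameters for the
# operation module, not any concrete representation.
@dataclass
class NozzleCommand:
    vertical_angle: float    # tilt, degrees
    horizontal_angle: float  # pan, degrees
    water_on: bool           # activation/deactivation water flow signal
    pressure: float          # water pressure value (bar, assumed unit)
    jet_shape: float         # 0.0 = straight jet ... 1.0 = wide spray (assumed)

def control_algorithm(target_xyz):
    """Map a target location from the detection module to nozzle commands."""
    x, y, z = target_xyz
    # Pan angle toward the target, then tilt from the ground-plane distance.
    horizontal = math.degrees(math.atan2(y, x))
    ground_distance = math.hypot(x, y)
    vertical = math.degrees(math.atan2(z, ground_distance))
    return NozzleCommand(vertical, horizontal,
                         water_on=True, pressure=8.0, jet_shape=0.0)
```

A target straight ahead at ground level, for example, yields zero pan and tilt with the water flow activated.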
16. The method according to claim 15, wherein the target (13) is a fire front or a heat source.
17. The method according to any of the claims 14 to 16, wherein the sensory unit (6) is configured to periodically collect infrared images from a scene, wherein each image is used by the 3D characterization module to generate a three-dimensional space record.
18. The method according to any of the claims 15 to 17, wherein the control algorithm executed in the control unit (3) comprises the following steps:
— determine an intensity value of each pixel in the record;
— identify pixels - "HOTEST_PIXEL" - with intensity values higher than a predefined value - "FIRE_TEMPERATURE_THRESHOLD";
— direct the nozzle unit to a HOTEST_PIXEL by commanding the orientation module (2) and the operation module, when the number of HOTEST_PIXEL is higher than a predefined value - "FIRE_PIXEL_NUMBER_THRESHOLD";
— analyse subsequent records to detect the pixel corresponding to the point where the water jet (12) comes into contact with the scene; said pixel - "WATER_HIT_PIXEL" - being the centre of the area of pixels with the highest intensity-value drop;
— re-direct the nozzle unit to another HOTEST_PIXEL in the record when the WATER_HIT_PIXEL has an intensity value lower than a predefined value - "WATER_TEMPERATURE_THRESHOLD"; wherein the predefined values - FIRE_TEMPERATURE_THRESHOLD, FIRE_PIXEL_NUMBER_THRESHOLD and WATER_TEMPERATURE_THRESHOLD - are parameters configured by a user.
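The loop of claim 18 can be sketched as follows (a minimal illustration on a 2D list of pixel intensities; the function names, the tuple-based aim representation and the tie-breaking choices are assumptions, not part of the claim):

```python
def hotest_pixels(record, fire_threshold):
    """Pixels whose intensity exceeds FIRE_TEMPERATURE_THRESHOLD."""
    return [(r, c) for r, row in enumerate(record)
                   for c, v in enumerate(row) if v > fire_threshold]

def water_hit_pixel(prev_record, record):
    """Pixel with the largest intensity drop between consecutive records,
    taken as the point where the water jet hits the scene."""
    best, best_drop = None, 0.0
    for r, row in enumerate(record):
        for c, v in enumerate(row):
            drop = prev_record[r][c] - v
            if drop > best_drop:
                best, best_drop = (r, c), drop
    return best

def choose_aim(record, prev_record, params, current_aim):
    """One pass of the claim-18 loop; returns the (row, col) pixel to aim at."""
    hot = hotest_pixels(record, params["FIRE_TEMPERATURE_THRESHOLD"])
    if len(hot) <= params["FIRE_PIXEL_NUMBER_THRESHOLD"]:
        return current_aim  # not enough fire pixels: keep the current aim
    if current_aim is None:
        # First detection: direct the nozzle at the hottest pixel.
        return max(hot, key=lambda p: record[p[0]][p[1]])
    if prev_record is not None:
        hit = water_hit_pixel(prev_record, record)
        if hit and record[hit[0]][hit[1]] < params["WATER_TEMPERATURE_THRESHOLD"]:
            # The watered spot has cooled down: re-direct to another hot pixel.
            remaining = [p for p in hot if p != current_aim]
            if remaining:
                return max(remaining, key=lambda p: record[p[0]][p[1]])
    return current_aim
```

Fed two consecutive records, the loop first aims at the hottest pixel and then, once that pixel's intensity falls below WATER_TEMPERATURE_THRESHOLD, re-directs to the next-hottest one.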
19. The method according to any of the claims 14 to 18, wherein the three-dimensional space record further comprises data related to the wind speed and direction; and wherein the horizontal and vertical angles of the nozzle commanded through the orientation module (2), as well as the actuation of the water flow, the jet pressure value and the shape of the water jet (12) coming out of the nozzle (1) commanded through the operation module, are adapted to compensate for the wind speed and direction.
20. Method according to claims 18 and 19, wherein the direction of the nozzle (1) is defined in relation to the HOTEST_PIXEL with the highest intensity value, in such a way as to make the WATER_HIT_PIXEL coincide with said HOTEST_PIXEL in the record.
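One possible (assumed, not claimed) way to realise the aiming rule of claims 19 and 20 is a pixel-space feedback correction: the angular adjustment needed to bring the WATER_HIT_PIXEL onto the HOTEST_PIXEL is estimated from their pixel offset, which also absorbs a steady wind-induced drift without an explicit wind model. The gain `deg_per_px` is a hypothetical camera-to-nozzle calibration constant:

```python
def correct_aim(pan_deg, tilt_deg, water_hit_px, hotest_px, deg_per_px=0.1):
    """Nudge the nozzle angles so the observed WATER_HIT_PIXEL converges on
    the HOTEST_PIXEL over successive records (proportional correction)."""
    row_err = hotest_px[0] - water_hit_px[0]   # vertical error in pixels
    col_err = hotest_px[1] - water_hit_px[1]   # horizontal error in pixels
    return (pan_deg + deg_per_px * col_err,    # corrected pan angle
            tilt_deg + deg_per_px * row_err)   # corrected tilt angle
```

When the two pixels already coincide the correction is zero, so the nozzle holds its direction.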
21. Method according to any of the claims 14 to 20, wherein the operation of the control unit starts upon collection of at least one sensorial parameter by the sensory unit (6); the sensorial parameter being a temperature and/or a wind speed and/or a humidity level; and wherein the operation of the control unit is initiated if the referred parameters are higher than predetermined values.
22. Method according to any of the claims 14 to 20, wherein the operation of the control unit (3) is initiated by a user remotely connected to said control unit (3).
23. Method according to any of the claims 14 to 22, wherein programming the control unit (3) is performed remotely by a user by means of a dedicated controller such as a smartphone or a web application.
24. Method according to claim 23, wherein the operation of the control unit (3) is scheduled to run at a future point in time selected by the user, and wherein the control unit (3) has a scheduling entity configured to manage the autonomous operation of the control unit (3).
PCT/PT2020/050026 2020-06-24 2020-06-24 Autonomous portabtle firefighting system and respective method of operation WO2021262020A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/PT2020/050026 WO2021262020A1 (en) 2020-06-24 2020-06-24 Autonomous portabtle firefighting system and respective method of operation


Publications (1)

Publication Number Publication Date
WO2021262020A1 true WO2021262020A1 (en) 2021-12-30

Family ID: 72240464


Country Status (1)

Country Link
WO (1) WO2021262020A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150021054A1 (en) 2013-07-19 2015-01-22 Ian Edward McNamara Automatic fire targeting and extinguishing system and method
WO2016090414A1 (en) * 2014-12-10 2016-06-16 The University Of Sydney Automatic target recognition and dispensing system
US20170113079A1 (en) * 2015-10-23 2017-04-27 Garry Dale Thomsen Autonomous Firefighting Tower
WO2019183530A1 (en) * 2018-03-23 2019-09-26 Tyco Fire Products Lp Automated self-targeting fire suppression systems and methods
KR20200068367A (en) * 2018-12-05 2020-06-15 주식회사 태영테크 Wildfire spread prevention and extinguish device



Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20761649; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 20761649; Country of ref document: EP; Kind code of ref document: A1