Disclosure of Invention
In view of the above problems, the invention provides an intelligent fire fighting system based on interconnection of an unmanned aerial vehicle, a cloud platform and AR glasses, which at least solves some of the above technical problems.
In order to achieve this purpose, the invention adopts the following technical scheme:
the embodiment of the invention provides an intelligent fire fighting system based on interconnection of an unmanned aerial vehicle, a cloud platform and AR glasses, which comprises: an unmanned aerial vehicle, a cloud platform and AR glasses; wherein:
the unmanned aerial vehicle is used for autonomously flying to a fire scene, completing intelligent obstacle avoidance, carrying out fire investigation in the fire scene and uploading fire data to the cloud platform;
the cloud platform is used for receiving the fire data returned by the unmanned aerial vehicle, processing and storing the fire data, and sending the processed data serving as cloud data to a decision end and the AR glasses;
the AR glasses are used for receiving the cloud data pushed by the cloud platform and the fire data transmitted by the unmanned aerial vehicle, acquiring site information, integrating the information, and feeding it back to the decision-making end through the cloud platform.
Further, the unmanned aerial vehicle comprises a sensor linkage module, a data interconnection device, an intelligent core processor and a power module.
Further, the sensor linkage module comprises an obstacle avoidance sensor, an image acquisition device and a gimbal; wherein:
the obstacle avoidance sensor is used for acquiring the speed of the unmanned aerial vehicle and the distance and position information between the unmanned aerial vehicle and an obstacle;
the image acquisition device is used for acquiring a fire scene image;
and the gimbal is used for stabilizing the obstacle avoidance sensor and the image acquisition device while the unmanned aerial vehicle is in flight.
Further, the data interconnection device comprises a cloud platform communication module and an AR glasses communication module, which enable the unmanned aerial vehicle to perform data interaction with the cloud platform and the AR glasses.
Further, the intelligent core processor comprises an obstacle avoidance module, a fire situation exploration module and a fusion module; wherein:
in the obstacle avoidance module, information acquired by the obstacle avoidance sensor is used as input, and an intelligent obstacle avoidance algorithm is combined to calculate the optimal flight path of the unmanned aerial vehicle so as to complete obstacle avoidance;
in the fire situation exploration module, the position of a fire source in a fire scene image is judged and labeled by taking the fire scene image acquired by the image acquisition device as input;
the fusion module is used for packaging the operating parameters of the unmanned aerial vehicle together with the processed fire scene images, and transmitting them to the cloud platform and the AR glasses through the data interconnection device.
Further, the cloud platform comprises a data processing module, a data storage module and a data sending module; wherein:
the data processing module is used for processing the fire data uploaded by the unmanned aerial vehicle, the processing comprising: correcting the fire point, collecting information on personnel trapped in the fire scene, and estimating an optimal rescue route;
the data storage module is used for storing data received by the cloud platform;
the data sending module is used for sending the processed data to the AR glasses and the decision end as cloud data.
Further, the front end of the AR glasses is provided with a high-definition digital camera.
Further, the power module is a rechargeable battery, which is any one of the following: a ternary lithium battery, a lithium iron phosphate battery, a nickel-metal hydride battery or a lead-acid battery.
Further, the obstacle avoidance sensor is a millimeter wave radar.
Further, the image acquisition device is a binocular vision camera.
Compared with the prior art, the intelligent fire fighting system based on interconnection of the unmanned aerial vehicle, the cloud platform and the AR glasses has the following beneficial effects:
1. according to the intelligent fire fighting system based on interconnection of the unmanned aerial vehicle, the cloud platform and the AR glasses, the unmanned aerial vehicle, the cloud platform and the AR glasses are interconnected, information is processed in a cooperative mode, the AR glasses are used for assisting positioning, information effectiveness is improved, accurate decision making is facilitated, and rescue efficiency is improved.
2. The system converts the role of rescue workers from pilots to inspectors: once the unmanned aerial vehicle performs obstacle avoidance and fire investigation autonomously, rescue workers are freed from piloting to carry out information verification, which reduces the false alarm rate and greatly improves the accuracy of information.
3. The data are processed at the cloud end, so that a high-precision and high-performance data processing algorithm can be carried, messages can be pushed in time, and a redundant information transmission process is omitted.
4. The AR glasses cooperate with the unmanned aerial vehicle to integrate information, assist rapid positioning, and provide rich, multidimensional data support.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Detailed Description
In order to make the technical means, creative features, objectives and effects of the invention easy to understand, the invention is further described below with reference to specific embodiments.
In the description of the present invention, it should be noted that the terms "upper", "lower", "inner", "outer", "front", "rear", "both ends", "one end", "the other end", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings, and are only for convenience of description and simplification of description, but do not indicate or imply that the device or element referred to must have a specific orientation, be configured in a specific orientation, and operate, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it is to be noted that, unless otherwise explicitly specified or limited, the terms "comprising", "providing", "connecting", and the like are to be construed broadly; for example, "connecting" may mean fixedly connected, detachably connected, or integrally connected; mechanically or electrically connected; directly connected, or indirectly connected through intervening media, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases by those skilled in the art.
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, an embodiment of the present invention provides an intelligent fire fighting system based on interconnection of an unmanned aerial vehicle, a cloud platform, and AR glasses, specifically including: an unmanned aerial vehicle, a cloud platform and AR glasses; wherein:
the unmanned aerial vehicle is responsible for autonomously flying to a fire scene without pilot control, completing intelligent obstacle avoidance, carrying out fire investigation at the scene, acquiring fire data and uploading it to the cloud platform, and then waiting for instructions from the cloud platform such as move, stand by or return;
the cloud platform receives fire data returned by the unmanned aerial vehicle, processes and stores the fire data, and sends the processed data serving as cloud data to the decision end and the AR glasses;
the cloud platform is the core of information integration and processing: it receives different data from the decision-making end, the unmanned aerial vehicle side, the AR glasses side and so on, and based on this information it monitors the state of the unmanned aerial vehicle, executes superior decisions, receives feedback from the AR glasses side, and completes tasks such as fire confirmation, fire classification and directional transmission of fire pictures.
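The fire confirmation, fire classification and directional transmission tasks above can be sketched as follows. This is illustrative only: the patent does not disclose the cloud platform's classification rules or push logic, so the severity thresholds and subscriber names are hypothetical.

```python
# Illustrative sketch only: the patent does not disclose the cloud platform's
# fire classification rules or push logic, so the severity thresholds and
# subscriber names below are hypothetical.

def classify_fire(burning_area_m2: float, confirmed: bool) -> str:
    """Grade a fire report into a coarse severity class."""
    if not confirmed:
        return "unconfirmed"   # still awaiting AR-glasses-side confirmation
    if burning_area_m2 < 10:
        return "minor"
    if burning_area_m2 < 100:
        return "moderate"
    return "major"

def push_targets(grade: str, subscribers: dict) -> list:
    """Directional transmission: pick only parties subscribed to this grade."""
    return [name for name, grades in subscribers.items() if grade in grades]

subs = {"command_center": {"minor", "moderate", "major"},
        "on_site_rescuers": {"moderate", "major"}}
grade = classify_fire(55.0, confirmed=True)
targets = push_targets(grade, subs)
```

In this sketch a report left unconfirmed by the AR glasses side is held back rather than classified, mirroring the confirmation loop described above.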
The AR glasses are augmented reality equipment worn by rescuers, used to receive the cloud data pushed by the cloud platform and the field data (including fire data) transmitted by the unmanned aerial vehicle. In this embodiment, a high-definition camera is arranged at the front end of the AR glasses, so that information on the rescuer's surroundings can be acquired at the same time, i.e. the AR glasses capture a scene image of the environment where the fire-fighting rescuer is located; this information is packaged, integrated, and fed back to the decision-making end through the cloud platform.
In this embodiment, the AR glasses are both a receiving end for visual data and an input end for feedback data. For example, when the decision-making end is located in the rescue command center and connected to a monitoring screen, decision makers can make decisions more accurately and efficiently based on the augmented, intuitive and visualized data fed back by the AR glasses; and because the fire information collected by the unmanned aerial vehicle is first judged by rescue personnel in a safe area of the rescue scene, a specific rescue scheme can be formulated more comprehensively.
As shown in fig. 2, the unmanned aerial vehicle includes a sensor linkage module, a data interconnection device, an intelligent core processor and a power module.
The sensor linkage module of the unmanned aerial vehicle contains at least an obstacle avoidance sensor, an image acquisition device and a gimbal. In this embodiment, the obstacle avoidance sensor preferably employs a millimeter wave radar for accurately acquiring the speed of the unmanned aerial vehicle and the distance and position of obstacles; the image acquisition device preferably adopts a binocular vision camera for effectively acquiring fire scene images; and the gimbal is a three-axis mechanical gimbal that keeps the obstacle avoidance sensor and the image acquisition device level and free of jitter while the unmanned aerial vehicle is in flight.
As shown in fig. 2, the data interconnection device of the drone includes a cloud platform communication module and an AR glasses communication module, which enable the unmanned aerial vehicle to exchange data with the cloud platform and the AR glasses.
As shown in fig. 2, the smart core processor of the drone includes at least an obstacle avoidance module, a fire situation exploration module and a fusion module; wherein: in the obstacle avoidance module, the information acquired by the millimeter wave radar is used as input and combined with an intelligent obstacle avoidance algorithm to calculate the optimal flight path of the unmanned aerial vehicle and so complete obstacle avoidance; in the fire situation exploration module, a fire scene image collected by the binocular vision camera is used as input, and the position of the fire source in the image is judged and labeled; and the fusion module packages the operating parameters of the unmanned aerial vehicle (key parameters such as longitude, latitude, altitude, battery level and speed) together with the processed fire scene images, and transmits them to the cloud platform and the AR glasses through the cloud platform communication module and the AR glasses communication module.
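The obstacle avoidance step can be sketched minimally as follows. The patent does not name the "intelligent obstacle avoidance algorithm", so this hypothetical version assumes the millimeter wave radar returns (bearing, distance) pairs and simply picks the safe heading closest to the goal bearing.

```python
import math

# Hypothetical sketch of the obstacle avoidance module: the radar is assumed
# to return (bearing_deg, distance_m) pairs; the real algorithm on the
# intelligent core processor is not disclosed in the patent.

def choose_heading(radar_hits, goal_bearing_deg, safe_distance_m=5.0):
    """Return the candidate heading (degrees) nearest the goal whose closest
    obstacle within a +/-15 degree cone lies beyond the safety distance."""
    def angle_diff(a, b):
        return abs((a - b + 180) % 360 - 180)
    candidates = sorted(range(0, 360, 15),
                        key=lambda h: angle_diff(h, goal_bearing_deg))
    for heading in candidates:
        clearance = min((d for b, d in radar_hits
                         if angle_diff(b, heading) <= 15),
                        default=math.inf)
        if clearance > safe_distance_m:
            return heading
    return None  # no safe heading found: hover and wait

hits = [(0, 3.0), (20, 4.0)]            # two obstacles near the direct path
heading = choose_heading(hits, goal_bearing_deg=0)
```

With both obstacles blocking the direct path, the sketch yields a detour heading well clear of them; a production system would of course fuse this with velocity and position data as described above.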
In this embodiment, the power module of the unmanned aerial vehicle is a rechargeable battery, which may be any one of the following: a ternary lithium battery, a lithium iron phosphate battery, a nickel-metal hydride battery or a lead-acid battery, selected in consideration of cost and weight.
Other functions and structures of the unmanned aerial vehicle and the AR glasses that are not described in the invention can be obtained from existing unmanned aerial vehicle and AR glasses technology, and are not described in detail here.
In this embodiment, the cloud platform comprises a data processing module, a data storage module and a data sending module; wherein: the data processing module processes the fire data uploaded by the unmanned aerial vehicle, the processing at least comprising correcting the fire point, collecting information on personnel trapped in the fire scene, and estimating an optimal rescue route; the data storage module stores the data received by the cloud platform; and the data sending module sends the processed data to the AR glasses and the decision-making end as cloud data. The data interaction of the cloud platform is shown in fig. 3.
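The "optimal rescue route" estimation can be sketched as follows. The patent names no algorithm, so this hedged example uses a plain breadth-first search on a hypothetical grid map of the fire scene in which cells marked 1 are blocked by fire.

```python
from collections import deque

# Hedged sketch of the rescue-route estimation: the patent does not specify
# an algorithm, so a breadth-first search on a hypothetical grid map is used;
# cells marked 1 are blocked by fire, cells marked 0 are passable.

def shortest_route(grid, start, goal):
    """Length of the shortest fire-free 4-neighbour path, or -1 if none."""
    rows, cols = len(grid), len(grid[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), steps = queue.popleft()
        if (r, c) == goal:
            return steps
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), steps + 1))
    return -1  # fire blocks every route

fire_map = [[0, 1, 0],
            [0, 1, 0],
            [0, 0, 0]]
length = shortest_route(fire_map, (0, 0), (0, 2))  # detour around the fire
```

Breadth-first search is chosen here only for brevity; a weighted planner (e.g. A* with temperature-dependent edge costs) would be a natural refinement on the cloud side.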
In this embodiment, the decision-making end may be installed in the monitoring system of a command center, displaying data on a large monitoring screen; it may be installed on a mobile terminal device (e.g., a mobile phone, an iPad or a notebook) for viewing data through an APP; it may be a web page end, where fire data is displayed and browsed by clicking through web pages; or it may be any other client that receives the data of the cloud platform to complete the information interaction, which is not limited herein.
In this embodiment, the AR glasses serve as the individual equipment of rescue personnel, mainly provided with a camera that shares the rescuer's field of view and a module interconnected with the cloud platform; the structure of the AR glasses is shown in fig. 4.
By wearing the AR glasses, rescue personnel can receive the fire information returned by the unmanned aerial vehicle intuitively and stereoscopically, and can also capture images of their own environment. The role of rescue personnel is thus converted from pilot to inspector, which facilitates decision-making; and if the unmanned aerial vehicle cannot complete its task due to extreme conditions, correct information can still be fed back in time, avoiding further loss.
Further, taking the decision end located in the fire command center as an example, the rescue steps using the intelligent fire fighting system of the invention are as follows:
the first step is as follows: unmanned aerial vehicle independently flies to the scene of a fire, accomplishes intelligence and keeps away the barrier, accomplishes the condition of a fire reconnaissance back in the scene of a fire, uploads data to the cloud platform.
The second step: the cloud platform side receives data returned by the unmanned aerial vehicle, and after primary processing, the data are sent to rescue workers and a command center accompanying the rescue scene.
The third step: the command center makes a decision, uploads the decision to the cloud platform, and during the decision, on-site rescue personnel feed back according to on-site conditions at any time, the AR glasses cooperate with the unmanned aerial vehicle to carry out information integration, the geographical position, the fire field temperature and the information of the plurality of onboard sensors returned by the single machine side and the data of the glasses end are arranged into a data packet, the detection condition of the fire condition of the single machine side is confirmed, the data packet and the confirmation result are transmitted to the platform, and the information sent by the unmanned aerial vehicle is corrected.
The fourth step: rescue work is carried out, and the unmanned aerial vehicle receives the instruction and smoothly returns to the air.
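The data packet assembled on the AR glasses side in the third step can be sketched as follows. The patent only lists the kinds of information merged, so every field name in this example is hypothetical.

```python
import json

# Illustrative sketch of the step-three data packet assembled on the AR
# glasses side: the patent only lists the kinds of information merged, so
# every field name here is hypothetical.

def build_feedback_packet(uav_report: dict, glasses_frame_id: str,
                          fire_confirmed: bool) -> str:
    """Merge UAV-side sensor data with the rescuer's confirmation result."""
    packet = {
        "geo_position": uav_report["geo_position"],       # from the UAV side
        "scene_temperature_c": uav_report["temperature_c"],
        "onboard_sensors": uav_report["sensors"],
        "glasses_frame": glasses_frame_id,                # from the AR glasses
        "fire_confirmed": fire_confirmed,                 # rescuer's judgement
    }
    return json.dumps(packet)

report = {"geo_position": (30.5, 114.3, 45.0),   # lat, lon, altitude (m)
          "temperature_c": 210.0,
          "sensors": {"battery_pct": 62, "speed_mps": 4.2}}
payload = build_feedback_packet(report, "frame-0042", fire_confirmed=True)
```

The confirmation flag carried alongside the sensor data is what allows the cloud platform to correct the unmanned aerial vehicle's report, as described in the third step.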
The intelligent fire fighting system based on interconnection of the unmanned aerial vehicle, the cloud platform and the AR glasses is described in detail in the following by specific embodiments.
When a fire occurs, rescue workers arrive at the scene and establish a flight area; the unmanned aerial vehicle is unlocked and, relying on the millimeter wave radar and the built-in intelligent obstacle avoidance module, autonomously flies to the fire scene to collect information. Upon arrival, it acquires scene image data with the onboard binocular vision camera, preprocesses the data, judges and labels the position of the suspected fire source in the image, preliminarily screens the fire, and uploads the captured live images to the cloud platform.
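The judging and labeling of the suspected fire source can be sketched with a simplified stand-in. The real detector on the unmanned aerial vehicle is not disclosed, so this example merely flags "fire-coloured" RGB pixels (strong red, weak blue) and labels their bounding box.

```python
# Simplified stand-in for the fire situation exploration step: the real
# detector is not disclosed in the patent, so this sketch flags
# "fire-coloured" RGB pixels (strong red, weak blue) and labels their
# bounding box in the image.

def label_fire_source(image):
    """image: 2-D list of (r, g, b) tuples. Return the bounding box
    (top, left, bottom, right) of fire-like pixels, or None if no
    suspected fire source is found."""
    hits = [(y, x) for y, row in enumerate(image)
            for x, (r, g, b) in enumerate(row)
            if r > 200 and b < 80 and g < r]
    if not hits:
        return None
    ys, xs = zip(*hits)
    return (min(ys), min(xs), max(ys), max(xs))

dark = (10, 10, 10)
flame = (250, 120, 30)
frame = [[dark, dark, dark],
         [dark, flame, flame],
         [dark, dark, dark]]
box = label_fire_source(frame)   # bounding box of the suspected fire source
```

A deployed system would replace this colour heuristic with a trained detector; the bounding box it outputs plays the same role as the label transmitted to the cloud platform here.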
After the cloud platform receives the images sent by the unmanned aerial vehicle, it performs secondary processing, such as correcting the fire point, collecting information on personnel trapped in the fire scene, and estimating an optimal rescue route. After this secondary processing, the information is pushed to the command center and to the on-site rescue workers respectively.
By wearing the AR glasses, the rescue worker receives the information pushed by the cloud platform and, at the same time, the information transmitted directly from the unmanned aerial vehicle side. Combining this with the information acquired by the camera sharing the rescuer's field of view, the rescue worker checks the fire scene information in time, judges the accuracy of the information from the unmanned aerial vehicle side such as the fire occurrence situation and fire classification, and sends the information acquired by the AR glasses and the unmanned aerial vehicle to the cloud platform cooperatively, guaranteeing the effectiveness of the information.
After the command center obtains information such as the fire scene images from the unmanned aerial vehicle through the decision-making end, it formulates a scientific and efficient rescue scheme, receiving feedback from the rescuers at any time during this process and correcting the decision accordingly. The decision is finally issued to the rescuers, and the fire scene rescue task is completed safely and efficiently.
Through the interconnection of the unmanned aerial vehicle, the cloud platform and the AR glasses, the intelligent fire fighting system of the invention effectively improves the success rate and safety of fire rescue activities. Fire investigation is carried out autonomously and intelligently by a single machine, fire scene data is processed efficiently by the big data cloud platform, and rescue workers wearing AR glasses use the real-scene information to aid decision-making, reflecting the concepts of intelligent fire fighting and scientific decision-making. The system converts the role of rescue workers from pilots to inspectors: once the unmanned aerial vehicle performs autonomous obstacle avoidance and fire investigation, rescue workers are freed from piloting to verify information, which reduces the false alarm rate and greatly improves information accuracy. Processing data at the cloud end makes it possible to carry high-precision, high-performance data processing algorithms, push messages in time, and omit redundant information transmission. The AR glasses cooperate with the unmanned aerial vehicle to integrate information, assisting rapid positioning and providing rich, multidimensional data support.
The innovation of the invention lies in transforming the pilot's role on the basis of the unmanned aerial vehicle's intelligent algorithms: information is processed cooperatively by the unmanned aerial vehicle, the cloud platform and the AR glasses, and the AR glasses receive intuitive, multidimensional data for auxiliary positioning, thereby improving information effectiveness, facilitating decision-making and promoting the construction of an intelligent fire-fighting system. Moreover, since obstacle avoidance and fire recognition are completed autonomously by the unmanned aerial vehicle, the role of rescue workers is converted from pilot to inspector: fire information returned by the unmanned aerial vehicle is received intuitively and stereoscopically through the AR glasses, forming double insurance while correcting information, so that correct information can be fed back in time even when the unmanned aerial vehicle cannot complete its task due to extreme conditions, avoiding further loss. The invention is expected to play an irreplaceable role in future intelligent fire-fighting systems and has strong guiding significance.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.