CN107095384B - Intelligent fire control helmet device based on WIFI transmission
- Publication number
- CN107095384B (granted publication; application CN201710282950.1A)
- Authority
- CN
- China
- Prior art keywords
- image data
- module
- visible light
- information
- fusion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A42—HEADWEAR
- A42B—HATS; HEAD COVERINGS
- A42B3/00—Helmets; Helmet covers ; Other protective head coverings
- A42B3/04—Parts, details or accessories of helmets
- A42B3/18—Face protection devices
-
- A—HUMAN NECESSITIES
- A42—HEADWEAR
- A42B—HATS; HEAD COVERINGS
- A42B3/00—Helmets; Helmet covers ; Other protective head coverings
- A42B3/04—Parts, details or accessories of helmets
- A42B3/0406—Accessories for helmets
-
- A—HUMAN NECESSITIES
- A42—HEADWEAR
- A42B—HATS; HEAD COVERINGS
- A42B3/00—Helmets; Helmet covers ; Other protective head coverings
- A42B3/04—Parts, details or accessories of helmets
- A42B3/30—Mounting radio sets or communication systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K17/00—Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Alarm Systems (AREA)
Abstract
The invention discloses an intelligent fire-fighting helmet device based on WIFI transmission. An image acquisition fusion module acquires visible light image data and infrared image data and fuses them to obtain fused image data; a gesture information module collects the gesture information of the user and sends it to the main controller module; a positioning navigation module acquires the current position information and sends it to the main controller module. The main controller module generates a corresponding gating control signal according to the gesture information and sends it to the image acquisition fusion module, where the gating control signal controls which of the visible light image data, infrared image data or fused image data the module gates through to the main controller module. The main controller module outputs the received image data, gesture information and current position information to a background server through the WIFI communication module and sends them to the AR display module for display.
Description
Technical Field
The invention relates to the technical field of fire protection, in particular to an intelligent fire protection helmet device based on WIFI transmission.
Background
Under current communication conditions, when a fire occurs, the rear commander's situation assessment and tactical decisions mostly rely on information relayed orally by communication personnel from the fire scene, which introduces significant errors in timing, personnel rescue and the transmission of fire information. At present, the fire-fighting communication and command system is a weak point in the equipment of China's fire departments, and advanced communication technologies and concepts have been slow to reach practical application in the fire-fighting field. A fire-fighting communication and command system built on advanced communication technology therefore has great market prospects and application space.
Fire protection safeguards people's production and daily life, and firefighters at a fire scene stand on the front line, risking their lives to protect people's lives and property. Firefighting is inherently dangerous work: a fire scene produces many uncontrollable situations, accidents occur frequently, and injuries and deaths among firefighters are regularly reported. In actual firefighting, if the environmental information of the fire scene and the positions of the firefighters could be known in real time, firefighters could respond to potential dangers promptly, and fire commanders could optimally schedule forces according to conditions at the scene and command more effectively. On the premise of ensuring firefighter safety, the disaster site is then far more likely to be saved, along with lives and property.
Traditional fire-fighting helmets have a single function: most only provide protection, or at best simple communication, and they lack intelligent perception and communication of fire-scene information. In practical use they have many shortcomings. Accordingly, the evolution of fire-fighting helmets toward intelligence is a strong demand of the fire-fighting industry.
Disclosure of Invention
The embodiment of the invention provides an intelligent fire-fighting helmet device based on WIFI transmission, which can display fire-scene information to the helmet wearer in real time through AR and transmit it to a background server, thereby improving the efficiency of fire-fighting response.
The embodiment of the invention provides an intelligent fire-fighting helmet device based on WIFI transmission, which comprises an image acquisition fusion module, a main controller module, a WIFI communication module, a gesture information module, a positioning navigation module and an AR display module;
the image acquisition and fusion module is used for acquiring visible light image data and infrared image data in real time and fusing the acquired visible light image data and infrared image data to obtain fused image data; the gesture information module is used for acquiring gesture information of a user wearing the helmet and sending the gesture information to the main controller module; the positioning navigation module is used for acquiring current position information and sending the acquired current position information to the main controller module; the main controller module generates a corresponding gating control signal according to the gesture information and sends the gating control signal to the image acquisition fusion module, wherein the gating control signal is used for controlling the image acquisition fusion module to gate the visible light image data, the infrared image data or the fusion image data and send the data to the main controller module; the main controller module outputs the received visible light image data, infrared image data or fusion image data, the gesture information and the current position information to a background server through the WIFI communication module and sends the received visible light image data, infrared image data or fusion image data, the gesture information and the current position information to the AR display module for display.
As an improvement of the scheme, the image acquisition fusion module comprises a visible light camera and an infrared camera arranged at the front of the helmet, wherein the visible light camera is used for acquiring visible light image data and the infrared camera is used for acquiring infrared image data.
As an improvement of the scheme, the image acquisition fusion module further comprises an FPGA. The FPGA comprises a visible light image preprocessing unit, a first SDRAM controller, an infrared image preprocessing unit and a second SDRAM controller. The FPGA controls the visible light camera and the infrared camera through an IIC controller to acquire visible light image data and infrared image data respectively, and preprocesses the acquired data with the visible light image preprocessing unit and the infrared image preprocessing unit. The FPGA also writes the preprocessed visible light image data into the visible light image SDRAM, or reads it back, through the first SDRAM controller, and writes the preprocessed infrared image data into the infrared image SDRAM, or reads it back, through the second SDRAM controller.
As an improvement of the scheme, the FPGA further comprises an image fusion system and a three-way signal gating switch, wherein the image fusion system comprises an image registration unit and an image fusion unit. The image registration unit registers the visible light image data read from the visible light image SDRAM with the infrared image data read from the infrared image SDRAM, and the image fusion unit then fuses them to obtain fused image data. The three-way signal gating switch selects the visible light image data, the infrared image data or the fused image data to send to the main controller module according to the gating control signal sent by the main controller module.
As an improvement of the above scheme, the main controller module adopts an ARM processor and comprises a signal processing unit and a display buffer unit. The signal processing unit processes the visible light image data, infrared image data or fused image data received and stored in the DDR2 data storage unit and sends the processed data to the display buffer unit; it also processes the gesture information and the current position information and sends the results to the display buffer unit. The data stored in the display buffer unit is sent to the WIFI communication module and the AR display module.
As an improvement of the scheme, the gesture information module is a nine-axis sensor, and the nine-axis sensor comprises a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer to acquire nine-axis information of the gesture of the user;
the main controller module is used for analyzing the nine-axis information sent by the gesture information module, acquiring gesture information of a user, and generating a corresponding gating control signal according to the gesture information of the user and a preset mapping table; wherein the mapping table records the correspondence between the user gesture and the image signal gating.
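The mapping table described above can be sketched as a simple lookup. This is an illustrative sketch only, not from the patent: the gesture names and channel codes below are assumptions, since the patent does not enumerate the entries of its mapping table.

```python
# Channel codes the gating switch might accept (illustrative assumption).
GATING_CHANNELS = {"VISIBLE": 0b001, "INFRARED": 0b010, "FUSED": 0b100}

# Hypothetical mapping table: recognized head gesture -> image channel.
GESTURE_TO_CHANNEL = {
    "nod": "VISIBLE",
    "tilt_left": "INFRARED",
    "tilt_right": "FUSED",
}

def gating_signal(gesture: str, default: str = "FUSED") -> int:
    """Return the gating control code for a recognized gesture,
    falling back to a default channel for unmapped gestures."""
    channel = GESTURE_TO_CHANNEL.get(gesture, default)
    return GATING_CHANNELS[channel]
```

A table-driven design like this keeps the gesture recognizer and the gating logic decoupled: new gestures only add entries to the table.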
As an improvement of the scheme, the positioning navigation module comprises a GPS module, and the main controller module is connected with the GPS module through a serial port so as to control the GPS module to work and acquire positioning information acquired by the GPS module.
As an improvement of the scheme, the positioning navigation module further comprises an RFID electronic tag, the main controller module is connected with the RFID electronic tag through an SPI interface so as to control the RFID electronic tag to work, and a plurality of RFID readers arranged on a fire scene are used for sending the read RFID electronic tag information to a background server, and the background server calculates the position of the RFID electronic tag based on the received RFID electronic tag information.
As an improvement of the scheme, the device further comprises a voice device. The main controller module is connected with the voice device through an IIS interface, so that real-time voice information from the voice device is sent to the background server through the WIFI communication module, and real-time voice information returned by the background server is received through the WIFI communication module and sent to the voice device.
As an improvement of the scheme, the AR display module comprises AR glasses, and the main controller module is connected with the AR glasses through an AR interface.
Compared with the prior art, in the intelligent fire-fighting helmet device based on WIFI transmission provided by the embodiment of the invention, the image acquisition fusion module acquires visible light image data and infrared image data in real time and fuses them to obtain fused image data; the gesture information module acquires the gesture information of the helmet-wearing user and sends it to the main controller module; the positioning navigation module acquires the current position information and sends it to the main controller module; and the main controller module generates a corresponding gating control signal according to the gesture information and sends it to the image acquisition fusion module, the gating control signal controlling which of the visible light image data, the infrared image data or the fused image data is gated through to the main controller module. The main controller module outputs the received image data, the gesture information and the current position information to a background server through the WIFI communication module and sends them to the AR display module for display. The device can therefore display fire-scene information to the helmet wearer in real time through AR and transmit it to the background server, improving the efficiency of fire-fighting response.
Drawings
Fig. 1 is a block diagram of a smart fire helmet device based on WIFI transmission in an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of an image acquisition fusion module of an intelligent fire-fighting helmet device based on WIFI transmission in an embodiment of the invention.
Fig. 3 is an image fusion schematic diagram of an intelligent fire-fighting helmet device based on WIFI transmission in an embodiment of the invention.
Fig. 4 is a positioning information processing block diagram of an intelligent fire fighting helmet device based on WIFI transmission in an embodiment of the invention.
Fig. 5 is a gesture information processing block diagram of an intelligent fire helmet apparatus based on WIFI transmission in an embodiment of the present invention.
Fig. 6 is a schematic block diagram of information display and transmission of an intelligent fire-fighting helmet device based on WIFI transmission in an embodiment of the invention.
Fig. 7 is an RFID positioning schematic diagram of an intelligent firefighting helmet apparatus based on WIFI transmission in an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a block diagram of an intelligent fire-fighting helmet device based on WIFI transmission in embodiment 1 of the present invention is shown. The intelligent fire-fighting helmet device based on WIFI transmission comprises an image acquisition fusion module 1, a main controller module 2, a WIFI communication module 3, a gesture information module 4, a positioning navigation module 5 and an AR display module 6.
The image acquisition and fusion module 1 is used for acquiring visible light image data and infrared image data in real time and fusing the acquired visible light image data and infrared image data to obtain fused image data. The gesture information module 4 is used for acquiring gesture information of a helmet wearing user and sending the gesture information to the main controller module 2. The positioning navigation module 5 is configured to obtain current position information, and send the obtained current position information to the main controller module 2. The main controller module 2 generates a corresponding gating control signal according to the gesture information and sends the gating control signal to the image acquisition fusion module 1, wherein the gating control signal is used for controlling the image acquisition fusion module 1 to gate the visible light image data, the infrared image data or the fusion image data and send the data to the main controller module 2. The main controller module 2 outputs the received visible light image data, infrared image data or fusion image data, the gesture information and the current position information to a background server through the WIFI communication module 3 and sends the received visible light image data, infrared image data or fusion image data to the AR display module 6 for display.
Referring to fig. 1 and 2, the image acquisition fusion module 1 includes a visible light camera 11 and an infrared camera 12 disposed at the front of the helmet; the visible light camera 11 is used for acquiring visible light image data, and the infrared camera 12 is used for acquiring infrared image data.
The image acquisition fusion module 1 further comprises an FPGA 13. The FPGA 13 comprises a visible light image preprocessing unit 131, a first SDRAM controller 132, an infrared image preprocessing unit 133 and a second SDRAM controller 134. The FPGA 13 controls the visible light camera 11 and the infrared camera 12 through an IIC controller to acquire the visible light image data and the infrared image data; the visible light image preprocessing unit 131 and the infrared image preprocessing unit 133 preprocess the acquired visible light image data and infrared image data respectively (including operations such as denoising and enhancement). The FPGA 13 also writes the preprocessed visible light image data into the visible light image SDRAM, or reads it back, through the first SDRAM controller 132, and writes the preprocessed infrared image data into the infrared image SDRAM, or reads it back, through the second SDRAM controller 134; a first write FIFO, a first read FIFO, a second write FIFO and a second read FIFO are used to match the read/write speeds.
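The rate-matching role of those FIFOs can be illustrated with a software sketch. A real design uses dual-clock hardware FIFOs on the FPGA between the pixel-clock domain and the SDRAM controller; the ring buffer below (an assumption, not the patent's implementation) only demonstrates the producer/consumer decoupling and what happens on overrun.

```python
from collections import deque

class RateFifo:
    """Software illustration of a rate-matching FIFO: a fast producer
    (camera pixel stream) feeds a slower consumer (SDRAM controller)."""

    def __init__(self, depth: int):
        self.buf = deque(maxlen=depth)
        self.overruns = 0  # samples lost because the consumer lagged

    def push(self, word):
        # A hardware FIFO would assert a "full" flag and apply backpressure;
        # here we count the overrun and let deque drop the oldest word.
        if len(self.buf) == self.buf.maxlen:
            self.overruns += 1
        self.buf.append(word)

    def pop(self):
        return self.buf.popleft() if self.buf else None
```

Sizing the FIFO depth to cover the worst-case burst mismatch between the two clock domains is what keeps the image stream loss-free.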
Referring to fig. 2 and 3, the FPGA 13 further includes an image fusion system 135 and a three-way signal gating switch 136, and the image fusion system 135 includes an image registration unit and an image fusion unit. The image registration unit registers the visible light image data read from the visible light image SDRAM with the infrared image data read from the infrared image SDRAM, and the image fusion unit then fuses them to obtain fused image data. The three-way signal gating switch 136 selects the visible light image data, the infrared image data or the fused image data to transmit to the main controller module 2 according to the gating control signal sent by the main controller module 2.
The three-way signal gating switch 136 selects which of the acquired infrared image, visible light image or fused image is forwarded. When the fused image is required, the visible light image stored in SDRAM1 and the infrared image in SDRAM2 are read out and sent to the image fusion system for fusion. Fig. 3 shows the image fusion process of this embodiment. Through preprocessing operations such as denoising and enhancement, the collected visible light and infrared images reflect the environmental information of the fire rescue scene more faithfully. Depending on the state of the rescue site and actual requirements, the preprocessed visible light and infrared images can also be sent, displayed and transmitted directly without fusion. When fusion is required, the two image streams are read into the image fusion system and first registered to align them; the images are then fused, features of the fused image are extracted, and the result is evaluated to judge whether the fusion requirements are met, after which a fused image meeting the requirements can be sent, displayed and transmitted. The control signal of the gating switch comes from the main controller module 2, and the selected image information is sent to the main controller module 2 for further processing.
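The fusion step itself can be sketched in miniature. The patent does not disclose its fusion algorithm, so the following is a minimal illustrative sketch under stated assumptions: the two frames are already registered, given as equal-sized grayscale pixel grids of 0-255 integers, and combined by a fixed pixel-wise weighted average (real systems typically use more sophisticated multi-scale fusion).

```python
def fuse(visible, infrared, w_vis=0.5):
    """Pixel-wise weighted average of two registered grayscale frames
    (lists of rows of 0-255 ints). w_vis weights the visible frame;
    the infrared frame gets the complementary weight."""
    w_ir = 1.0 - w_vis
    return [
        [int(round(w_vis * v + w_ir * r)) for v, r in zip(vrow, rrow)]
        for vrow, rrow in zip(visible, infrared)
    ]
```

With `w_vis=1.0` the fused output degenerates to the visible frame, which mirrors the patent's option of bypassing fusion entirely when the scene does not require it.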
Referring back to fig. 1, the main controller module 2 adopts an ARM processor and includes a signal processing unit 21 and a display buffer unit 22. The signal processing unit 21 processes the visible light image data, infrared image data or fused image data received and stored in the DDR2 data storage unit and sends the processed data to the display buffer unit 22; it also processes the gesture information and the current position information and sends the results to the display buffer unit 22. The data stored in the display buffer unit 22 is sent to the WIFI communication module 3 and the AR display module 6.
The gesture information module 4 comprises a nine-axis sensor, wherein the nine-axis sensor comprises a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer to acquire nine-axis information of a gesture of a user. As shown in fig. 4, the main controller module 2 receives nine-axis information sent by the nine-axis sensor through a serial port, analyzes the nine-axis information, and obtains gesture information of a user. The main controller module 2 stores the analyzed gesture information into the display buffer unit 22 for subsequent display and transmission, and generates a corresponding gating control signal according to the analyzed gesture information and a preset mapping table; wherein the mapping table records the correspondence between the user gesture and the image signal gating.
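The reduction of raw nine-axis samples to attitude angles on the ARM side might look like the following sketch. The patent only states that nine-axis information is parsed into gesture (attitude) information; the specific formulas here, standard pitch/roll from the accelerometer gravity vector and tilt-compensated yaw from the magnetometer, are an assumption about one common way to do it.

```python
import math

def attitude(ax, ay, az, mx, my, mz):
    """Estimate (pitch, roll, yaw) in degrees from one accelerometer
    sample (ax, ay, az) and one magnetometer sample (mx, my, mz).
    Assumes the device is quasi-static so the accelerometer reads gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Tilt-compensate the magnetometer before taking the heading.
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-myh, mxh)
    return tuple(math.degrees(a) for a in (pitch, roll, yaw))
```

In a real helmet the gyroscope would be blended in (e.g. with a complementary or Kalman filter) to track fast head motion; the static formula above drifts badly during movement.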
As shown in fig. 5, the positioning navigation module 5 includes a GPS module, and the main controller module 2 is connected with the GPS module through a serial port, so as to control the GPS module to work and obtain positioning information collected by the GPS module. The positioning navigation module 5 further comprises an RFID electronic tag, and the main controller module 2 is connected with the RFID electronic tag through an SPI interface, so as to control the RFID electronic tag to work and acquire RFID electronic tag information, and after information analysis is performed on positioning information and RFID electronic tag information acquired by the GPS module, the positioning information and RFID electronic tag information are sent to the display buffer unit 22 for subsequent sending, displaying and transmitting.
In addition, a plurality of RFID readers arranged at the fire scene send the RFID electronic tag information they read to the background server, and the background server calculates the position of each RFID electronic tag from the received information. The positioning navigation module 5 of this embodiment thus combines GPS and RFID positioning: GPS can quickly establish the location of the fire scene, but GPS signals are easily blocked by buildings in enclosed environments, which degrades positioning accuracy. An active RFID tag contains its own power supply, so it can actively transmit, has a long read distance and a large-capacity memory that can store more information, which is why active tags are used for positioning here. Because each RFID tag has a unique electronic code, the RFID positioning principle of fig. 7 can be applied to specific individuals: with suitable readers placed around the fire scene, each firefighter can be accurately located via the electronic tag in his or her helmet device. The background server (the monitoring system of the command center) can simultaneously receive multiple streams of field data, including the firefighters' position information, and can hold real-time voice communication with any team member to make correct judgments for the firefighters.
Specifically, the RFID electronic tag is an active tag that stores the firefighter's number and basic information and is arranged in the helmet. The RFID reader is a fixed multi-frequency reader installed at various places around the fire scene; it reads the RFID electronic tag information and exchanges data with the tag by contactless two-way radio-frequency communication to achieve identification and positioning. Each RFID reader can communicate with the background server through WIFI, sending the firefighter number, basic information and positioning information from the RFID electronic tag.
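The patent leaves the server-side position calculation unspecified. One common approach, sketched here as an assumption rather than the patent's method, is to convert each reader's received signal strength (RSSI) to a range with a log-distance path-loss model and then trilaterate from three readers at known positions.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, n=2.0):
    """Log-distance path-loss model: estimated range in metres for a
    given RSSI, where tx_power_dbm is the RSSI at 1 m and n the path
    loss exponent (both environment-dependent assumptions)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def trilaterate(p1, d1, p2, d2, p3, d3):
    """2-D tag position from three readers at p1, p2, p3 with measured
    ranges d1, d2, d3. Subtracting the circle equations pairwise gives
    a 2x2 linear system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero iff the readers are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With more than three readers the same linearization becomes an overdetermined system and a least-squares solve, which is more robust to the noisy RSSI readings typical of a fire scene.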
As shown in fig. 6, the information to be sent and transmitted mainly consists of three parts, namely, the image information of the fire rescue site and the positioning information and the gesture information of the firefighter, which are temporarily stored in the display buffer unit 22 after being processed by the main controller module 2, and then are transmitted to the background server in the form of network data packets through the WIFI module on one hand, and are transmitted to the AR display module 6 for display through the AR interface on the other hand. The AR display module 6 includes AR glasses, and the main controller module 2 is connected with the AR display module through an HDMI interface. Therefore, the intelligent fire-fighting helmet device of the embodiment can transmit the disaster relief site information, the positioning information and the gesture information of the firefighters to the command center of the background server in real time through the WIFI communication module 3, and can display the surrounding environment information of the fire disaster, the positioning information and the gesture information of the firefighters for the firefighters in real time through the AR glasses.
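The "network data packet" framing for the WIFI uplink is not specified in the patent; the sketch below assumes a simple illustrative layout (a magic marker, a message type for image/gesture/position data, and a payload length) to show how the three information streams could share one link.

```python
import struct

MAGIC = 0xF17E  # hypothetical frame marker, not from the patent
TYPE_IMAGE, TYPE_POSE, TYPE_POSITION = 1, 2, 3

def build_packet(msg_type: int, payload: bytes) -> bytes:
    """Big-endian header: magic (u16), type (u8), payload length (u32),
    followed by the payload bytes."""
    return struct.pack(">HBI", MAGIC, msg_type, len(payload)) + payload

def parse_packet(packet: bytes):
    """Validate the header and return (msg_type, payload)."""
    magic, msg_type, length = struct.unpack(">HBI", packet[:7])
    assert magic == MAGIC and length == len(packet) - 7
    return msg_type, packet[7:]
```

A length-prefixed frame like this lets the background server demultiplex interleaved image, gesture and position messages from one TCP stream without ambiguity.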
As a preferred embodiment, the device further includes a voice device; the main controller module 2 is connected to the voice device through an IIS interface, so that real-time voice information from the voice device is sent to the background server through the WIFI communication module 3, and real-time voice information returned by the background server and received through the WIFI communication module 3 is sent to the voice device. A commander at the background server can thus simultaneously receive multiple streams of field data, including the firefighters' position information, and hold real-time voice communication with any team member to make correct judgments for the firefighters.
The AR glasses of the embodiment of the present invention are embedded in the front of the helmet body, and the AR display module 6 further includes an AR glasses driving module for driving the AR glasses, where the AR glasses driving module is connected with the AR glasses through a corresponding interface.
In addition, the intelligent fire helmet apparatus of the present embodiment further includes a power management module 7 for supplying power to the respective modules.
In summary, in the intelligent fire-fighting helmet device based on WIFI transmission provided by the embodiment of the present invention, the image acquisition fusion module acquires visible light image data and infrared image data in real time and fuses them to obtain fused image data; the gesture information module acquires the gesture information of the user wearing the helmet and sends it to the main controller module; and the positioning navigation module acquires the current position information and sends it to the main controller module. The main controller module generates a corresponding gating control signal according to the gesture information and sends it to the image acquisition fusion module, where the gating control signal controls the image acquisition fusion module to gate the visible light image data, the infrared image data or the fused image data for transmission to the main controller module. The main controller module then outputs the received image data, the gesture information and the current position information to the background server through the WIFI communication module, and sends them to the AR display module for display. The intelligent fire-fighting helmet device can therefore display fire scene information to the helmet user in real time via AR while transmitting it to the background server, improving the efficiency of fire-fighting response.
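The gesture-driven gating summarized above (and recited in claim 1 via a preset mapping table) can be illustrated as follows. This is a minimal sketch under assumed encodings: the gesture names, the two-bit gating codes and the function names are illustrative, not taken from the patent.

```python
# Hedged sketch of gesture-to-gating control: the main controller parses
# nine-axis attitude data into a gesture, looks it up in a preset mapping
# table, and emits a gating control signal that selects the visible light,
# infrared, or fused video stream. Codes and gesture names are assumed.
VISIBLE, INFRARED, FUSED = 0b01, 0b10, 0b11  # assumed gating signal codes

GATING_TABLE = {        # preset mapping table: user gesture -> gated stream
    "nod": FUSED,
    "tilt_left": VISIBLE,
    "tilt_right": INFRARED,
}

def gating_signal(gesture, default=FUSED):
    """Return the gating control signal for a parsed head gesture."""
    return GATING_TABLE.get(gesture, default)

def gate(signal, visible_frame, infrared_frame, fused_frame):
    """Three-way signal gating switch: select one stream by the signal."""
    return {VISIBLE: visible_frame, INFRARED: infrared_frame,
            FUSED: fused_frame}[signal]

frame = gate(gating_signal("tilt_right"), "VIS", "IR", "FUSE")
print(frame)  # tilt_right gates the infrared stream
```

Defaulting to the fused stream for unrecognized gestures mirrors the idea that the fused image is the most informative view; the actual default behavior is not specified in the source.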
It should be noted that the above-described apparatus embodiments are merely illustrative; the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the device embodiments provided by the invention, a connection between modules indicates a communication connection between them, which may be implemented as one or more communication buses or signal lines. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present invention may be implemented by software plus the necessary general-purpose hardware, or by special-purpose hardware including application-specific integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components, and the like. In general, functions performed by computer programs can easily be implemented by corresponding hardware, and the specific hardware structure used to implement the same function can vary: analog circuits, digital circuits, or dedicated circuits. For the present invention, however, a software implementation is the preferred embodiment in most cases. Based on such understanding, the technical solution of the present invention, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a readable storage medium, such as a floppy disk, a USB disk, a removable hard disk, a read-only memory (ROM), a random-access memory (RAM), a magnetic disk or an optical disk, the software product including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods according to the embodiments of the present invention.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention; such changes and modifications are also intended to fall within the scope of the invention.
Claims (3)
1. The intelligent fire-fighting helmet device based on WIFI transmission is characterized by comprising an image acquisition fusion module, a main controller module, a WIFI communication module, a gesture information module, a positioning navigation module and an AR display module;
the image acquisition and fusion module is used for acquiring visible light image data and infrared image data in real time and fusing the acquired visible light image data and infrared image data to obtain fused image data; the gesture information module is used for acquiring gesture information of a user wearing the helmet and sending the gesture information to the main controller module; the positioning navigation module is used for acquiring current position information and sending the acquired current position information to the main controller module; the main controller module generates a corresponding gating control signal according to the gesture information and sends the gating control signal to the image acquisition fusion module, wherein the gating control signal is used for controlling the image acquisition fusion module to gate the visible light image data, the infrared image data or the fusion image data and send the data to the main controller module; the main controller module outputs the received visible light image data, infrared image data or fusion image data, the gesture information and the current position information to a background server through the WIFI communication module and sends the received visible light image data, infrared image data or fusion image data to an AR display module for display;
the AR display module comprises AR glasses, and the main controller module is connected with the AR glasses through an AR interface;
the image acquisition fusion module comprises a visible light camera and an infrared camera which are arranged at the front part of the helmet, wherein the visible light camera is used for acquiring visible light image data, and the infrared camera is used for acquiring infrared image data;
the image acquisition fusion module further comprises an FPGA, the FPGA comprises a visible light image preprocessing unit, a first SRAM controller, an infrared image preprocessing unit and a second SRAM controller, the FPGA respectively controls the visible light camera and the infrared camera to acquire visible light image data and infrared image data through an IIC controller, the visible light image preprocessing unit and the infrared image preprocessing unit respectively preprocess the acquired visible light image data and infrared image data, the FPGA also writes the preprocessed visible light image data into a visible light image SRAM or reads the preprocessed visible light image data from the visible light image SRAM through the first SRAM controller, and writes the preprocessed infrared image data into an infrared image SRAM or reads the preprocessed infrared image data from the infrared image SRAM through the second SRAM controller;
the FPGA further comprises an image fusion system and three signal gating switches, wherein the image fusion system comprises an image registration unit and an image fusion unit; the image registration unit is used for carrying out image registration on the visible light image data read out from the visible light image SRAM and the infrared image data read out from the infrared image SRAM, and then carrying out image fusion through the image fusion unit to obtain fusion image data; the three-way signal gating switch selects the visible light image data, the infrared image data or the fusion image data to send to the main controller module according to the gating control signal sent by the main controller module;
the main controller module adopts an ARM processor and comprises a signal processing unit and a display buffer unit, wherein the signal processing unit processes visible light image data, infrared image data or fusion image data received and stored in the DDR2 data storage unit and then sends the processed data to the display buffer unit, and the signal processing unit is also used for processing the information of the gesture information and the current position information and then sending the processed information to the display buffer unit; the data stored in the display buffer unit are sent to the WIFI module and the AR display module;
the gesture information module is a nine-axis sensor, and the nine-axis sensor comprises a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer to acquire nine-axis information of the gesture of the user;
the main controller module is used for analyzing the nine-axis information sent by the gesture information module to obtain gesture information of a user, and storing the analyzed gesture information into the display buffer unit for subsequent display and transmission on one hand, and generating a corresponding gating control signal according to the analyzed gesture information and a preset mapping table on the other hand; the mapping table records the corresponding relation between the user gesture and the image signal gating;
the positioning navigation module further comprises an RFID electronic tag, the main controller module is connected with the RFID electronic tag through an SPI interface so as to control the RFID electronic tag to work, a plurality of RFID readers arranged on a fire scene are used for sending the read RFID electronic tag information to a background server, and the background server calculates the position of the RFID electronic tag based on the received RFID electronic tag information.
2. The intelligent fire fighting helmet device based on WIFI transmission according to claim 1, wherein the positioning navigation module comprises a GPS module, and the main controller module is connected with the GPS module through a serial port to control the GPS module to work and acquire positioning information acquired by the GPS module.
3. The intelligent fire helmet apparatus based on WIFI transmission according to claim 1, further comprising a voice apparatus, wherein the main controller module is connected with the voice apparatus through an IIS interface, so as to transmit real-time voice information transmitted by the voice apparatus to a background server through a wireless communication module, and transmit real-time voice information returned by the background server received through the wireless communication module to the voice apparatus.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710282950.1A CN107095384B (en) | 2017-04-26 | 2017-04-26 | Intelligent fire control helmet device based on WIFI transmission |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107095384A CN107095384A (en) | 2017-08-29 |
CN107095384B true CN107095384B (en) | 2023-11-24 |
Family
ID=59657236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710282950.1A Active CN107095384B (en) | 2017-04-26 | 2017-04-26 | Intelligent fire control helmet device based on WIFI transmission |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107095384B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107411212A (en) * | 2017-08-31 | 2017-12-01 | 中山市程博工业产品设计有限公司 | It is a kind of can teletransmission data adjustable-angle helmet video camera |
CN109584197A (en) * | 2018-12-20 | 2019-04-05 | 广东浪潮大数据研究有限公司 | A kind of image interfusion method and relevant apparatus |
CN109921818B (en) * | 2019-03-08 | 2021-07-20 | 长春理工大学 | Intelligent multi-point real-time three-dimensional interactive helmet display system |
CN110432576A (en) * | 2019-07-30 | 2019-11-12 | 北京科技大学 | A kind of intelligence ski helmets and management system for monitoring |
CN110859352A (en) * | 2019-11-12 | 2020-03-06 | 陕西禾宁电子科技有限公司 | AR fire helmet based on distributed network |
CN110989168A (en) * | 2019-11-12 | 2020-04-10 | 陕西禾宁电子科技有限公司 | AR fire control goggles |
WO2021184388A1 (en) * | 2020-03-20 | 2021-09-23 | Oppo广东移动通信有限公司 | Image display method and apparatus, and portable electronic device |
CN112333370A (en) * | 2020-11-11 | 2021-02-05 | 冯辙恺 | Epidemic prevention/chemical prevention comprehensive information display helmet |
CN112933447A (en) * | 2021-03-12 | 2021-06-11 | 南京瀚海星河信息技术有限公司 | Infrared AR air respirator that possesses multiple scene of a fire monitoring auxiliary function |
CN113287822B (en) * | 2021-05-31 | 2023-07-18 | 深圳市卓炜视讯科技有限公司 | Split type AR communication intelligent helmet |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6411328B1 (en) * | 1995-12-01 | 2002-06-25 | Southwest Research Institute | Method and apparatus for traffic incident detection |
KR100829215B1 (en) * | 2008-03-13 | 2008-05-14 | 주식회사 엘트로닉스 | Composite image generating system for detecting hidden objects |
WO2008127360A2 (en) * | 2006-10-11 | 2008-10-23 | Thermal Matrix, Inc. | Real time threat detection system |
CN202956192U (en) * | 2012-03-16 | 2013-05-29 | 东莞华仪仪表科技有限公司 | Device and system of intelligent thermal infrared imager |
CN103402044A (en) * | 2013-08-07 | 2013-11-20 | 重庆大学 | Target recognition and tracking system based on multi-source video integration |
CN103417196A (en) * | 2013-08-23 | 2013-12-04 | 中山大学 | Venous visualizer and visualizing method |
CN103530853A (en) * | 2013-10-17 | 2014-01-22 | 中北大学 | Infrared intensity image and infrared polarization image enhancement and fusion method |
CN103593494A (en) * | 2013-07-19 | 2014-02-19 | 北京赛四达科技股份有限公司 | System and method for generating visible light and infrared images in real time |
CN104155006A (en) * | 2014-08-27 | 2014-11-19 | 湖北久之洋红外系统股份有限公司 | Handheld thermal infrared imager and method for same to carry out quick locking and ranging on small target |
CN104639912A (en) * | 2015-02-11 | 2015-05-20 | 尼森科技(湖北)有限公司 | Individual soldier fire protection and disaster rescue equipment and system based on infrared three-dimensional imaging |
CN105512667A (en) * | 2014-09-22 | 2016-04-20 | 中国石油化工股份有限公司 | Method for fire identification through infrared and visible-light video image fusion |
CN105809640A (en) * | 2016-03-09 | 2016-07-27 | 长春理工大学 | Multi-sensor fusion low-illumination video image enhancement method |
CN207306182U (en) * | 2017-04-26 | 2018-05-04 | 左志权 | A kind of Intelligent fire-fighting helmet device based on WIFI transmission |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE202014103729U1 (en) * | 2014-08-08 | 2014-09-09 | Leap Motion, Inc. | Augmented reality with motion detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20230801. Address after: Room 1111, Block B, Science and Technology Building, No. 7186 Satellite Road, Changchun City, Jilin Province, 130000. Applicant after: Zuo Zhiquan. Address before: 130022 No. 7089 Satellite Road, Changchun, Jilin, Chaoyang District. Applicant before: CHANGCHUN University OF SCIENCE AND TECHNOLOGY
GR01 | Patent grant | ||