Content of the utility model
The embodiments of the utility model provide an intelligent fire-fighting helmet device based on 4G communication, which can display fire-scene information to the helmet wearer in real time through AR and transmit it to a background server, thereby improving the efficiency of fire-fighting operations.
The embodiments of the utility model provide an intelligent fire-fighting helmet device based on 4G communication, comprising a helmet portion and a handheld portion connected to each other. The helmet portion includes an image acquisition and stitching module, and the handheld portion includes a main controller module, a 4G communication module, a positioning and navigation module and an AR display module.
The image acquisition and stitching module is used to acquire first-channel visible-light image data and second-channel visible-light image data in real time, and to send the panoramic image data obtained by stitching the two channels of acquired visible-light image data to the main controller module of the handheld portion. The main controller module is used to perform denoising and enhancement processing on the panoramic image data and then encode it to obtain image information suitable for transmission through the 4G communication module. The positioning and navigation module is used to obtain current location information and to send the acquired current location information to the main controller module. The main controller module is used to output the image information and the current location information to the background server through the 4G communication module and to send them to the AR display module for display. The main controller module is further used to send the data information returned by the background server to the AR display module for display.
As an improvement of the above scheme, the image acquisition and stitching module includes a first visible-light camera arranged at the front of the helmet and a second visible-light camera arranged at the rear of the helmet. The first visible-light camera is used to acquire the first-channel visible-light image data, and the second visible-light camera is used to acquire the second-channel visible-light image data.
As an improvement of the above scheme, the image acquisition and stitching module further includes an FPGA, and the FPGA includes a video image receiving unit, a video image stitching unit and a video image compression and encoding unit. The FPGA controls the first visible-light camera and the second visible-light camera respectively through an IIC controller to acquire the two channels of visible-light image data; the video image receiving unit sends the received two channels of visible-light image data to the video image stitching unit, which stitches them to obtain panoramic image data; and the video image compression and encoding unit compresses and encodes the panoramic image data based on an H.656 code library to obtain compressed panoramic image data.
As an improvement of the above scheme, the FPGA further writes the compressed panoramic image data to an SRAM memory, or reads it from the SRAM memory, through an SRAM controller, and sends the panoramic image data in the SRAM memory to the main controller module of the handheld portion as needed.
As an improvement of the above scheme, the main controller module uses an ARM processor that includes an image denoising and enhancement unit and an MFC. The ARM processor writes the received panoramic image data to an SRAM memory, or reads it from the SRAM memory, through an SRAM controller; the image denoising and enhancement unit performs denoising and enhancement processing on the panoramic image data based on the OpenCV vision library; and the MFC compresses and encodes the denoised and enhanced panoramic image data based on an H.264 code library, so as to obtain image data that is easy to transmit through the 4G communication module.
As an improvement of the above scheme, the handheld portion further includes an environmental monitoring module. The main controller module is connected to the environmental monitoring module through an IIC interface to control the environmental monitoring module to work and to obtain the environmental information collected by the environmental monitoring module. The main controller module is further used to output the received environmental information to the background server through the 4G communication module and to send it to the AR display module for display.
As an improvement of the above scheme, the positioning and navigation module includes a GPS module. The main controller module is connected to the GPS module through a UART interface to control the GPS module to work and to obtain the positioning information collected by the GPS module.
As an improvement of the above scheme, the positioning and navigation module further includes an RFID electronic tag. The main controller module is connected to the RFID electronic tag through an SPI interface to control the RFID electronic tag to work. A plurality of RFID readers arranged at the fire scene are used to send the RFID tag information they read to the background server, and the background server calculates the position of the RFID electronic tag based on the received RFID information.
As an improvement of the above scheme, the handheld portion further includes a voice device. The main controller module is connected to the voice device through an IIS interface, sends the instant voice information produced by the voice device to the background server through the 4G communication module, and sends the instant voice information returned by the background server and received through the 4G communication module to the voice device.
As an improvement of the above scheme, the AR display module includes AR glasses, and the main controller module is connected to the AR display module through an HDMI interface.
Compared with the prior art, in the intelligent fire-fighting helmet device based on 4G communication provided by the embodiments of the utility model, the image acquisition and stitching module of the helmet portion acquires the first-channel visible-light image data and the second-channel visible-light image data in real time, and sends the panoramic image data obtained by stitching the two channels of acquired visible-light image data to the main controller module of the handheld portion; the main controller module performs denoising and enhancement processing on the panoramic image data and then encodes it to obtain image information suitable for transmission through the 4G communication module; the positioning and navigation module is used to obtain current location information and to send the acquired current location information to the main controller module; the main controller module is used to output the image information and the current location information to the background server through the 4G communication module and to send them to the AR display module for display; and the main controller module is used to send the data information returned by the background server to the AR display module for display. Therefore, the intelligent fire-fighting helmet device based on 4G communication provided by the embodiments of the utility model can display fire-scene information to the helmet wearer in real time through AR and transmit it to the background server, thereby improving the efficiency of fire-fighting operations.
Embodiment
The technical solutions in the embodiments of the utility model will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the utility model. Obviously, the described embodiments are only some of the embodiments of the utility model, rather than all of them. Based on the embodiments of the utility model, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the scope of protection of the utility model.
Referring to Fig. 1, which is a schematic structural diagram of a fire-fighting AR helmet device in embodiment 1 of the utility model, the fire-fighting AR helmet device of this embodiment includes a helmet portion 1 and a handheld portion 2, and the helmet portion 1 and the handheld portion 2 are connected by a cable. Fig. 1 shows that a first visible-light camera 111 and a second visible-light camera 112 are arranged at the front and the rear of the helmet body respectively, a cable interface is reserved on the side of the helmet body, and the handheld device is connected to the helmet through the cable interface.
As shown in Fig. 2, the helmet portion 1 includes an image acquisition and stitching module 11, and the handheld portion 2 includes a main controller module 21, a 4G communication module 22, a positioning and navigation module 23 and an AR display module 24.
The image acquisition and stitching module 11 acquires the first-channel visible-light image data and the second-channel visible-light image data in real time, and sends the panoramic image data obtained by stitching the two channels of acquired visible-light image data to the main controller module 21 of the handheld portion. The main controller module 21 is used to perform denoising and enhancement processing on the panoramic image data and then encode it to obtain image information suitable for transmission through the 4G communication module 22. The positioning and navigation module 23 is used to obtain current location information and to send the acquired current location information to the main controller module 21. The main controller module 21 is used to output the image information and the current location information to the background server through the 4G communication module 22 and to send them to the AR display module 24 for display. The main controller module 21 is further used to send the data information returned by the background server to the AR display module 24 for display.
As shown in Fig. 3, the image acquisition and stitching module 11 of the helmet portion 1 includes a first visible-light camera 111 arranged at the front of the helmet and a second visible-light camera 112 arranged at the rear of the helmet. The first visible-light camera 111 is used to acquire the first-channel visible-light image data, and the second visible-light camera 112 is used to acquire the second-channel visible-light image data. With visible-light cameras acquiring images at both the front and the rear of the helmet, panoramic shooting of the fire scene can be achieved, which provides a strong guarantee for fire-fighting personnel to extinguish the fire thoroughly at the scene.
The image acquisition and stitching module 11 further includes an FPGA 123, and the FPGA 123 includes a video image receiving unit 1231, a video image stitching unit 1232 and a video image compression and encoding unit 1233. The FPGA 123 controls the first visible-light camera 111 and the second visible-light camera 112 respectively through an IIC controller to acquire the two channels of visible-light image data; the video image receiving unit 1231 sends the received two channels of visible-light image data to the video image stitching unit 1232, which stitches them to obtain panoramic image data; and the video image compression and encoding unit 1233 compresses and encodes the panoramic image data based on an H.656 code library to obtain compressed panoramic image data. The FPGA 123 also writes the compressed panoramic image data to an SRAM memory, or reads it from the SRAM memory, through an SRAM controller 1234, and sends the panoramic image data in the SRAM memory to the main controller module 21 of the handheld portion 2 as needed.
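For illustration only, the following is a minimal software analogue of the splicing performed by the video image stitching unit 1232. The embodiment does not specify the stitching algorithm, so a fixed horizontal overlap with linear blending is assumed; the function name and the overlap width are hypothetical, and the real unit is implemented in FPGA logic rather than software.

    // Conceptual sketch of splicing the two camera channels into a panorama.
    // Assumption: the two frames share a fixed horizontal overlap that is
    // linearly blended; the actual unit runs in FPGA hardware.
    #include <opencv2/opencv.hpp>

    cv::Mat stitchTwoChannels(const cv::Mat& front, const cv::Mat& rear,
                              int overlapPx = 64)   // hypothetical overlap width
    {
        CV_Assert(front.rows == rear.rows && front.type() == rear.type());
        int outCols = front.cols + rear.cols - overlapPx;
        cv::Mat pano(front.rows, outCols, front.type(), cv::Scalar::all(0));

        // Copy the non-overlapping regions directly.
        front.colRange(0, front.cols - overlapPx)
             .copyTo(pano.colRange(0, front.cols - overlapPx));
        rear.colRange(overlapPx, rear.cols)
            .copyTo(pano.colRange(front.cols, outCols));

        // Linearly blend the overlapping columns.
        for (int i = 0; i < overlapPx; ++i) {
            double a = static_cast<double>(i) / overlapPx;
            cv::Mat dstCol = pano.col(front.cols - overlapPx + i);  // view into pano
            cv::addWeighted(front.col(front.cols - overlapPx + i), 1.0 - a,
                            rear.col(i), a, 0.0, dstCol);
        }
        return pano;
    }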
Referring to Fig. 3, the main controller module 21 uses an ARM processor and includes an image denoising and enhancement unit 211 and an MFC 212. The ARM processor writes the received panoramic image data to an SRAM memory, or reads it from the SRAM memory, through an SRAM controller 213; the image denoising and enhancement unit 211 performs denoising and enhancement processing on the panoramic image data based on the OpenCV vision library; and the MFC 212 compresses and encodes the denoised and enhanced panoramic image data based on an H.264 code library, so as to obtain image data that is easy to transmit through the 4G communication module 22.
In this embodiment, the ARM processor of the handheld portion 2 uses the Cortex-A9 architecture, which is small in size, low in power consumption, low in cost and high in performance, and can meet the requirements of image processing and transmission. The ARM processor is connected to the FPGA through a system bus, and the FPGA transfers the acquired video images to the ARM processor through the system bus. The ARM processor runs a Linux operating system, calls the image processing library to denoise and enhance the acquired images, and then uses its MFC module to encode the video images into the H.264 format for easy wireless transmission.
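As an illustration of this processing chain, the following is a minimal sketch of how the denoising and enhancement step could be carried out with the OpenCV library on the ARM Linux system. The choice of filters (non-local-means denoising followed by CLAHE contrast enhancement) is an assumption, since the embodiment only states that the image is denoised and enhanced; the subsequent H.264 encoding by the MFC hardware unit is not shown.

    // Sketch of the ARM-side denoising/enhancement step using OpenCV.
    // The filter choice is an assumption; the result would then be handed
    // to the MFC hardware encoder for H.264 compression (not shown).
    #include <opencv2/opencv.hpp>
    #include <vector>

    cv::Mat denoiseAndEnhance(const cv::Mat& panoBgr)
    {
        cv::Mat denoised;
        // Edge-preserving denoising of the colour panorama.
        cv::fastNlMeansDenoisingColored(panoBgr, denoised, 5.0f, 5.0f, 7, 21);

        // Contrast enhancement on the luminance channel (CLAHE),
        // which helps in smoke-dimmed scenes.
        cv::Mat ycrcb;
        cv::cvtColor(denoised, ycrcb, cv::COLOR_BGR2YCrCb);
        std::vector<cv::Mat> planes;
        cv::split(ycrcb, planes);
        cv::Mat luma;
        cv::createCLAHE(2.0, cv::Size(8, 8))->apply(planes[0], luma);
        luma.copyTo(planes[0]);
        cv::merge(planes, ycrcb);

        cv::Mat enhanced;
        cv::cvtColor(ycrcb, enhanced, cv::COLOR_YCrCb2BGR);
        return enhanced;
    }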
The positioning and navigation module 23 includes a GPS module. The ARM processor of the handheld portion 2 receives the positioning information sent by the GPS module through a UART interface 214, and sends the positioning information to the background server through the 4G communication module 22. The positioning and navigation module 23 also includes an RFID electronic tag, and the main controller module 21 is connected to the RFID electronic tag through an SPI interface 215 to control the RFID electronic tag to work. A plurality of RFID readers arranged at the fire scene are used to send the RFID tag information they read to the background server, and the background server calculates the position of the RFID electronic tag based on the received RFID information. The positioning and navigation module 23 of this embodiment thus combines GPS positioning and RFID positioning: the fire scene can be located quickly from the GPS position, but GPS is easily blocked by buildings in enclosed environments, which degrades the positioning accuracy, so RFID positioning is also used. Because an active RFID tag contains a power supply that powers the tag chip, the tag can actively transmit an outgoing signal, has a long reading distance and a large-capacity memory on the tag, and can store more information; active RFID is therefore used for positioning. Since each RFID tag has a unique electronic code, each person can be identified from the electronic tag of his or her helmet device. As can be seen from the RFID positioning principle in Fig. 4, by placing an appropriate number of readers at the fire scene, each firefighter can be accurately located according to the electronic tag of his or her helmet device. The background server (the monitoring system of the command centre) can receive multiple channels of field data, including the firefighters' position information, at the same time, and can carry out real-time voice communication with any team member, so as to make correct judgements for the firefighters.
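As an illustration of the GPS path, the following is a minimal sketch of reading the GPS module over the UART interface 214 and extracting a latitude/longitude fix from an NMEA $GPGGA sentence. The serial device node, the baud rate and the simplified parsing are assumptions for illustration only.

    // Sketch of reading the GPS module over the UART and extracting a
    // position from a $GPGGA sentence. Device path and parsing are
    // illustrative assumptions; a real reader would buffer full lines.
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>
    #include <cstdio>
    #include <cstring>
    #include <cmath>

    bool readGpggaFix(double* lat, double* lon)
    {
        int fd = open("/dev/ttySAC1", O_RDONLY | O_NOCTTY);  // assumed UART node
        if (fd < 0) return false;

        termios tio{};
        tcgetattr(fd, &tio);
        cfsetispeed(&tio, B9600);                 // typical GPS baud rate
        tio.c_cflag |= CS8 | CLOCAL | CREAD;
        tcsetattr(fd, TCSANOW, &tio);

        char line[256] = {0};
        ssize_t n = read(fd, line, sizeof(line) - 1);
        close(fd);
        if (n <= 0) return false;

        // Minimal parser: find a $GPGGA sentence and pull lat/lon fields.
        const char* p = std::strstr(line, "$GPGGA");
        if (!p) return false;
        char ns = 0, ew = 0;
        double rawLat = 0.0, rawLon = 0.0;
        if (std::sscanf(p, "$GPGGA,%*[^,],%lf,%c,%lf,%c",
                        &rawLat, &ns, &rawLon, &ew) != 4)
            return false;

        // NMEA encodes ddmm.mmmm; convert to decimal degrees.
        *lat = static_cast<int>(rawLat / 100) + std::fmod(rawLat, 100.0) / 60.0;
        *lon = static_cast<int>(rawLon / 100) + std::fmod(rawLon, 100.0) / 60.0;
        if (ns == 'S') *lat = -*lat;
        if (ew == 'W') *lon = -*lon;
        return true;
    }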
Specifically, the RFID electronic tag is an active tag that stores the firefighter's number and basic information, and it is arranged in the handheld portion (handheld device) 2. The RFID readers are fixed multi-frequency card readers that are installed at various locations of the fire scene to read the RFID tag information; each reader carries out contactless two-way communication and data exchange with the RFID electronic tags by radio frequency, so as to achieve identification and positioning. Each RFID reader can communicate with the background server through WIFI and send the firefighter's number, basic information and positioning information carried in the RFID electronic tag.
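The embodiment does not state how the background server derives a tag position from the reader reports. Purely as an illustrative possibility, the server could take the signal-strength-weighted centroid of the readers that heard a given tag, as sketched below; the data structures and the RSSI weighting are assumptions, not part of the described scheme.

    // Illustrative server-side position estimate from RFID reader reports.
    // Assumption: each reader that hears a tag reports its own surveyed
    // coordinates plus a received signal strength; the tag position is
    // approximated by the RSSI-weighted centroid of those readers.
    #include <vector>
    #include <cmath>

    struct ReaderReport {
        double x, y;      // fixed, surveyed position of the reader on site
        double rssiDbm;   // received signal strength from the tag
    };

    struct Position { double x, y; };

    Position estimateTagPosition(const std::vector<ReaderReport>& reports)
    {
        double wx = 0.0, wy = 0.0, wsum = 0.0;
        for (const auto& r : reports) {
            // Convert dBm to a linear weight so nearer readers dominate.
            double w = std::pow(10.0, r.rssiDbm / 10.0);
            wx += w * r.x;
            wy += w * r.y;
            wsum += w;
        }
        if (reports.empty() || wsum <= 0.0)
            return {0.0, 0.0};          // no usable reports
        return {wx / wsum, wy / wsum};
    }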
Referring to Fig. 3, the handheld portion 2 also includes a voice device 13. The main controller module 21 is connected to the voice device 13 through an IIS interface 220, sends the instant voice information produced by the voice device 13 to the background server through the 4G communication module 22, and sends the instant voice information returned by the background server and received through the 4G communication module 22 to the voice device. In this way, the commander at the background server can receive multiple channels of field data, including the firefighters' position information, at the same time, and can carry out real-time voice communication with any team member, so as to make correct judgements for the firefighters.
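As an illustration of the voice path, the following is a minimal sketch of capturing instant voice from the IIS-connected voice device under ALSA on the Linux system and handing the PCM frames to the 4G uplink. The device name, the sample rate and the sendToServer() hook are assumptions.

    // Sketch of capturing voice under ALSA and handing PCM frames to the
    // 4G uplink. Device name, format and the uplink hook are assumed.
    #include <alsa/asoundlib.h>
    #include <cstdint>
    #include <cstddef>

    static void sendToServer(const int16_t* pcm, size_t samples)
    {
        (void)pcm; (void)samples;  // placeholder: hand off to the 4G link (not shown)
    }

    void captureVoiceLoop()
    {
        snd_pcm_t* pcm = nullptr;
        if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_CAPTURE, 0) < 0)
            return;
        // Mono 16-bit PCM at 16 kHz, 0.5 s of ALSA-side buffering.
        snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED, 1, 16000, 1, 500000);

        int16_t buf[1024];
        for (;;) {
            snd_pcm_sframes_t n = snd_pcm_readi(pcm, buf, 1024);
            if (n < 0)
                n = snd_pcm_recover(pcm, static_cast<int>(n), 0);  // handle xruns
            if (n > 0)
                sendToServer(buf, static_cast<size_t>(n));
        }
    }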
Referring to Fig. 3, the handheld portion 2 also includes an environmental monitoring module 14. The main controller module 21 is connected to the environmental monitoring module 14 through an IIC interface 216 to control the environmental monitoring module 14 to work and to obtain the environmental information collected by the environmental monitoring module 14. The main controller module 21 is further used to output the received environmental information to the background server through the 4G communication module 22 and to send it to the AR display module 24 for display.
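As an illustration of the environmental monitoring path, the following is a minimal sketch of polling a sensor over the IIC interface 216 from Linux user space. The bus number, the slave address and the register layout are hypothetical, since the embodiment does not specify the sensor.

    // Sketch of polling the environmental monitoring module over I2C (IIC)
    // from Linux. Bus, slave address and register are assumed values.
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>
    #include <cstdint>
    #include <cstddef>

    bool readEnvironmentRaw(uint8_t* data, size_t len)
    {
        int fd = open("/dev/i2c-1", O_RDWR);           // assumed I2C bus
        if (fd < 0) return false;
        if (ioctl(fd, I2C_SLAVE, 0x48) < 0) {          // assumed slave address
            close(fd);
            return false;
        }
        uint8_t reg = 0x00;                            // assumed data register
        bool ok = write(fd, &reg, 1) == 1 &&
                  read(fd, data, len) == static_cast<ssize_t>(len);
        close(fd);
        return ok;
    }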
Referring to Fig. 3, the main controller module 21 of the handheld portion 2 of this embodiment can not only send the image information, the current positioning information and the environmental information to the background server through the 4G communication module 22 (the main controller module 21 is provided with a PCIE interface 217 connected to the 4G communication module 22), but also send the image information, the current location information and the environmental information to the AR display module 24 for real-time display. The AR display module 24 includes AR glasses 241, and the main controller module 21 is connected to the AR display module 24 through an HDMI interface 218. Therefore, the intelligent fire-fighting helmet device of this embodiment can both transmit the disaster-relief field data in real time to the command centre of the background server over 4G/WIFI through the 4G communication module, and display the fire-scene environmental information to the firefighter in real time through the AR glasses.
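As an illustration of the AR display path, the following is a minimal sketch of drawing the positioning and environmental information onto the current panorama frame before it is output to the AR glasses 241 over the HDMI interface 218. The text fields and layout are assumptions.

    // Sketch of composing the AR overlay: location and environment readings
    // are drawn onto the current frame before HDMI output to the glasses.
    #include <opencv2/opencv.hpp>
    #include <string>

    void drawArOverlay(cv::Mat& frame,
                       const std::string& positionText,
                       const std::string& environmentText)
    {
        const cv::Scalar green(0, 255, 0);
        cv::putText(frame, positionText,    cv::Point(20, 40),
                    cv::FONT_HERSHEY_SIMPLEX, 0.8, green, 2);
        cv::putText(frame, environmentText, cv::Point(20, 80),
                    cv::FONT_HERSHEY_SIMPLEX, 0.8, green, 2);
    }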
It should be understood that, as shown in Fig. 1, the AR glasses 241 of the embodiment of the utility model are embedded in the front of the helmet body, and the AR display module 24 further includes an AR glasses drive module for driving the AR glasses 241. The AR glasses drive module is arranged in the handheld portion (device) 2 and is connected to the AR glasses 241 through a corresponding interface.
In addition, the handheld portion 2 of this embodiment is also provided with a power module and a reset key. The power module can charge the device, supply power and cut off power, and the reset key can reset the system to its initial state.
To sum up, in the intelligent fire-fighting helmet device based on 4G communication provided by the embodiments of the utility model, the image acquisition and stitching module of the helmet portion acquires the first-channel visible-light image data and the second-channel visible-light image data in real time, and sends the panoramic image data obtained by stitching the two channels of acquired visible-light image data to the main controller module of the handheld portion; the main controller module performs denoising and enhancement processing on the panoramic image data and then encodes it to obtain image information suitable for transmission through the 4G communication module; the positioning and navigation module is used to obtain current location information and to send the acquired current location information to the main controller module; the main controller module is used to output the image information and the current location information to the background server through the 4G communication module and to send them to the AR display module for display; and the main controller module is used to send the data information returned by the background server to the AR display module for display. Therefore, the intelligent fire-fighting helmet device based on 4G communication provided by the embodiments of the utility model can display fire-scene information to the helmet wearer in real time through AR and transmit it to the background server, thereby improving the efficiency of fire-fighting operations. Without adding to the firefighter's burden, it uses augmented reality and wireless communication technology to provide mobile data in real time to the firefighters carrying out fire rescue, guiding them to act and judge correctly, which is of important research significance.
It should be noted that the device embodiment described above is merely schematic. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the accompanying drawings of the device embodiment provided by the utility model, the connection relationship between modules indicates that there is a communication connection between them, which may be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art can understand and implement this without creative work.
Through the description of the above embodiments, those skilled in the art can clearly understand that the utility model can be implemented by software plus the necessary common hardware, and of course can also be implemented by dedicated hardware including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components and the like. In general, all functions completed by a computer program can easily be realized by corresponding hardware, and the specific hardware structures used to realize the same function can also be diverse, such as analogue circuits, digital circuits or special-purpose circuits. However, for the utility model, a software program implementation is the better embodiment in most cases. Based on such an understanding, the technical solution of the utility model, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disc of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device or the like) to execute the methods described in the embodiments of the utility model.
The above are preferred embodiments of the utility model. It should be noted that those of ordinary skill in the art can also make several improvements and modifications without departing from the principle of the utility model, and these improvements and modifications are also regarded as falling within the scope of protection of the utility model.